Martin Cooper

The Presidential Spectrum Prize

A proposal for delivering affordable, broadband mobile-Internet access to all Americans

America is in imminent danger of losing its leadership position in telecommunications. At the same time, businesses in industries and countries around the globe face a new productivity constraint: the rapidly growing demand for low-cost mobile bandwidth. There is no viable long-range plan to support this demand. The Presidential Spectrum Prize will address both challenges at minimal public cost and risk. The prize will be a segment of radio-frequency spectrum awarded to a U.S. entity that creates and implements a technology capable of achieving 100 times the spectral efficiency that exists today, at half today's cost to consumers.

Congress and the Executive Branch control and manage the radio-frequency spectrum in the United States. But that spectrum belongs to the country's citizens. Only visionary leadership, of the kind that put a man on the moon, can bring the United States, and the world, into the new era of productivity that low-cost and widely available mobile access promises. Consider just a few facts:

Mobile traffic is growing rapidly: As Cisco recently reported, global mobile-data traffic grew 70% in 2012 to 885 monthly petabytes[1]. This year, meanwhile, the world will grow to have more mobile, Internet-connected devices than people. Further, Cisco predicts, global mobile data traffic will increase 13-fold between 2012 and 2017, growing at a compound annual growth rate of 66% and reaching 11.2 monthly exabytes of traffic by 2017.
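These projections are internally consistent, as a quick back-of-the-envelope check shows. Here is a minimal sketch in Python; every figure in it comes from the Cisco report cited above:

```python
# Sanity-check the Cisco projections cited above.
base_2012_pb = 885              # monthly petabytes of traffic in 2012
cagr = 0.66                     # 66% compound annual growth rate
years = 5                       # 2012 -> 2017

growth_factor = (1 + cagr) ** years
projected_eb = base_2012_pb * growth_factor / 1000  # petabytes -> exabytes

print(f"Growth over {years} years: {growth_factor:.1f}x")       # ~12.6x, i.e. ~13-fold
print(f"Projected 2017 traffic: {projected_eb:.1f} EB/month")   # ~11.2 EB/month
```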

Mobile access is too expensive: Even as demand necessitates more efficient use of the existing radio spectrum, mobile Internet access remains far too expensive. A 2012 survey indicates that nearly half of U.S. wireless consumers pay $100 or more monthly for mobile service, while 21% pay more for their monthly mobile service than for their groceries[2]. Is it any wonder that only about a third of the world's population has access to the Internet[3], mobile or otherwise?

The potential benefits of mobile access are enormous: The potential impact of sufficient and affordable mobile access to the Internet on healthcare, education, employment rates and national productivity cannot be overestimated. But congressional and FCC policies and regulations that have served the nation so well in other matters are not likely to crack the code on making the vital national resource of radio-frequency spectrum widely available. The well-meaning administration initiative to redistribute spectrum currently allocated to government use may, over an extended period, result in a roughly 20% increase in bandwidth available for public data communications. This will hardly suffice for the 13-fold increase in demand expected in fewer than five years.

U.S. influence is diminishing: At the same time, nearly all manufacturing of mobile-telecommunications equipment has moved offshore, and no major infrastructure companies remain in the U.S. Most mobile phones are built in China and Taiwan, while China's Huawei is the world's fastest-growing telecommunications-infrastructure company. Sweden's Ericsson, France's Alcatel-Lucent, Finland's Nokia and Huawei build almost all mobile infrastructure equipment. Although Google, Apple, U.S. carriers and others are important members of standards bodies, their influence is far below historical levels.

The Presidential Prize

At no cost to U.S. taxpayers, the Presidential Spectrum Prize is meant to boost competition, alleviate unemployment, reinvigorate U.S. telecommunications and drive the creation and adoption of inexpensive, spectrally efficient technology for the vast majority of the American public.

To win the competition, an entrant must demonstrate a hundred-times improvement in spectral efficiency and a per-bit price no more than half that of existing offerings. A maximum of two such awards will be made. The process of selecting the winner(s) will involve four phases, as follows:

Phase one – The publication of technical papers revealing the theoretical proof of the proposed technology, showing how both the technical and economic objectives will be met. Estimated cost per entrant: $200,000 to $1M.

Phase two – Laboratory demonstrations and measurements showing that the proposed solution can meet the Prize’s objectives. Estimated cost per entrant: $2M to $10M.

Phase three – A small-scale test bed comprising at least 30 sectors or cells in an urban or urban-like area with sufficient user devices to demonstrate feasibility.  Estimated cost per system: $10M to $30M.                                                                                                  

Phase four – Full city deployment in a medium-sized urban area, probably by one or two remaining participants.   Estimated cost for initial deployment: $50M to $250M.

Total estimated cost for nationwide deployment: $20B+.

With Congressional approval, the FCC will award the competition winner(s) one or two nationwide licenses for approximately 20 MHz of radio-frequency spectrum. While 20 MHz appears too small to be commercially viable, the awardee's technology will expand its effective capacity to many times that of the entire existing cellular spectrum.
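The arithmetic behind that claim is straightforward. A minimal sketch follows; the figure of roughly 500 MHz for total current U.S. cellular allocations is my illustrative assumption, not part of the proposal:

```python
# Effective capacity of the prize spectrum under the 100x efficiency target.
prize_bandwidth_mhz = 20      # nationwide license proposed above
efficiency_gain = 100         # the Prize's non-negotiable objective

# Assumption for illustration only: rough total spectrum currently
# allocated to U.S. cellular service, in MHz.
existing_cellular_mhz = 500

effective_mhz = prize_bandwidth_mhz * efficiency_gain
multiple = effective_mhz / existing_cellular_mhz

print(f"Effective capacity: {effective_mhz} MHz-equivalent")        # 2000 MHz
print(f"= {multiple:.0f}x the entire existing cellular spectrum")   # ~4x
```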

Contestants will bear all research-and-development costs. Volunteer judges will be objective and skilled engineers and economists drawn from the National Academies and academia. Foundations and other non-associated sources will supply funds necessary to cover administrative and technical costs.

This proposal is presented in outline form. While the dollar amounts are estimates, the objective of 100-times efficiency is absolute and must not be compromised. Unlike the unsuccessful, so-called pioneer's preference awards of the past, the actual award of spectrum will occur only after proof of concept, and substantial investment, by the winner(s).


[1] http://www.cisco.com/en/US/solutions/collateral/ns341/ns525/ns537/ns705/ns827/white_paper_c11-520862.html

[2] http://www.prnewswire.com/news-releases/one-in-five-mobile-phone-owners-pay-more-per-month-for-their-phone-service-than-groceries-reveals-new-couponcabin-survey-169435766.html

[3] http://newsroom.fb.com/News/690/Technology-Leaders-Launch-Partnership-to-Make-Internet-Access-Available-to-All

Wireless Internet Only for the Elite?

Wireless Internet is a necessity for those who can afford it, yet the high cost of access is an insurmountable barrier for most people. More efficient use of the radio spectrum using existing technology can solve this problem, yet our regulatory policies do little to encourage efficiency. The Office of the President and the FCC should do more to motivate carriers and manufacturers to develop and deploy spectrally efficient technology such as multi-antenna systems (smart antennas) and microcells. Redistribution of existing radio spectrum does little to solve the real problem and will only delay its ultimate solution.

The most important benefit of wireless communication is improved productivity: the increase in wealth that lets people live better and function better. This has been proven repeatedly since the Handie-Talkie in World War Two enabled soldiers on the battlefield to work as teams. We have reached the point where many businesses cannot function profitably without some form of wireless communications. But the most vivid examples of wireless productivity improvement are in the poorest places in the world, where farmers use the village cell phone to optimize the price they can get for their crops, where fishermen let each other know where the fish are, and where job openings are broadcast and filled in real time.

Voice communication is available at reasonable prices in most places. This is not true of wireless data. Despite the ability of wireless access to improve their lives, such access is beyond the reach of most people in the world because of its high price.

This is simply wrong! For the past 110 years, technology has allowed us to reduce the price of wireless by a factor of two every 3 1/2 years. And yet, for the last several years, the price of wireless data has actually been increasing.
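Compounded over a century, that halving rate is staggering. A minimal sketch of the arithmetic, using only the figures above:

```python
# Cumulative price reduction implied by halving every 3.5 years for 110 years.
years = 110
halving_period = 3.5

halvings = years / halving_period    # ~31.4 halvings
reduction_factor = 2 ** halvings     # roughly 3 billion

print(f"Number of halvings: {halvings:.1f}")
print(f"Cumulative price reduction: ~{reduction_factor:.2e}x")
```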

The overt reason given for high data prices is a shortage of spectrum. We've never had a scarcity of spectrum in the past, because technology has always kept up with increasing needs and most often stayed ahead of them. Have we exhausted the ability of technology to keep multiplying the available throughput of spectrum and thereby reducing prices? Not by a long shot. So what's the problem?

The easiest way for wireless carriers to increase their capacity is to acquire more spectrum. Of course, we know that there isn't any more spectrum, so the only way to acquire more is to take it away from someone else. The Office of the President has directed the Department of Commerce to redistribute 500 MHz of radio-frequency spectrum from existing services, ostensibly to resolve "shortages" in other services. This is a well-intentioned effort but, in the long run, almost irrelevant. There have been numerous predictions of future needs for data throughput. Cisco has predicted that, within the next five years, we will need 20 to 40 times more throughput than exists today. It's difficult to understand how an incremental increase of 10 or 20 or even 30% will do anything to fulfill a need of 20 to 40 times.

The only solution to the challenge of making wireless data available to the people who need it most is the adoption of spectrally efficient technology. That technology exists, has been proven, and actually makes our wireless systems more cost-effective. Wireless carriers do not own the radio spectrum; we, the public, do. The government must adopt policies that motivate carriers to accelerate the adoption of technologies that can make wireless access to the Internet available to everyone, not just to those who can afford today's high prices.

Revolution in Education

I first met Richard Miller, David Kerns, and Sherra Kerns when they received the Gordon Prize of the National Academy of Engineering in 2013. They are the founding team of the Franklin W. Olin College of Engineering, a private undergraduate engineering college in Needham, Massachusetts. It is one of thousands of institutions on the threshold of revolutionizing our educational system.

Sherra Kerns, Richard Miller, David Kerns

David Kerns quickly got to the heart of what is happening when he told me that the very concept of the academic lecture is disappearing. He pointed out that students attending a lecture can now search the available knowledge on the lecturer's subject faster than he or she can deliver it. There continue to be important roles for classrooms in schools, but delivering information is not one of them. Olin College and others are adopting what I call the inverted classroom. Here is my imperfect view of what education will look like a generation or two from now:

Hundreds of millions of young people (defined as everyone who is still educable) are spending many billions of dollars buying and playing extraordinarily complex games. At some point, somebody is going to figure out that we can devise games that teach people, challenge them, measure their progress, and at the same time, entertain them. The curriculum then becomes the process of selecting from a diversity of appropriate games. Tests that measure progress are actually embedded within the games. It would be difficult to cheat since the “student” is involved in a virtual reality and is actually living the game. In this model, education can happen anywhere and all the time (assuming that inexpensive wireless interconnection is available everywhere, all the time).

What then is the role of the schools? Olin College offers an excellent example. The school creates the curriculum. Attendance at school offers an opportunity to exercise the knowledge that is received by the students’ interactive learning. The role of the teacher is to participate with the students, to pass on wisdom and experience, to stimulate collaboration, and to guide and channel the learning process. This is not just a theoretical new teaching system; we’ve had lots of those and most have failed. This is acceptance of a truly more efficient way to learn by students who have already adopted the new technology that is the basis of the new learning process.

Progressive schools throughout the country are already experimenting with these new ideas. They call it the "flipped classroom." My granddaughter, Tracy, is a teacher in such a school. Tracy and her colleagues are trying to devise methods by which their students can do much of their learning at home and much of their practice at school. Computers are central to the concept. The focus is not yet on wirelessly connected devices, but they have made an excellent start. The next step, however, will require some revolutionary thinking. Teachers have to be skilled at teaching, but they are not necessarily skilled at creating the teaching tools. To make the revolution work, we are going to have to divert a significant part of the billions of dollars we now spend on games, along with the skills of those who have become expert at creating the entertainment and diversions that are the essence of the games industry's success.

We’re also going to have to re-create the role of the teacher. I suspect that this is going to be the hardest problem. I fear that we are going to have to wait for teachers, like Tracy, to move into leadership roles before the revolution really gains momentum.

The learning revolution doesn’t stop at school. The collaboration tools that are the essence of the inverted classroom are the same tools that will revolutionize the business enterprise. More on that in a future blog.

Common Sense at the FCC

I was fortunate enough to hear an amazing speech by FCC Commissioner Jessica Rosenworcel at the Wi-Fi Forward meeting only a few months ago. It was unusual for a commissioner to so concisely explain, in common sense terms, how the wide variety of wireless communications users creates a need for different technologies and different allocations as the FCC manages the radio frequency spectrum. The Commissioner pointed out, among other things, that the various spectrum candidates for new allocation of unlicensed spectrum have varying needs and purposes, and many different applications. She also made it clear that there is a role, in spectrum allocation, for both licensed and unlicensed use of the radio frequency spectrum.

I strongly support these views, as well as the other points that she made. Opening up spectrum for unlicensed use will offer an opportunity for enhanced public value, but only if there is innovation that increases the capacity of wireless spectrum and lowers the cost to the public of deployment and service.

The 802.11 standard has served us well, but it was not designed for large numbers of contending users, and it uses spectrum very inefficiently. It makes up for some of this by limiting power output so that the spectrum is reused geographically, but the collision-avoidance approach used in 802.11 is terribly wasteful. When even relatively small numbers of users try to operate in one band, a growing number of collisions occur, each requiring re-transmission. This chews up unacceptable amounts of spectrum.
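The contention problem is easy to see in a toy model. The sketch below is a highly simplified, slotted abstraction of 802.11-style random access (real CSMA/CA uses carrier sensing and binary exponential backoff; the fixed transmit probability is my simplifying assumption), but it shows the trend described above: as contenders multiply, the fraction of airtime lost to collisions climbs.

```python
# Toy slotted random-access model: each of n stations transmits in a
# slot with probability p. A slot is useful only if exactly one transmits.
def slot_outcomes(n, p):
    idle = (1 - p) ** n
    success = n * p * (1 - p) ** (n - 1)
    collision = 1 - idle - success
    return success, collision

p = 0.1  # per-slot transmit probability (simplifying assumption)
for n in (2, 5, 10, 25, 50):
    success, collision = slot_outcomes(n, p)
    print(f"{n:3d} stations: useful slots {success:5.1%}, collisions {collision:5.1%}")
# At 50 stations, under 3% of slots carry useful traffic; nearly all
# airtime is consumed by collisions and the re-transmissions they force.
```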

802.11 can be fixed, and in a backward-compatible way. There are techniques available TODAY that could use any new spectrum far more effectively and that could be brought to market in a time frame compatible with the FCC's regulatory efforts. These techniques include, but are not limited to:

1) More organized sharing and handling of collision avoidance,

2) Smart antenna techniques – spatial processing and interference mitigation – that increase capacity and reduce cost (see the sketch after this list), and

3) Meshed networking techniques that, while somewhat reducing spectral efficiency, reduce backhaul costs.
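As one concrete illustration of the smart-antenna item above, here is a minimal sketch of receive beamforming with maximum-ratio combining, in Python with NumPy. The four-antenna array and the random channel are illustrative assumptions, not a specific product or standard:

```python
import numpy as np

rng = np.random.default_rng(0)

n_antennas = 4                        # illustrative array size
h = (rng.normal(size=n_antennas)      # complex channel gain per antenna
     + 1j * rng.normal(size=n_antennas)) / np.sqrt(2)

# Maximum-ratio combining: weight each antenna by the conjugate of its
# channel gain so the contributions add coherently.
w = np.conj(h) / np.linalg.norm(h)

# Array SNR relative to a single antenna: MRC delivers the sum of all
# antenna powers, versus the average power of one antenna alone.
mrc_snr = np.abs(w @ h) ** 2          # = ||h||^2
single_snr = np.mean(np.abs(h) ** 2)

print(f"MRC SNR gain over one antenna: {mrc_snr / single_snr:.1f}x")
# With n antennas the gain is n (here 4x): the same spectrum carries
# more bits, which is exactly the efficiency argument made above.
```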

Further, there is no one system solution that is optimum for every class of user. Given the right opportunity and stimulus for innovation, another powerful form of efficiency can be achieved by matching categories of user requirements to technical attributes like range, building penetration, reliability, coverage, and speed, as well as to the social priority of different categories of use. It may be appropriate to allocate separate bands of spectrum for different applications and standards.

What would I do if I were the FCC?

I would make new spectrum allocations contingent upon industry proposals for system approaches that are responsive to the burgeoning need for more bandwidth at lower cost. An allocation that uses spectrum in the same old way is wrong. Spectrum-efficient technology is continually advancing, and the FCC, as the manager of the publicly owned spectrum, should require periodic adoption of such advancements.

I would task industry advisory groups to work with the FCC to establish technical standards for spectral efficiency and perhaps even for user cost.

I would encourage trials, perhaps even competitive ones, that offer real-world proof of performance of system candidates.

I would reach out to impartial organizations like the National Academies, the IEEE, and the WWRF to create a periodically updated National Spectrum Technology Roadmap, to provide decision makers at the FCC and Congress with a multigenerational baseline for their decision making and stimulus of innovation.

I would undertake the enormous task of using the National Roadmap to solve the looming problem of making very low-cost bandwidth available for essential services.

I look forward to seeing what dynamic and visionary leaders like Jessica Rosenworcel will do at the FCC to execute on some of my suggestions. After all, these suggestions are just common sense.


Wireless Healthcare: A Revolution

Can you imagine a society in which diseases, and the pain and suffering they cause, simply do not exist, in which people are healthy until end of life? Such a society is within the realm of scientific possibility, but it won't happen with the existing approaches to health care. Our health care system is broken. Its problems are far more profound and difficult than the short-term financial and social problems that Obamacare promises to fix. Of course, we have to get to universal health care in some fashion, but we have to get there in a way that is affordable and sustainable. The way we go about it today is not viable.

U.S. expenditure on health care is approaching 20% of gross national product and, if something isn't done, it will REACH 30% of GNP within 10 years. We can't afford that. Despite this escalating, out-of-control spending, people are not living more healthfully. As quickly as we cure one disease, or its symptoms, a new one erupts. The drugs we produce help some people but are ineffective or harmful to others. We are forced to create new antibiotics as bacteria become resistant to the old ones. And a large percentage of our health care expenditure goes to extending the lives of elderly people who are left enduring unacceptable lifestyles.

What’s the problem?

The approach of the medical establishment to health care is to cure disease, and yet the potential exists, given the appropriate focus and funding, to eliminate most diseases. I didn’t say cure, I said eliminate. I have been told by responsible authorities that every disease is actionably preventable. What a powerful statement this is.

From the time of birth, the human body is a mess. Our bodies, even the healthy ones, are loaded with viruses, bad bacteria, toxins, and mutated or otherwise corrupted cells. Similarly, our environment is filled with undesirable pollutants. When our immune system and other repair-or-replace mechanisms are working, they keep these baddies in check; we say, in this case, that our bodies are "healthy." When we exhibit any of a myriad of combinations of symptoms, we describe our bodies as "diseased." Modern medicine, for the most part, treats these symptoms. But neither of these characterizations, "healthy" or "diseased," is absolute. A person may be infected for days with a rhinovirus, for example, before feeling an itchy throat and starting to sneeze – typical symptoms of the onset of a common cold. Cancer cells can multiply in an otherwise healthy body for years before showing any overt evidence of their existence. The threshold between healthy and diseased doesn't really exist; it's a continuum.

To further complicate matters, the effort of the body's repair systems to fight a specific escalating threat, such as an inhaled airborne virus, dilutes their ability to manage other issues that are otherwise benign. The immune system, for example, becomes less effective at keeping those other issues in check because it has been "overloaded."

We try to detect these nascent diseases and stop them by having periodic medical examinations. Insofar as these exams discipline people to see a doctor, and give that doctor a baseline picture of their health, they can be useful. However, for the purpose of anticipating diseases early enough to stop them, they are almost worthless. Why?

We humans come in all shapes, sizes, and varieties. A low pulse rate can be a sign of excellent health for one person and a danger signal for another. A particular blood pressure can be perfectly normal for one individual and life-threatening for another. Our DNA programs us differently as to our susceptibility to the undesirable elements in our bodies and in the environment. Modern medicine tends to define normal ranges that are too wide to be useful in describing the early onset of disease. To do this properly, there has to be a detailed picture of what is “normal” for each individual.

Suppose it were possible to do a complete physical examination on a person every minute. Suppose the sensors that measure a person's vital signs were sensitive enough to detect the onset of diabetes, or a few cancer cells, or the beginnings of inflammation, before any part of that person's body was affected. What if the data from that exam could be correlated with the person's history, genome, and a database comprising the histories of hundreds of millions of other people? If we could do these things, the treatment of these nascent diseases could be targeted with precision, and be far less invasive, than if the disease were allowed to progress to the point that the person was aware of it. Consider the onset of a cancer. Detection is the hard part. Once we know where the few cancer cells are, it is often possible to zap them with a laser beam without affecting surrounding tissue. We pretty much know how to do the latter; the hard part is the detection.
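A minimal sketch of the per-individual baseline idea, in Python. The vital-sign stream, the baseline window, and the three-sigma threshold are all illustrative assumptions, not a clinical protocol:

```python
import statistics

def personal_alert(history, reading, sigma=3.0):
    """Flag a reading that deviates from THIS person's own baseline,
    rather than from a population-wide 'normal range'."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    z = (reading - mean) / stdev
    return abs(z) > sigma, z

# Illustrative data: one person's resting pulse, sampled minute by minute.
baseline = [52, 54, 53, 51, 55, 53, 52, 54, 53, 52]

for pulse in (54, 56, 68):
    alert, z = personal_alert(baseline, pulse)
    status = "ALERT" if alert else "ok"
    print(f"pulse {pulse}: z = {z:+.1f} -> {status}")
# A pulse of 68 is 'normal' by population standards, but for this
# individual it is far outside baseline and would trigger a closer look.
```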

This isn't science fiction, it's science. The key technologies are detection and connection. While today's science is about six orders of magnitude away from the ability to detect a few cancer cells, scientists are zeroing in on diseases like congestive heart failure, obesity and diabetes. Incubators at PhiloMetron and West Wireless, companies like Sotera Wireless, and hospitals like Palomar Medical Center are creating and testing practical devices, stimulated by medical visionaries like Dr. Ben Kanter at Palomar. Qualcomm is conducting an X Prize competition whose winner will create a Star Trek-like tricorder. Deloitte predicts the U.S. market for devices like these will grow to $22 billion by 2015.

Wireless connection is available now, although the cost of service will become an issue.

Wireless medical health is destined to become the next technology revolution.

It’s Time for Cell Phones to Get Really Smart

The competition in smart phones has become frenetic. Samsung has gesture control, but it doesn't work very well. The iPhone has Siri, but that feature has turned out to be just a toy. Now HTC has a new phone that is "gorgeous." Motorola, still struggling to stay in the game, has the RAZR Max with the longest battery life. The science behind all of these phones is magnificent, but do they represent good technology? I don't think so!

Technology is the application of science to create goods and services that make people's lives better. The best technology is invisible; you don't even know it's there, but it improves your life in some meaningful way. Almost as good is transparent technology; you engage it when you need it but don't really have to think about it. Acceptable technology is intuitive; the things you do to engage it are natural and unobtrusive, and you don't need an instruction manual to make it work. Smart cell phones have turned those of us who really use them into engineers: we are diverted from the useful things we're trying to do into the distraction of understanding the cell phone.

I don’t fault the manufacturers for creating this wonderful hardware, but now that we have it, it’s time for them to focus on making our lives better. There is already some interesting progress and we are going to see an explosion of usefulness in the next few years. I count as “progress” elimination of the charging plug (Nokia), simplified voice control of navigation (Google), simple rules-based triggers (Smart Actions by Motorola), and smart cameras (Blink by Microsoft), but these are only the start.

I'm hoping and counting on Motorola to shake things up this summer by offering features on a smart phone that improve our lives without intruding.
