Tony Fadell: The Nest Thermostat Disrupted My Life - IEEE Spectrum



The Nest founder tells of years in pursuit of a thermostat he actually likes

Tony Fadell shows off the Nest thermostat in 2012.

The thermostat chased me for 10 years.

That is pretty extreme, by the way. If you’ve got an idea for a business or a new product, you usually don’t have to wait a decade to make sure it’s worth doing.

For most of the 10 years that I idly thought about thermostats, I had no intention of building one. It was the early 2000s, and I was at Apple making the first iPhone. I got married, had kids. I was busy.

But then again, I was also really cold. Bone-chillingly cold.

Every time my wife and I drove up to our Lake Tahoe ski cabin on Friday nights after work, we’d have to keep our snow jackets on until the next day. The house took all night to heat up.

Adapted from the book BUILD: An Unorthodox Guide to Making Things Worth Making by Tony Fadell. Copyright 2022 by Tony Fadell. Reprinted by permission of Harper Business, an imprint of HarperCollins Publishers.

Walking into that frigid house drove me nuts. It was mind-boggling that there wasn’t a way to warm it up before we got there. I spent dozens of hours and thousands of dollars trying to hack security and computer equipment tied to an analog phone so I could fire up the thermostat remotely. Half my vacations were spent elbow-deep in wiring, electronics littering the floor. But nothing worked. So the first night of every trip was always the same: We’d huddle on the ice block of a bed, under the freezing sheets, watching our breath turn into fog until the house finally warmed up by morning.

Then on Monday I’d go back to Apple and work on the first iPhone. Eventually I realized I was making a perfect remote control for a thermostat. If I could just connect the HVAC system to my iPhone, I could control it from anywhere. But the technology that I needed to make it happen—reliable low-cost communications, cheap screens and processors—didn’t exist yet.

A year later we decided to build a new, superefficient house in Tahoe. During the day I’d work on the iPhone, then I’d come home and pore over specs for our house, choosing finishes and materials and solar panels and, eventually, tackling the HVAC system. And once again, the thermostat came to haunt me. All the top-of-the-line thermostats were hideous beige boxes with bizarrely confusing user interfaces. None of them saved energy. None could be controlled remotely. And they cost around US $400. The iPhone, meanwhile, was selling for $499.

How did these ugly, piece-of-crap thermostats cost almost as much as Apple’s most cutting-edge technology?

The architects and engineers on the Tahoe project heard me complaining over and over about how insane it was. I told them, “One day, I’m going to fix this—mark my words!” They all rolled their eyes—there goes Tony complaining again!

At first they were just idle words born of frustration. But then things started to change. The success of the iPhone drove down costs for the sophisticated components I couldn’t get my hands on earlier. Suddenly high-quality connectors and screens and processors were being manufactured by the millions, cheaply, and could be repurposed for other technology.

My life was changing, too. I quit Apple and began traveling the world with my family. A startup was not the plan. The plan was a break. A long one.

We traveled all over the globe and worked hard not to think about work. But no matter where we went, we could not escape one thing: the goddamn thermostat. The infuriating, inaccurate, energy-hogging, thoughtlessly stupid, impossible-to-program, always-too-hot-or-too-cold-in-some-part-of-the-house thermostat.

Someone needed to fix it. And eventually I realized that someone was going to be me.

This 2010 prototype of the Nest thermostat wasn’t pretty. But making the thermostat beautiful would be the easy part. The circuit board diagrams point to the next step—making it round. Tom Crabtree

The big companies weren’t going to do it. Honeywell and the other white-box competitors hadn’t truly innovated in 30 years. It was a dead, unloved market with less than $1 billion in total annual sales in the United States.

The only thing missing was the will to take the plunge. I wasn’t ready to carry another startup on my back. Not then. Not alone.

Then, magically, Matt Rogers, who’d been one of the first interns on the iPod project, reached out to me. He was a real partner who could share the load. So I let the idea catch me. I came back to Silicon Valley and got to work. I researched the technology, then the opportunity, the business, the competition, the people, the financing, the history.

Making it beautiful wasn’t going to be hard. Gorgeous hardware, an intuitive interface—that we could do. We’d honed those skills at Apple. But to make this product successful—and meaningful—we needed to solve two big problems:

It needed to save energy.

And we needed to sell it.

In North America and Europe, thermostats control half a home’s energy bill—something like $2,500 a year. Every previous attempt to reduce that number—by thermostat manufacturers, by energy companies, by government bodies—had failed miserably for a host of different reasons. We had to do it for real, while keeping it dead simple for customers.

Then we needed to sell it. Almost all thermostats at that point were sold and installed by professional HVAC technicians. We were never going to break into that old boys’ club. We had to find a way into people’s minds first, then their homes. And we had to make our thermostat so easy to install that literally anyone could do it themselves.

It took around 9 to 12 months of making prototypes and interactive models, building bits of software, talking to users and experts, and testing it with friends before Matt and I decided to pitch investors.

Once we had prototypes of the thermostat, we sent them out to real people to test.

It was fatter than we wanted. The screen wasn’t quite what I imagined. Kind of like the first iPod, actually. But it worked. It connected to your phone. It learned what temperatures you liked. It turned itself down when nobody was home. It saved energy. We knew self-installation was potentially a huge stumbling block, so everyone waited with bated breath to see how it went. Did people shock themselves? Start a fire? Abandon the project halfway through because it was too complicated? Soon our testers reported in: Installation went fine. People loved it. But it took about an hour to install. Crap. An hour was way too long. This needed to be an easy DIY project, a quick upgrade.

So we dug into the reports—what was taking so long? What were we missing?

Turns out we weren’t missing anything—but our testers were. They spent the first 30 minutes looking for tools—the wire stripper, the flathead screwdriver; no, wait, we need a Phillips. Where did I put that?

Once they gathered everything they needed, the rest of the installation flew by. Twenty, 30 minutes tops.

I suspect most companies would have sighed with relief. The actual installation took 20 minutes, so that’s what they’d tell customers. Great. Problem solved.

But this was going to be the first moment people interacted with our device. Their first experience of Nest. They were buying a $249 thermostat—they were expecting a different kind of experience. And we needed to exceed their expectations. Every minute from opening the box to reading the instructions to getting it on their wall to turning on the heat for the first time had to be incredibly smooth. A buttery, warm, joyful experience.

And we knew Beth. Beth was one of two potential customers we defined. The other customer was into technology, loved his iPhone, was always looking for cool new gadgets. Beth was the decider—she dictated what made it into the house and what got returned. She loved beautiful things, too, but was skeptical of supernew, untested technology. Searching for a screwdriver in the kitchen drawer and then the toolbox in the garage would not make her feel warm and buttery. She would be rolling her eyes. She would be frustrated and annoyed.

Shipping the Nest thermostat with a screwdriver “turned a moment of frustration into a moment of delight.” Dwight Eschliman

So we changed the prototype. Not the thermostat prototype—the installation prototype. We added one new element: a little screwdriver. It had four different head options, and it fit in the palm of your hand. It was sleek and cute. Most importantly, it was unbelievably handy.

So now, instead of rummaging through toolboxes and cupboards, trying to find the right tool to pry their old thermostat off the wall, customers simply reached into the Nest box and took out exactly what they needed. It turned a moment of frustration into a moment of delight.

Sony laughed at the iPod. Nokia laughed at the iPhone. Honeywell laughed at the Nest Learning Thermostat.

In the stages of grief, this is what we call Denial.

But soon, as your disruptive product, process, or business model begins to gain steam with customers, your competitors will start to get worried. And when they realize you might steal their market share, they’ll get pissed. Really pissed. When people hit the Anger stage of grief, they lash out, they undercut your pricing, try to embarrass you with advertising, use negative press to undermine you, put in new agreements with sales channels to lock you out of the market.

And they might sue you.

The good news is that a lawsuit means you’ve officially arrived. We had a party the day Honeywell sued Nest. We were thrilled. That ridiculous lawsuit meant we were a real threat and they knew it. So we brought out the champagne. That’s right, f---ers. We’re coming for your lunch.

With every generation, the product became sleeker, slimmer, and less expensive to build. In 2014, Google bought Nest for $3.2 billion. In 2016 Google decided to sell Nest, so I left the company. Months after I left, Google changed its mind. Today, Google Nest is alive and well, and they’re still making new products, creating new experiences, delivering on their version of our vision. I deeply, genuinely, wish them well.

Tony Fadell started his thirty-plus-year Silicon Valley career at General Magic, went on to make the iPod and iPhone, and started Nest. He now leads investment and advisory firm Future Shape.

Not a very accurate narrative. Retrofit programmable thermostats were sold at big box hardware stores long before the Nest was introduced. I replaced my old dumb thermostat with one several houses ago. He could easily have put in a thermostat programmed to have his cabin warm when he arrived for the weekend. It wouldn't have all the bells and whistles of a Nest, but he wouldn't have been freezing for the first day of the weekend. And of course suitable connectors, processors, and the like for a thermostat were available long before the iPhone was a thing.

I was an early adopter of smart thermostats. My dirt-simple Honeywell gold plastic circular thing with a bit of mercury on a bimetal coil was taken out of service and I replaced it with one of those programmable thermostats and I spent an evening reprogramming it. I presented my efforts to my wife and she couldn't make heads or tails of what she had to do to warm things up or cool things off. A twist of the dial was replaced by a flurry of button presses and a hard-to-read screen. And then the batteries died just after I forgot how to reprogram the thing and/or lost the booklet. We went through three or four such thermostats. My wife, strangely, did not divorce me. When the Nest came out, we had been burned several times, it wasn't fun, and we both longed for the simple Honeywell gold plastic circular thing. Nevertheless, I installed the Nest, programmed it, and showed my wife how to set the temperature.

She cried tears of joy. I felt like a skunk for the prior decade and a half of unnecessary vexation I caused her.

It's not enough to introduce the technology; you also have to make it sophisticated enough to be simple to operate.

The University of Texas professor co-invented discrete cosine transform

Kamisetty Ramamohan “K.R.” Rao died on 15 January 2021 at the age of 89. He co-invented the discrete cosine transform (DCT) technique, which is widely used in digital signal processing and data compression.

Rao was a professor of electrical engineering at the University of Texas at Arlington for more than 50 years.

This tribute is an excerpted version of an article dedicated to his memory written by three of his colleagues: IEEE Member Jae Jeong Hwang, Zoran M. Milicevic, and IEEE Life Senior Member Zoran S. Bojković. Hwang is a professor of IT convergence and communication engineering at Kunsan National University, in Korea; Milicevic is an assistant professor of telecommunications and IT at the University of Belgrade, in Serbia; and Bojković is a professor of electrical engineering, also at the University of Belgrade.

Rao received a bachelor’s degree in electrical engineering in 1952 from the College of Engineering, Guindy, in Chennai, India. He then moved to the United States and earned two master’s degrees from the University of Florida, in Gainesville: one in EE in 1959 and the other in nuclear engineering in 1960. He received a Ph.D. in 1966 in EE from the University of New Mexico, in Albuquerque.

After graduating, he joined UT Arlington as a research professor. He was promoted to associate professor three years later and became a full professor in 1973.

In the early 1970s, he began working with Nasir Ahmed, professor emeritus of electrical and computer engineering at the University of New Mexico, to develop DCT. They presented their results in the article “Discrete Cosine Transform,” published in the January 1974 IEEE Transactions on Computers.

Similar to the discrete Fourier transform, the DCT converts a signal or image from the spatial domain (a matrix of pixels) to the frequency domain (in which images are represented by mathematical functions).

DCT technology reduces the amount of data required to display, store, and transmit images by identifying parts of the image that contain significant amounts of energy—the ones that are most important to retaining image quality.
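This energy-compaction property is easy to see in code. The sketch below is a minimal, unnormalized DCT-II and its inverse written directly from the textbook definition (not Rao and Ahmed's original formulation, and far slower than the fast algorithms used in practice): a smooth signal's energy lands in the first few coefficients, so most of the rest can be discarded with little reconstruction error.

```python
import math

def dct_ii(x):
    """Unnormalized DCT-II: X[k] = sum_n x[n] * cos(pi/N * (n + 0.5) * k)."""
    N = len(x)
    return [sum(x[n] * math.cos(math.pi / N * (n + 0.5) * k) for n in range(N))
            for k in range(N)]

def idct_ii(X):
    """Inverse of the transform above:
    x[n] = X[0]/N + (2/N) * sum_{k>=1} X[k] * cos(pi/N * (n + 0.5) * k)."""
    N = len(X)
    return [X[0] / N + 2.0 / N * sum(X[k] * math.cos(math.pi / N * (n + 0.5) * k)
                                     for k in range(1, N))
            for n in range(N)]

# A smooth 16-sample signal: its energy concentrates in low frequencies.
signal = [math.cos(math.pi * n / 16) for n in range(16)]
coeffs = dct_ii(signal)

# "Compress" by keeping only the first 4 of 16 coefficients.
kept = [c if k < 4 else 0.0 for k, c in enumerate(coeffs)]
approx = idct_ii(kept)
max_err = max(abs(a - s) for a, s in zip(approx, signal))  # stays small
```

JPEG applies the same idea two-dimensionally, transforming 8-by-8 pixel blocks and quantizing away the high-frequency coefficients the eye barely notices.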

Originally proposed as an image-compression technique, DCT is now an industry standard in image and video coding, commonly used to store and transmit JPEG images as well as MPEG video files. DCT also has applications in digital video and television, speech coding, satellite imaging, signal processing, and telecommunications.

Rao went on to develop four different types of the technology: DCT-I, DCT-II (used in image and video compression including high-definition television), DCT-III, and DCT-IV (which has applications in audio coding algorithms).

“HDTV would not have been possible without the research accomplished by K.R. Rao and his students and collaborators,” said Venkat Devarajan, a former Rao student who is now an EE professor at UT Arlington.

Rao co-authored 22 books, some of which have been translated from English to Chinese, Japanese, Korean, Russian, and Spanish. He also published papers on Walsh functions and a variety of other topics related to image and signal processing.

During his tenure at UT Arlington, he advised more than 100 graduate students. He was a member of the university’s Academy of Distinguished Scholars.

He was a visiting professor at universities in Australia, India, Japan, Korea, Singapore, and Thailand. He also conducted workshops and tutorials on video and audio coding and standards around the world.

“Everyone speaks about him in the highest regard—not just as a scholar but as a mentor, a friend, a person who helped them, and a person who encouraged them,” said Vistasp Karbhari, an engineering professor and former president of UT Arlington. “I think that’s his legacy.”

A small band of believers triumphed after years of quietly plugging away

Rodney Brooks is the Panasonic Professor of Robotics (emeritus) at MIT, where he was director of the AI Lab and then CSAIL. He has been cofounder of iRobot, Rethink Robotics, and Robust AI, where he is currently CTO.

In 1997, Harvard Business School professor Clayton Christensen created a sensation among venture capitalists and entrepreneurs with his book The Innovator's Dilemma. The lesson that most people remember from it is that a well-run business can’t afford to switch to a new approach—one that ultimately will replace its current business model—until it is too late.

One of the most famous examples of this conundrum involved photography. The large, very profitable companies that made film for cameras knew in the mid-1990s that digital photography would be the future, but there was never really a good time for them to make the switch. At almost any point they would have lost money. So what happened, of course, was that they were displaced by new companies making digital cameras. (Yes, Fujifilm did survive, but the transition was not pretty, and it involved an improbable series of events, machinations, and radical changes.)

A second lesson from Christensen’s book is less well remembered but is an integral part of the story. The new companies springing up might get by for years with a disastrously less capable technology. Some of them, nevertheless, survive by finding a new niche they can fill that the incumbents cannot. That is where they quietly grow their capabilities.

For example, the early digital cameras had much lower resolution than film cameras, but they were also much smaller. I used to carry one on my key chain in my pocket and take photos of the participants in every meeting I had. The resolution was way too low to record stunning vacation vistas, but it was good enough to augment my poor memory for faces.

This lesson also applies to research. A great example of an underperforming new approach was the second wave of neural networks during the 1980s and 1990s that would eventually revolutionize artificial intelligence starting around 2010.

Neural networks of various sorts had been studied as mechanisms for machine learning since the early 1950s, but they weren’t very good at learning interesting things.

In 1979, Kunihiko Fukushima first published his research on something he called shift-invariant neural networks, which enabled his self-organizing networks to learn to classify handwritten digits wherever they were in an image. Then, in the 1980s, a technique called backpropagation was rediscovered; it allowed for a form of supervised learning in which the network was told what the right answer should be. In 1989, Yann LeCun combined backpropagation with Fukushima's ideas into something that has come to be known as convolutional neural networks (CNNs). LeCun, too, concentrated on images of handwritten digits.

In 2012, the poor cousin of computer vision triumphed, and it completely changed the field of AI.

Over the next 10 years, the U.S. National Institute of Standards and Technology (NIST) came up with a database, which was modified by LeCun, consisting of 60,000 training digits and 10,000 test digits. This standard test database, called MNIST, allowed researchers to precisely measure and compare the effectiveness of different improvements to CNNs. There was a lot of progress, but CNNs were no match for the entrenched AI methods in computer vision when applied to arbitrary images generated by early self-driving cars or industrial robots.

But during the 2000s, more and more learning techniques and algorithmic improvements were added to CNNs, leading to what is now known as deep learning. In 2012, suddenly, and seemingly out of nowhere, deep learning outperformed the standard computer vision algorithms in a set of test images of objects, known as ImageNet. The poor cousin of computer vision triumphed, and it completely changed the field of AI.

A small number of people had labored for decades and surprised everyone. Congratulations to all of them, both well known and not so well known.

But beware. The message of Christensen’s book is that such disruptions never stop. Those standing tall today will be surprised by new methods that they have not begun to consider. There are small groups of renegades trying all sorts of new things, and some of them, too, are willing to labor quietly and against all odds for decades. One of those groups will someday surprise us all.

I love this aspect of technological and scientific disruption. It is what makes us humans great. And dangerous.

This article appears in the July 2022 print issue as “The Other Side of The Innovator’s Dilemma.”