The internet of things (IoT) is changing the way we interact with the world around us. Everyone — and everything — is connected and soon will be interconnected. Microelectromechanical systems (MEMS) devices and sensors play an essential role in collecting, monitoring, and analyzing data, often in real time.
In an interview with EE Times Europe, Peter Hartwell, CTO of TDK InvenSense, laid out a future in which IoT technologies transcend the individual experience and become invisible. It’s a future in which sensors are the glue between the virtual and the real worlds.
EE Times Europe: [Last year], you were inducted into the SEMI-MEMS & Sensors Industry Group Hall of Fame, a distinction that acknowledges your substantial and lasting impact on this industry. How do you feel about it?
Peter Hartwell: If I had to summarize in one word how I feel: old. I still feel that MEMS is a new industry. I have been doing this literally since high school; I have not deviated from the path. I have watched a lot of people go off and do other things. It is a huge honor to be recognized by your peers for [your] contributions [in] moving the technology forward. So I would say old and honored.
EETE: You have more than 25 years of experience in commercializing silicon MEMS and working on advanced sensors and actuators. You also hold more than 40 patents worldwide on MEMS and sensor applications. What were the main achievements or determining decisions in your career?
Hartwell: For me, it has been looking at how sensors are going to be the driver for change. It was a realization I came to early. My first MEMS job was at Hewlett-Packard, and from there, it was, “What is a computer company going to do with sensors?” We were building these brains, and they were blind, deaf, and numb to what was going on in the world. Sensors gave them the ability to see the world and eventually to interact with the world. It seems so natural to us now, and we have the compute power behind that to interpret commands as simple as changing the music. We are realizing that sensors are going to be the interface between the digital world and the real world.
My second job was at Apple. There, I was able to see the impact of making technology accessible to the masses, rather than something esoteric, just for the geeks and the early adopters.
One of my favorite memories was literally in a restaurant [where] I watched a grandfather and his granddaughter looking at something on an iPhone. You literally had a 70-something and a four-something using a computer, and we had enabled that. [Just] four years before, neither of them would have been computer users. [Now], not only were they using a computer, but they were having such a good experience that the mum took a picture of them.
The technology transcends and becomes transparent. So when I look at where I am trying to go, moving forward, it’s … how we make that technology just disappear into [the environment] around us to where we are no longer surprised that technology worked or did something. It just becomes natural.
Coming back to the voice interface concept, the computer keyboard is 140 years old, if you go back to the time that typewriters were invented. My son, who is eight, looks at a keyboard [and] sees only one button. It is the Siri button or the Ok Google button. [His expectation is,] “I am just going to push it and talk to it, and it just works.” With that expectation, what is it going to look like when the technologies just disappear into the background and we are more efficient and have better and safer experiences?
EETE: In 2017, TDK made its sensor ambitions clear when it acquired InvenSense and its strong software team, well-versed in AI, predictive control, and analysis of movement. How has InvenSense technology been leveraged in TDK’s smartphone and IoT businesses?
Hartwell: It has been a beneficial relationship.
We were a young fabless company. We very much understood the sensors and the importance of the software. TDK is a materials and manufacturing company, driven fundamentally by quality. There is a Japanese word we use, monozukuri, which means “if you want to build something, build it well.” We now talk about the concept of kotozukuri, which is the idea that if you want to build something, build it well but build it with purpose: try to do something for the customer, and understand the customer’s needs. It allows us to look across the whole vertical, from the raw materials that go into building something, which are fundamental to quality and performance, all the way up to how that experience is going to affect our customers and our customers’ customers.
It has been a great mix of two different competencies and strengths. Together, we are much more than [we were] when apart because we get that system level now across all sectors, whether it is in IoT, in automotive, or in consumer electronics. We have a better toolset to attack the whole market.
EETE: Looking ahead, what are the next steps in TDK’s InvenSense expansion in terms of product development and technology roadmap?
Hartwell: We are doing a lot of work right now on ultrasonics. Before the acquisition, inside InvenSense, we had a very nice ultrasonic fingerprint sensor, acquired by TDK [when it purchased InvenSense]. We have [since] acquired another company, Chirp. It is a great fit inside TDK because of TDK’s knowledge of ultrasound materials. [The Chirp business] is one of the very big business units inside TDK. So it is just the perfect marriage of new materials and new MEMS devices coming together, and I think that is where we are seeing the application space explode. We heard at [last year’s MEMS & Sensors Executive Congress] that electrostatic actuators have gone far, and now the piezoelectric stuff is maybe going to take us to the next step.
For me, personally, I like the expansion because we are looking not just at sensors now, but we are [also] looking at actuators, [with the] Chirp device putting out a sound and then looking at the echo coming back. It’s both the sensors and actuators.
Back to my original comment on sensors making computers smarter in sensing what’s going on, if you can manipulate what’s going on, that’s really the next step toward actuation. For us, ultrasound is a first step toward actuation, toward smart systems being able to influence stuff in the physical world. That’s the direction we are looking at right now.
EETE: As a CTO, you are probably keeping an eye on what’s emerging from research labs. Which interesting MEMS and sensor technologies, excluding TDK’s, have you seen recently?
Hartwell: To me, one of the places is medtech — not necessarily the bioMEMS to replace laboratory equipment, but more mass market, with things we will be able to do at home or with a wearable device. That’s what I am keeping an eye on. It is logical to say that somewhere between 20 and 40 years from now, you are going to be wearing [a continuous] diagnostic monitoring system. The question is how you get there — what steps we need to take to get us there.
The other one is the explosion of optical sensors. We are seeing this in LiDAR and in structured-light 3D imaging. What is neat about these technologies is that a lot has been developed for automotive. The reason we all have accelerometers in our phones now is that they started out in automotive.
You are going to start having toys with radar. I can’t wait to see what the toy guys think of when suddenly you can put a radar in a toy robot. I can’t wait to see where optical 3D sensing, radar, and LiDAR are going to go, especially in the toy space.
EETE: At the MEMS & Sensors Executive Congress, the Technology Showcase amplified awareness of the latest MEMS and sensor technologies and applications, including a DNA search engine and a 4D LiDAR for autonomous vehicles, as well as wearable biosensors for health care. Any takeaways on the technology you would like to share with us? Has one of the startups caught your attention?
Hartwell: You get to see people who believe in an idea so much, they are going to risk their ability to put food on the table to pursue that idea. I am very much into AR and VR right now. I believe the way we are going to record experiences and share them with people is really what is going to bring VR into the non-gamers’ hands. Gaming will always be there, but that is not what is going to drive the market. I am trying to see how we can solve that problem … Where are the sensors? How are we going to get to content creation for VR? That’s where we are stuck right now. You can buy a headset, but you don’t know what to watch. It’s like the days of black-and-white TV. If you go back to the early TV shows, [they were] vaudeville, because that was entertainment. It took 70 years to come up with “Game of Thrones.”
So the question is how we accelerate [so that] VR becomes a platform to consume and experience what we could not do before. I think it’s going to be travel; it’s going to be shopping [and] personal content. I have been able to digitize myself skiing and [then] put my dad skiing with me and his grandson in VR. He took the headsets off, looked at me, and went, “I never thought I would get to go skiing with my grandson.” [He didn’t think,] “That was a cool picture you showed me,” [but,] “I went skiing with my grandson.” That is really how he felt.
EETE: A sense of mission and purpose filters through your words; you seem intent on adding some humanity to everything you develop.
Hartwell: It should be natural for us to do that. We do form relationships with things that aren’t alive. As they begin to talk and react to us, it is only natural for us to make those experiences with those devices, [with] something we can relate to. To me, sensors are the glue between the virtual world and the real world, so we are trying to give robots sensors to put them into our world, which means they have to see, and they have to hear, to feel and smell. And we are trying to put people in a virtual world, which means you have to create the content. Sensors are the glue there.
The next step is when smart speakers follow you around the house. If the Sony Aibo was able to do all the things that Alexa could do, suddenly you would have the perfect pet, a companion. It would follow you, charge itself. It would change the music, take care of the lights, and if you fall down, it would come over and see if you are OK and call for help. That’s what we are going to have, and I can’t wait.
Autonomous companions, that’s how I would like to [characterize] them. And the technology is going to disappear.
>> This article was originally published on our sister site, EE Times Europe.