
Around TI

The company blog of Texas Instruments.

Fueling the next generation of advanced driver assistance systems

Tue, 02/11/2020 - 2:00am


Automated parking. Automatic emergency braking. Adaptive cruise control. Driver assistance features once reserved for luxury vehicles are expanding to more mainstream vehicles to bring next-level autonomy and advanced driver assistance systems (ADAS) to your daily driver.

As new models grow smarter – learning, connecting, communicating, monitoring, making decisions, entertaining and, of course, helping you drive – vehicle complexity and the computing power required to process the enormous amounts of data behind these advanced features have skyrocketed.

“The road to better ADAS, and eventually autonomy, has turned cars into innovation hubs and put them at the forefront of technological advances,” said Curt Moore, who leads our TI Jacinto processors business.

Learn more about the Jacinto 7 processor platform.

To fuel the next generation of autonomy, our company announced the new low-power, high-performance Jacinto™ 7 processor platform that will allow automobile designers and manufacturers to create better ADAS technology and automotive gateway systems that act as communication hubs. The first two devices in the Jacinto 7 processor platform aim to improve awareness of the car’s surroundings and accelerate the data sharing in the software-defined car – all enabled by a single software platform that developers can use to scale their software investment across multiple vehicle designs.

“We harnessed more than two decades of automotive and functional safety expertise to develop processors with enhanced deep learning capabilities and advanced networking to solve design challenges in ADAS and automotive gateway applications,” Curt said. “These innovations will provide a flexible platform to support the needs of a manufacturer’s vehicle lineup, from high-end luxury cars to the rest of their fleet.”

Accelerating the data highway

Three trends are influencing the evolution of modern vehicles:

  • Improving ADAS technology and migrating to higher levels of automated driving
  • Enhancing the connection to the cloud to enable over-the-air updates, emergency calling and more
  • Electrifying vehicles to reduce emissions, enable higher efficiency and power advanced electronics

Each of these trends requires enormous amounts of data that need to be processed and communicated in real time, securely and safely. Improving ADAS and vehicle automation requires a combination of cameras, radar and possibly LIDAR technology within systems to quickly adapt to the world around them. Communicating data inside and outside the vehicle requires a substantial increase in data processing. Managing and connecting the influx of data inside and outside the car is also critical to enable vehicle electrification.

And features that are growing in popularity – such as car-sharing, fleet management and tracking, car dealers monitoring vehicle health remotely to schedule preventive maintenance, and data collection for improving ADAS – all require a connection to the internet and the cloud. Over-the-air updates will enable users to do everything from updating critical software fixes to refreshing entertainment content on the go.

“The influx of information coming into the car underscores the need for processors or systems-on-chip to quickly and efficiently manage multilevel processing in real time, all while operating within the system’s power budget,” Curt said.

For more information, learn how we’re making ADAS technology more accessible in vehicles.

How intelligent, automated robots on wheels are changing last-mile delivery

Tue, 01/28/2020 - 7:00am

Early last year, students at George Mason University were joined by 25 somewhat unusual new residents. Measuring just under 2 feet tall, Starship Technologies' fleet of boxy wheeled robots was on campus to deliver anything from coffee to sushi.

English major Kendal Denny immediately placed an order through Starship's app, which is paired with the university's meal plan.

“They were this new technology that no one on campus had ever experienced before," she said.

George Mason's executive director of campus retail operations, Mark Kraner, had been struggling with competition from other food delivery services – but managing the university's own human delivery force didn't seem viable.

“It's difficult to make sure you have the right number of people in the right places at the right times,” he said.

Rolling around at 4 mph, typically delivering orders within 15-30 minutes, Starship’s robots have quickly adapted to the campus, and the students have adapted to them, too.


Read our white paper: How sensor data is powering AI in robotics.

“I used it a lot during exam periods when you don't have time to go to the dining hall and stand in the line," said recent graduate Sofya Vetrova. “It's much easier to just order from your phone. You get notified when the delivery is downstairs, so it’s very convenient and less time-consuming."

Tens of thousands of food orders have been delivered so far across campuses nationwide, including at The University of Texas at Dallas. At George Mason, Mark is looking into expanding the service to deliver mail, groceries and bookstore orders.

"Cars are really difficult on campus because parking spaces are rare," he said. “But the robots don't need them, and they can weave easily around students, so they're just like anyone else walking along a sidewalk."

The challenge of the last mile

For George Mason students, the robots simply represent convenient food delivery, but automated delivery could mean much more on a global scale.

According to the Logistics Research Centre of Heriot-Watt University, the last mile – the final stage of delivery from a transportation hub to the customer's home – emits an average of 181 grams of CO2 per delivery.1 And, with the majority of deliveries taking place in highly populated urban areas, congestion is a major concern. Increasing urbanization combined with the growth of e-commerce is only compounding the problem, as urban freight looks set to increase by 40% by 2050.2

“The last mile of delivery is responsible for many of the problems we see with trucks polluting the air and blocking traffic lanes," said Matt Chevrier, a robotics expert with our company. "If we could replace these with smaller robots, which contribute significantly less pollution to the streets and can insert themselves into ordinary traffic, it could have a significant impact on urban air quality and on urban quality of life in general."


Robots in the wild

Unlocking the potential of automated delivery is not without its challenges. The first wave of robotics unfolded in factories and laboratories, taking the form of fixed robotic arms that precisely repeat pre-programmed routines, safely inside fenced-off zones.

As robots are released into the real world and expected to successfully navigate both the diverse obstacles of the urban environment and the unpredictable behaviors of their human co-inhabitants, they need to independently perceive, understand and learn from their surroundings.

Fundamentally, that requires a few things: precise, accurate sensors; fast connection systems analogous to the human nervous system; rapid data processing – often enabled by artificial intelligence – and quick reactions. Starship seems to be achieving all of these feats.

Robots use multiple sensing technologies depending on their size and speed: some use LIDAR, ultrasonic, cameras, radar or a combination of these. LIDAR is often used for high-speed autonomous vehicles that require long braking distances. Starship doesn’t use LIDAR, instead relying on the fusion of its other sensors for navigation and obstacle detection.

Our company’s TI mmWave sensors operate at wavelengths shorter than typical radio waves but longer than those of lasers. This allows the sensor to see in challenging environmental conditions – such as darkness, extremely bright light, dust, rain, snow and extreme temperatures.

TI mmWave sensors also enable accurate detection of transparent objects, such as glass. “TI mmWave brings a lot of advantages and new opportunities by ensuring that if something needs to be detected, it can reliably be detected," Matt said.

Intelligent robots

Detection is only half the story, however. Wheeled robots also need to identify what's in front of them and then judge how best to respond. In dynamic environments, it isn't feasible to await a decision while the raw data is sent to the cloud for processing, which means the machine learning algorithms need to run on the robot itself.

Our company's Sitara™ processors are specifically optimized for the low-power operation of machine learning on the robot itself, enabling mmWave sensor data to be used for accurate categorization in real time. For the longer term, this data can also be uploaded to stationary computer systems, where time constraints and power demands aren't an issue, and used to further train identification algorithms, while also building up a detailed map of the robot's typical routes.

“We've all had situations where the GPS fails on us, or doesn't give us an accurate enough location," Matt said. “Supplementing this with the robot's own map can make navigation much more reliable."

Back at George Mason, the robots have been quickly accepted as part of the student community. “People like to take pictures with them and just watch them, because they are cute," Kendal Denny said. “They're kind of our new mascot."

  1. Logistics Research Centre, Heriot-Watt University.
  2. Supply Chain Dive.

Engineering hope for rare diseases

Tue, 11/26/2019 - 7:00am

Gina and Joseph Hann

Gina Hann walked into her son’s kindergarten class wearing zebra stripes and armed with enough black-and-white snack cakes to build a small fort.

It was national Rare Disease Day, and Gina, an engineer at our company, was on a mission to talk to 6-year-olds about rare diseases. She explained why a zebra – a term in medicine that indicates an unlikely diagnostic possibility – is the perfect mascot for children like her son, Joseph, who was diagnosed with a terminal and degenerative brain disease in 2017.

“When you ask small children what makes a rare-disease child different from them, they give you beautifully candid answers,” Gina said in a blog post. “Things like: ‘He has a wheelchair. He drools. His words sound different. His eyes don’t work.’ Those statements hold no judgment, just honesty.”

As the mother of that rare child, Gina took the opportunity to turn her son’s challenges into a celebration of differences.

“When you ask small children what makes a rare-disease child the same as them, they celebrate the discovery,” she said. “He loves music, and instantly they sing with him. He loves laughter, and they all laugh with him from their bellies. I think kindergartners should rule the world.”

For Gina, good days like these make the difficult fight worth it. Since learning about her son’s disease, Batten, she has been on a nonstop journey to bring hope to other families dealing with a similar diagnosis and to make gene therapy for rare diseases more accessible. Initially, she and her husband, Matt, were told to make end-of-life plans for Joseph since no clinical trial or funded research existed to find a treatment.

“We didn’t know it then, but Joseph’s story was just beginning,” she said.


Learn more about how Gina and Matt helped bring hope to their son and countless others.

‘A relentless process’

Gina and Matt were told that if they wanted a clinical trial for their son, they would need to fund the work themselves, and that would mean raising a million dollars or more just to cover the first steps before a clinical trial could be developed.

Ever the engineers and problem-solvers, Gina and Matt searched for a solution. They founded Joseph’s Foundation for Batten Hope, a nonprofit organization dedicated to raising funds toward a clinical trial for a potential cure and gene therapy work at The Children’s Medical Center of Dallas and University of Texas Southwestern Medical Center.

“Matt and I challenge ourselves to innovate and work for better outcomes by the nature of our jobs and our work environment at TI,” Gina said. “We knew to ask if there could be more out there, to question everything and to always look for what’s next. That’s why we felt compelled to choose the path we did – it was a relentless process, but our work helped to make us uniquely suited for it.”

Gina and Matt have helped raise more than $1.5 million and have located over 20 other families around the world who are in need of treatment for their loved ones. Today, they’re working toward funding the final materials needed for the trial, which they hope will begin in early 2020.

“We don’t know if we’ll have the treatment in time to save Joseph’s life,” Gina said. “But no matter what, Joseph has inspired the work that can save the lives of countless others.”

Commitment to the rare disease community

Gina’s commitment to fighting rare diseases is expanding even further: She has joined efforts with other rare-disease family foundations in the Dallas area to establish a new nonprofit called RARE Dallas, which is focused on connecting and empowering affected families and finding cures. As a result of her outstanding commitment to the community, she was recently recognized with our company’s TI Founders Community Impact Award, which honors our company’s founders and their long history of philanthropy and volunteerism in communities where we live and work.

“I think for a lot of parents, when the doctor tells them to make end-of-life plans for their child with a rare disease, they are so overwhelmed that they don’t stop to ask if there could be other options,” Gina said. “We want to make it well known that just because a treatment hasn’t been developed yet, that doesn’t mean it can’t be done.”

Gina and her family are living proof that you can engineer your own hope.


“There is so much beautiful hope in this world,” Gina said. “Especially knowing that one day soon, the kindergartners will run the place.”

To stay updated on Joseph’s story, follow Batten Hope on Facebook and Instagram.

How sensor-rich smart stores will make shopping a breeze

Thu, 11/21/2019 - 3:00am


It's a Friday evening and you've decided to cook fish tacos. So you pop into your local supermarket on the way back from work. It takes several minutes just to hunt down the cilantro. Or, more accurately, to locate the empty shelf where the cilantro should have been. After another 10 minutes waiting in line for the single-staffed checkout lane, you finally get home to realize you forgot to buy tortillas. And that's how you end up spending your Friday night eating leftover casserole.

Data moves at almost the speed of light, but groceries don't. As the spread of high-speed internet renders information transmission ever-faster, the efficiency of the necessary physical transactions involved in buying and selling goods has lagged behind. That's about to change.


Read our white paper, “Enabling modern retail and logistics automation.”

“A lot of companies, both big and small, are working on using sensor technology and machine learning to improve the shopping experience,” said Gustavo Martinez, a systems engineer at our company. “Customers are frustrated by things like standing in a long checkout line, or finding out that they don't have the item they want, or that it's more expensive than somewhere else.”

A personal shopper in your pocket

The combination of machine learning and GPS technology already allows retailers to deliver personalized advertisements as a potential customer enters their vicinity. The next step is the use of in-store sensors, such as Bluetooth beacons, to deliver hyperlocal promotions at the level of the individual shelf.

These might trigger a custom notification on a smartphone – such as a half-price offer on vanilla wafers for the customer who has spent several minutes staring at the cookie aisle. Alternatively, replacing paper price tags with LCD displays will enable flexible offers to be displayed on the shelf itself, changing as different customers approach.
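The dwell-based promotion described above boils down to a simple proximity-and-timing check. A minimal sketch in Python – the beacon ID, promotion text and two-minute threshold are all hypothetical, not a real retail API:

```python
from typing import Optional

# Hypothetical sketch of a shelf-level beacon promotion: a Bluetooth beacon
# reports how long a shopper has lingered nearby, and a promotion fires once
# dwell time crosses a threshold.

DWELL_THRESHOLD_S = 120  # fire after two minutes near the shelf

PROMOTIONS = {"cookie-aisle-beacon": "Half price on vanilla wafers!"}

def maybe_promote(beacon_id: str, dwell_seconds: float) -> Optional[str]:
    """Return a promotion message if the shopper has lingered long enough."""
    if dwell_seconds >= DWELL_THRESHOLD_S:
        return PROMOTIONS.get(beacon_id)
    return None
```

In practice the dwell time would come from repeated beacon sightings on the shopper's phone, but the decision logic stays this simple.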

These smart displays can also guide a customer around a store, Gustavo said. “The store's app can plot out the most efficient route to pick up all the items in your list, and we can have the in-shelf displays light up as you approach to make it easy to locate the item that you're looking for."


The end of the line for the checkout lane

Among the most significant changes to the in-store experience has been the rise of self-checkouts. These aren't just about saving staffing costs for stores.

“The main thing is getting rid of the need to stand in line to check out,” Gustavo said. “At least in my case, having to wait that additional 10 or 15 minutes is my least favorite part of going to a store.”

Self-checkouts aren't perfect, however. There's still a relatively laborious process for entering uncoded items, such as loose fruit, and the need for a store assistant to dart between sale points to assist with problems and age-restricted items.

“Some companies are looking into integrating cameras into the self-checkouts that can use machine vision to identify the items you're buying,” said Aldwin Delcour, a systems engineer at our company. “Instead of having to search through a whole set of menus, you can just put your apple in front of a camera and the system can automatically identify it.”

While self-checkouts have grown more numerous, they haven’t eliminated the line altogether. The end of the line may be coming, though. At stores on the cutting edge of retail automation, customers scan their phones as they walk in, and a combination of cameras and in-shelf sensors tallies up the items put into their basket and automatically bills them when they leave.

Currently, this requires sending streams of data from potentially hundreds of thousands of stores up to the cloud for processing by machine learning algorithms.

“That's an enormous amount of data being siphoned off, which can present significant challenges,” Gustavo said. “So we're looking at how that data can be processed in the store itself to reduce that load.”

Gustavo Martinez (left) and Aldwin Delcour (right) 

TI mmWave sensors, which bounce high-frequency radio waves off an object to precisely identify its shape, size and distance, can simplify the recognition task, potentially allowing it to be performed in-store on our Sitara™ processors, specifically designed for low-power machine learning applications.

The sensor-rich store that's never out of stock

Smoothing a customer's journey through a store also includes making sure the items they want are where they should be. Ubiquitous sensing will enable stores to track not only customers but also stock, ensuring that low levels of an item can be detected instantly and new supply ordered.

"A store might have a spring mechanism so that when you take an item, a new one is pushed forward," Aldwin said. “You can put a sensor in the back that detects how far it has moved, and then gives a signal to a centralized computer that the inventory is low and it might be time to order the next shipment.”
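The spring-loaded shelf Aldwin describes amounts to converting one distance reading into an item count and a reorder flag. A minimal sketch – the item depth, shelf depth and reorder threshold are illustrative assumptions:

```python
# Hypothetical sketch of a spring-loaded shelf sensor: a distance reading
# tells us how far the spring has advanced, which maps directly to how many
# items remain on the shelf.

ITEM_DEPTH_CM = 8.0      # assumed depth of one item on the shelf
SHELF_DEPTH_CM = 80.0    # assumed total usable shelf depth
REORDER_THRESHOLD = 3    # signal a reorder when this few items remain

def items_remaining(sensor_distance_cm: float) -> int:
    """Convert the spring-travel distance into an item count."""
    remaining_depth = SHELF_DEPTH_CM - sensor_distance_cm
    return max(0, int(remaining_depth // ITEM_DEPTH_CM))

def check_shelf(sensor_distance_cm: float) -> dict:
    """Report stock level and whether a reorder signal should be sent."""
    count = items_remaining(sensor_distance_cm)
    return {"items": count, "reorder": count <= REORDER_THRESHOLD}
```

A centralized inventory system would poll each shelf's `check_shelf` result and place an order whenever the reorder flag comes back true.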

Once the inventory order is placed, the same technology that guides customers around a store can also be used to guide stock pickers around a warehouse, making the process of filling an order much faster and more efficient.

A future Friday evening

The future of grocery shopping could look like this: It's a Friday evening, and the app for your local supermarket sends you a recipe for fish tacos. Based on your previous shopping behavior, the company's machine learning algorithms have built up a profile of you as a Mexican-food lover who enjoys Friday night cooking, and the recommendation is perfect. You click to add the ingredients to a digital shopping list and head to the store.

As you walk through the door, a notification pops up offering a map of where all your ingredients are located. The label beneath the relevant item lights up as you approach and nothing is out of stock.

Once all the items are in your basket, you walk straight out the door. No security guard chases after you. Instead, your phone delivers a receipt and informs you that all of the items have been charged to your account.

The whole process took a few minutes, and you arrive home early with a full set of fish taco ingredients. The rest is up to you.


Why the future of automation is being propelled by innovation at the edge

Tue, 11/19/2019 - 4:00am

Sameer Wasson discusses the future of automation and intelligence at the edge.


The future of intelligent machines rests on innovation at the edge – the embedded technology that enables real-time sensing and processing for more dynamic decision-making.

Automation that used to be preprogrammed and structured has evolved so that machines now understand in real time what’s happening in their environment and can react to it intelligently, safely, securely and autonomously. The technology that enables this is machine learning – a subset of artificial intelligence – and it’s transforming machines that were once line cooks into chefs. But they’re not quite master chefs.

As signal processing technology has evolved and added more machine learning features along the way, we have opened the door for advances in vehicle occupancy detection, intuitive human-machine interaction and more without needing to rely on cloud processing every time.


Learn how we’re bringing the next evolution of machine learning to the edge.


For example, edge intelligence in your future vehicle will be able to sense an object nearby and classify it as a pedestrian. The machine learns from this experience in real time and evaluates data, such as response time between object detection to vehicle action, to improve over time.
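The feedback loop sketched above – detect, classify, act, and record response time for later evaluation – can be outlined in a few lines. This is a hedged illustration only: the sensor signatures, the lookup-table "classifier" and the action names are stand-ins, not a real perception stack.

```python
import time

# Hypothetical sketch of the edge feedback loop: classify a detected object,
# decide an action, and log detection-to-action latency so the system can be
# evaluated and improved over time.

KNOWN_SIGNATURES = {"upright, slow, ~1.7m": "pedestrian",
                    "low, fast, metallic": "vehicle"}

def classify(signature: str) -> str:
    """Map a sensor signature to an object class (stand-in for a trained model)."""
    return KNOWN_SIGNATURES.get(signature, "unknown")

def handle_detection(signature: str, log: list) -> str:
    """Classify, decide an action, and record the response latency."""
    t_detect = time.monotonic()
    label = classify(signature)
    action = "brake" if label == "pedestrian" else "monitor"
    log.append({"label": label, "latency_s": time.monotonic() - t_detect})
    return action
```

The accumulated `log` entries are the kind of data that could later be uploaded to the cloud for fleet-wide learning, as described in the next paragraph of the article.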

And when you park it in the garage that evening, it connects to the cloud and shares that knowledge with the entire connected fleet.


Now take that technology into a field of corn, where planters are programmed to sow seeds about every six inches. Since the ground can be inconsistent, sometimes seeds don’t do well – they might need to be planted deeper or spaced farther apart. Embedded intelligence enables the planter to analyze the soil for moisture, nutrients and other data before a seed is planted. It can predict how many seeds will successfully mature, and the data can be uploaded to the cloud so that farmers can forecast yields.

Or imagine your future shopping experience: in stores that are on the cutting edge of retail automation, customers scan their phone as they walk in. A combination of cameras and in-shelf sensors tally up the items put into their basket, automatically billing customers when they leave.

Currently, this requires sending streams of data from potentially hundreds of thousands of stores up to the cloud for processing by machine learning algorithms. That's an enormous amount of data, which can present significant challenges. With TI mmWave sensors and processors – highly intelligent sensors that integrate precise, real-time decision-making and processing on a single chip – that data can be processed in the store itself to reduce that load.

Eventually, the boundary between the edge and the cloud will start to get very interesting. How the technology can prioritize which data to send to the cloud – quickly, repeatedly and consistently – and receive actionable information back will be the next problem to solve.

As we find solutions at the edge for automation, our everyday machines will continue to make our lives more convenient, efficient and safer.

Sameer Wasson is vice president and general manager of our Processors business unit.
