
Around TI

The company blog of Texas Instruments.

Engineering hope for rare diseases

Tue, 11/26/2019 - 7:00am

Gina and Joseph Hann

Gina Hann walked into her son’s kindergarten class wearing zebra stripes and armed with enough black-and-white snack cakes to build a small fort.

It was national Rare Disease Day, and Gina, an engineer at our company, was on a mission to talk to 6-year-olds about rare diseases. She explained why a zebra – a term in medicine that indicates an unlikely diagnostic possibility – is the perfect mascot for children like her son, Joseph, who was diagnosed with a terminal and degenerative brain disease in 2017.

“When you ask small children what makes a rare-disease child different from them, they give you beautifully candid answers,” Gina said in a blog post. “Things like: ‘He has a wheelchair. He drools. His words sound different. His eyes don’t work.’ Those statements hold no judgment, just honesty.”

As the mother of that rare child, Gina took the opportunity to turn her son’s challenges into a celebration of differences.

“When you ask small children what makes a rare-disease child the same as them, they celebrate the discovery,” she said. “He loves music, and instantly they sing with him. He loves laughter, and they all laugh with him from their bellies. I think kindergartners should rule the world.”

For Gina, good days like these make the difficult fight worth it. Since learning that her son has Batten disease, she has been on a nonstop journey to bring hope to other families facing a similar diagnosis and to make gene therapy for rare diseases more accessible. Initially, she and her husband, Matt, were told to make end-of-life plans for Joseph because no clinical trial or funded research existed to find a treatment.

“We didn’t know it then, but Joseph’s story was just beginning,” she said.


Learn more about how Gina and Matt helped bring hope to their son and countless others.

‘A relentless process’

Gina and Matt were told that if they wanted a clinical trial for their son, they would need to fund the work themselves, and that would mean raising a million dollars or more just to cover the first steps before a clinical trial could be developed.

Ever the engineers and problem-solvers, Gina and Matt searched for a solution. They founded Joseph’s Foundation for Batten Hope, a nonprofit organization dedicated to raising funds toward a clinical trial for a potential cure and gene therapy work at The Children’s Medical Center of Dallas and University of Texas Southwestern Medical Center.

“Matt and I challenge ourselves to innovate and work for better outcomes by the nature of our jobs and our work environment at TI,” Gina said. “We knew to ask if there could be more out there, to question everything and to always look for what’s next. That’s why we felt compelled to choose the path we did – it was a relentless process, but our work helped to make us uniquely suited for it.”

Gina and Matt have helped raise more than $1.5 million and have located over 20 other families around the world who are in need of treatment for their loved ones. Today, they’re working toward funding the final materials needed for the trial, which they hope will begin in early 2020.

“We don’t know if we’ll have the treatment in time to save Joseph’s life,” Gina said. “But no matter what, Joseph has inspired the work that can save the lives of countless others.”

Commitment to the rare disease community

Gina’s commitment to fighting rare diseases is expanding even further: She has joined with other rare-disease family foundations in the Dallas area to establish a new nonprofit called RARE Dallas, which is focused on connecting and empowering affected families and finding cures. For her outstanding service to the community, she was recently recognized with our company’s TI Founders Community Impact Award, which is named for our founders and their long history of philanthropy and volunteerism in the communities where we live and work.

“I think for a lot of parents, when the doctor tells them to make end-of-life plans for their child with a rare disease, they are so overwhelmed that they don’t stop to ask whether there could be other options,” Gina said. “We want to make it well known that just because a treatment hasn’t been developed yet, that doesn’t mean it can’t be done.”

Gina and her family are living proof that you can engineer your own hope.


“There is so much beautiful hope in this world,” Gina said. “Especially knowing that one day soon, the kindergartners will run the place.”

To stay updated on Joseph’s story, follow Batten Hope on Facebook and Instagram.

How sensor-rich smart stores will make shopping a breeze

Thu, 11/21/2019 - 3:00am


It's a Friday evening and you've decided to cook fish tacos. So you pop into your local supermarket on the way back from work. It takes several minutes just to hunt down the cilantro. Or, more accurately, to locate the empty shelf where the cilantro should have been. After another 10 minutes waiting in line for the single-staffed checkout lane, you finally get home to realize you forgot to buy tortillas. And that's how you end up spending your Friday night eating leftover casserole.

Data moves at almost the speed of light, but groceries don't. While the spread of high-speed internet makes moving information ever faster, the physical transactions involved in buying and selling goods have lagged behind. That's about to change.


Read our white paper, “Enabling modern retail and logistics automation.”

“A lot of companies, both big and small, are working on using sensor technology and machine learning to improve the shopping experience,” said Gustavo Martinez, a systems engineer at our company. “Customers are frustrated by things like standing in a long checkout line, or finding out that the store doesn't have the item they want, or that it's more expensive than somewhere else.”

A personal shopper in your pocket

The combination of machine learning and GPS technology already allows retailers to deliver personalized advertisements as a potential customer enters their vicinity. The next step is the use of in-store sensors, such as Bluetooth beacons, to deliver hyperlocal promotions at the level of the individual shelf.

These might trigger a custom notification on a smartphone – such as a half-price offer on vanilla wafers for the customer who has spent several minutes staring at the cookie aisle. Alternatively, replacing paper price tags with LCD displays will enable flexible offers to be shown on the shelf itself, changing as different customers approach.

These smart displays can also guide a customer around a store, Gustavo said. “The store's app can plot out the most efficient route to pick up all the items in your list, and we can have the in-shelf displays light up as you approach to make it easy to locate the item that you're looking for."
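
To make the idea concrete, here is a minimal sketch in C of how a store app might order a shopping list using a simple nearest-neighbor pass over shelf coordinates. The item names, coordinates and greedy heuristic are illustrative assumptions, not a description of any particular retailer's routing system.

    /* Minimal sketch: greedy nearest-neighbor ordering of a shopping list
     * over (x, y) shelf coordinates. Item data and the heuristic are
     * illustrative assumptions, not any retailer's actual routing logic. */
    #include <math.h>
    #include <stdio.h>

    struct item {
        const char *name;
        double x, y;      /* shelf position in meters from the entrance */
        int picked;
    };

    static double dist(double ax, double ay, double bx, double by)
    {
        return hypot(ax - bx, ay - by);
    }

    int main(void)
    {
        struct item list[] = {
            { "tortillas", 12.0, 3.0, 0 },
            { "cilantro",   4.0, 8.5, 0 },
            { "tilapia",   20.0, 6.0, 0 },
            { "limes",      5.0, 9.0, 0 },
        };
        const int n = sizeof list / sizeof list[0];
        double cx = 0.0, cy = 0.0;   /* start at the entrance */

        printf("Suggested route:\n");
        for (int step = 0; step < n; step++) {
            int best = -1;
            double best_d = 0.0;
            for (int i = 0; i < n; i++) {
                if (list[i].picked)
                    continue;
                double d = dist(cx, cy, list[i].x, list[i].y);
                if (best < 0 || d < best_d) {
                    best = i;
                    best_d = d;
                }
            }
            list[best].picked = 1;
            cx = list[best].x;
            cy = list[best].y;
            printf("  %d. %s (%.1f m away)\n", step + 1, list[best].name, best_d);
        }
        return 0;
    }

A real deployment would solve a richer routing problem and pull live shelf locations from the store's own data, but the basic loop – pick the nearest unvisited item, move there, repeat – captures the kind of guidance Gustavo describes.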


The end of the line for the checkout lane

Among the most significant changes to the in-store experience has been the rise of self-checkouts. These aren't just about saving staffing costs for stores.

“The main thing is getting rid of the need to stand in line to check out,” Gustavo said. “At least in my case, having to wait that additional 10 or 15 minutes is my least favorite part of going to a store.”

Self-checkouts aren't perfect, however. Entering uncoded items, such as loose fruit, is still a relatively laborious process, and a store assistant still has to dart between stations to help with problems and age-restricted items.

“Some companies are looking into integrating cameras into the self-checkouts that can use machine vision to identify the items you're buying,” said Aldwin Delcour, a systems engineer at our company. “Instead of having to search through a whole set of menus, you can just put your apple in front of a camera and the system can automatically identify it.”

While adding more self-checkouts hasn’t eliminated the line altogether, the end of the line may be coming. At stores on the cutting edge of retail automation, customers scan their phones as they walk in, and a combination of cameras and in-shelf sensors tallies up the items put into their baskets and automatically bills them when they leave.

Currently, this requires sending streams of data from potentially hundreds of thousands of stores up to the cloud for processing by machine learning algorithms.

“That's an enormous amount of data being siphoned off, which can present significant challenges,” Gustavo said. “So we're looking at how that data can be processed in the store itself to reduce that load.”

Gustavo Martinez (left) and Aldwin Delcour (right) 

TI mmWave sensors, which bounce high-frequency radio waves off an object to precisely identify its shape, size and distance, can simplify the recognition task, potentially allowing it to be performed in-store on our Sitara™ processors, specifically designed for low-power machine learning applications.

The sensor-rich store that's never out of stock

Smoothing a customer's journey through a store also means making sure the items they want are where they should be. Ubiquitous sensing will enable stores to track not only customers but also stock, ensuring that low levels of an item can be detected instantly and new supply ordered.

"A store might have a spring mechanism so that when you take an item, a new one is pushed forward," Aldwin said. “You can put a sensor in the back that detects how far it has moved, and then gives a signal to a centralized computer that the inventory is low and it might be time to order the next shipment.”

Once the inventory order is placed, the same technology that guides customers around a store can also be used to guide stock pickers around a warehouse, making the process of filling an order much faster and more efficient.

A future Friday evening

The future of grocery shopping could look like this: It's a Friday evening, and the app for your local supermarket sends you a recipe for fish tacos. Based on your previous shopping behavior, the company's machine learning algorithms have built up a profile of you as a Mexican-food lover who enjoys Friday night cooking, and the recommendation is perfect. You click to add the ingredients to a digital shopping list and head to the store.

As you walk through the door, a notification pops up offering a map of where all your ingredients are located. The label beneath the relevant item lights up as you approach and nothing is out of stock.

Once all the items are in your basket, you walk straight out the door. No security guard chases after you. Instead, your phone delivers a receipt and informs you that all of the items have been charged to your account.

The whole process took a few minutes, and you arrive home early with a full set of fish taco ingredients. The rest is up to you.

Why the future of automation is being propelled by innovation at the edge

Tue, 11/19/2019 - 4:00am

Sameer Wasson discusses the future of automation and intelligence at the edge.


The future of intelligent machines rests on innovation at the edge – the embedded technology that enables real-time sensing and processing for more dynamic decision-making.

Automation that used to be preprogrammed and structured has evolved so that machines now understand in real time what’s happening in their environment and can react to it intelligently, safely, securely and autonomously. The technology that enables this is machine learning – a subset of artificial intelligence – and it’s transforming machines that were once line cooks into chefs. But they’re not quite master chefs.

As signal processing technology has evolved and added more machine learning features along the way, we have opened the door for advances in vehicle occupancy detection, intuitive human-machine interaction and more without needing to rely on cloud processing every time.


Learn how we’re bringing the next evolution of machine learning to the edge.


For example, edge intelligence in your future vehicle will be able to sense an object nearby and classify it as a pedestrian. The machine learns from this experience in real time and evaluates data, such as the response time between object detection and vehicle action, to improve over time.

And when you park it in the garage that evening, it connects to the cloud and shares that knowledge with the entire connected fleet.


Now take that technology into a field of corn, where planters are programmed to sow seeds about every six inches. Since the ground can be inconsistent, sometimes seeds don’t do well – they might need to be planted deeper or spaced farther apart. Embedded intelligence enables the planter to analyze the soil for moisture, nutrients and other data before a seed is planted. It can predict how many seeds will successfully mature, and the data can be uploaded to the cloud so that farmers can forecast yields.

Or imagine your future shopping experience: in stores that are on the cutting edge of retail automation, customers scan their phones as they walk in. A combination of cameras and in-shelf sensors tallies up the items put into their baskets, automatically billing customers when they leave.

Currently, this requires sending streams of data from potentially hundreds of thousands of stores up to the cloud for processing by machine learning algorithms. That's an enormous amount of data, which can present significant challenges. With TI mmWave sensors and processors – highly intelligent sensors that integrate precise, real-time decision-making and processing on a single chip – that data can be processed in the store itself to reduce that load.

Eventually, the boundary between the edge and the cloud will start to get very interesting. How quickly, repeatedly and consistently the technology can prioritize which data to send to the cloud – and receive actionable information back – will be the next problem to solve.

As we find solutions at the edge for automation, our everyday machines will continue to make our lives more convenient, efficient and safe.

Sameer Wasson is vice president and general manager of our Processors business unit.


How smart, adaptive headlights will make your drive safer

Tue, 10/29/2019 - 3:00am


Cruising downhill on a nighttime motorcycle ride in Los Angeles’ downtown arts district, Robert Sabel came within seconds of losing his life.

“I was cruising through the green light well below the speed limit when a truck coming toward me turned on his high beams and blinded me," said Robert, who owns and operates a company that designs and remanufactures customized motorcycles. “I could tell I was veering close to the sidewalk."

Even worse, the oncoming truck decided at the last moment to make a left turn across Robert’s path. “I had to lay down the bike to avoid hitting the curb, sliding a good fifty feet," he said. “I was lucky to be alive."

There’s encouraging news for motorists like Robert: Smart headlights may soon brighten America’s roads and safety prospects. Using pixelated light sources, sensors, cameras and sophisticated software to direct a vehicle's high beams, the headlights provide the optimal light for every driving condition, while eliminating blinding glare.

“A new breed of automotive headlight systems featuring swiveling headlamps, high-end sensors and programmable controllers can do much more than cast bright light in front of a vehicle,” said Arun Vemuri, a general manager at our company who works with automotive body electronics and lighting. “The light is not only brighter and sharper – it’s smarter, too.”


Learn more about trends driving automotive lighting design.

Seeing is believing

Adaptive driving beams are designed to take luck out of the driving safety equation. Although adaptive headlights are available on cars in Europe, Japan and other markets, automobile manufacturers in the United States are banned from using the advanced lights. That may change soon.

The National Highway Traffic Safety Administration wants to amend current safety rules to allow the headlights, extolling their “potential to reduce the risk of crashes by increasing visibility without increasing glare."

Had the technology been on the truck that blinded Robert, he might not have had as much trouble seeing ahead. “With adaptive driving beams, certain portions of the headlight turn off when oncoming traffic is perceived by the cameras and sensors, but the remainder of the lights stay on to illuminate the surroundings,” Arun said. “The lights don't just dim – they actually throw light down and alongside roads so the driver can see and respond safely to these conditions.”

Displaying danger ahead

Our company’s technology enables the development of lighting sources that are more efficient and cost-effective than traditional halogen and xenon light bulbs, such as energy-efficient LED headlights. Even better, the lights can be directed by software to project words or symbols onto the road, alerting drivers about sharp curves, flooded roadways or other conditions ahead.

“Hundreds of thousands of tiny mirrors would turn on to project the image, powered by TI DLP® technology," Arun said. The unique headlights go into production in 2020.

Other breakthroughs include headlights that automatically adjust upward when a vehicle climbs a hill and downward when it descends, illuminating more of the road ahead so drivers can better anticipate hazards. Rear lights are due for an upgrade as well.

“As we migrate to LED lights, we can do things like have the pixels swipe to the right prior to a right turn and to the left before a left turn," he said. “The lights can also be programmed to automatically display messages, such as the speed limit or an alert about an object on the road ahead. This is all about increasing the efficiency and safety of cars."

Interior illumination is also getting redesigned with driver safety in mind. Our company is providing technology that allows our customers to create lighting systems inside cars that could automatically brighten or change color when it appears the driver is becoming drowsy or distracted.

Driving with X-ray vision

Lighting features could also be used on car windows to display information to pedestrians and other drivers. For example, your name and destination might be projected onto the side window of a rideshare vehicle you requested.

In the far future, projections on car windows could give nearby drivers the ability to see what’s in front of the car ahead of them. Rear windows would stream video to communicate with the cars behind them – about pedestrians, traffic slowdowns or construction, for example – and give drivers information about what to expect so they could be ready to react or adjust their routes. And with improved, high-resolution headlights, car cameras will do a better job of capturing what’s around them.

Best of all, these enhancements will occur without driver intervention. Similar to other autonomous driving features, the lighting systems will respond to signals to make trips more efficient, comfortable and safe. Even motorcycles may soon sport adaptive driving beams.

This possibility appealed to Robert, whose company's mission is to optimize yesterday's highly engineered machinery with today's technologies to improve safety and reliability. “The truth is that today's standard beams are pretty terrible, no matter how bright they are," he said. “Improvements are long overdue."


Updated robotics kit brings technology to life for university students

Tue, 09/24/2019 - 3:30am

A summer intern at our company with her TI-RSLK MAX.

The summer internship was nearing its end and the robotics competition was approaching fast. That’s when Aaron Barrera realized that the closet doors in his apartment – off their hinges and leaning against a wall – would make a great practice maze to prepare for the competition.

So Aaron and three electrical engineering classmates from the University of Florida – all summer interns at our company – laid the doors on the floor, taped out a maze with strips of black electrical tape, assembled the TI-RSLK MAX in less than 15 minutes and began pushing the robotics system to its limits.


Aaron Barrera (center) and other members of the team programmed their TI-RSLK MAX before the intern competition this summer.

They wanted bragging rights from doing well in the competition, of course, but also understood how the robot could help them become better engineers as they looked ahead to graduation.

“You can propel yourself as an engineer, learn something in the process, and set yourself apart as somebody who likes to solve problems and innovate,” said Sebastian Betancur, a member of the team.

Bringing technology to life

The TI Robotics System Learning Kit family – with the TI-RSLK MAX being its newest addition – is a low-cost robotics kit and curriculum for the university classroom that is simple to build, code and test with solderless assembly. The system can solve a maze, follow lines and avoid obstacles. Students can use the curriculum to learn how to integrate hardware and software knowledge to build and test a system.

Learn more about the TI-RSLK MAX

The system uses our SimpleLink™ MSP432P401R microcontroller (MCU) LaunchPad™ Development Kit, easy-to-use sensors and a chassis board that transforms the robot into a learning experience. Students can use wireless communication and Internet of Things (IoT) capabilities to control the robot remotely or enable robots to communicate with each other.
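
To give a flavor of the kind of code students end up writing, here is a minimal line-following control loop in C. The sensor and motor functions below are desktop simulation stubs invented for this sketch – not the actual TI-RSLK or SimpleLink driver APIs, which ship with their own documented libraries.

    /* Minimal sketch of a line-following control loop of the kind students
     * write for a robot like the TI-RSLK MAX. The "hardware layer" below is
     * a desktop simulation stub, not the real TI-RSLK or SimpleLink drivers. */
    #include <stdint.h>
    #include <stdio.h>

    #define BASE_SPEED 300   /* nominal PWM duty for both wheels */
    #define KP          40   /* proportional gain on line error  */

    /* --- Simulated hardware layer (stand-ins for the real drivers) --- */
    static uint8_t read_line_sensors(void)
    {
        /* Pretend the line sits slightly right of center under an 8-sensor bar. */
        return 0x30;  /* bits 4 and 5 set */
    }

    static void set_motor_pwm(int left, int right)
    {
        printf("left PWM %d, right PWM %d\n", left, right);
    }

    /* --- Control logic --- */
    /* Signed line position: negative = line left of center, positive = right. */
    static int line_error(uint8_t mask)
    {
        static const int weight[8] = { -4, -3, -2, -1, 1, 2, 3, 4 };
        int sum = 0, count = 0;
        for (int i = 0; i < 8; i++)
            if (mask & (1u << i)) { sum += weight[i]; count++; }
        return count ? sum / count : 0;
    }

    int main(void)
    {
        for (int step = 0; step < 5; step++) {   /* a few control steps */
            int error = line_error(read_line_sensors());
            /* Steer toward the line: speed up the wheel opposite the line. */
            set_motor_pwm(BASE_SPEED + KP * error, BASE_SPEED - KP * error);
        }
        return 0;
    }

On the real robot, the same proportional steer-toward-the-line loop would read the kit's reflectance sensor bar and drive its motor PWM channels through the provided drivers; only the stubbed hardware layer changes.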

“From an academic standpoint, the topics are rich – from circuits and software to interfacing and systems and the Internet of Things,” said Jon Valvano, the University of Texas at Austin electrical and computer engineering professor who collaborated with our team to develop the TI-RSLK family. “And it’s done in a way that is fun and understandable by students. The TI-RSLK is educationally powerful.”

And its benefits extend beyond the classroom.

“In the future, students will have to self-learn and adapt,” said Ayesha Mayhugh, a university product manager at our company. “We don’t know how our jobs are going to change in the future. Having the ability to bring these complex concepts to life and be a self-learner, beyond what is taught in classrooms, will be critical for success.”

 
Jon Valvano, a University of Texas at Austin professor, collaborated with our team to develop the TI-RSLK family of educational robots.

Learning platform

The University of Florida team spent some long, pizza-fueled weekends in Aaron’s Dallas apartment – interrupted by the occasional video game – to program their robot and prepare for the intern competition in late July. Competitors were judged on how quickly their robots navigated the maze, how much power they used and how innovative their designs were. The team from Florida didn’t win, but they did appreciate the new robot and the support behind it.

“It’s a really good learning platform,” said Daniel Bermudez, also a member of the team from the University of Florida. “People who work with this can see how an embedded processor can be used for fun and learning.”

“It is very well made,” team member Colin Adema said. “The documentation, the code and other support helped a lot. We could have been metaphorically and literally spinning our wheels without that support, but being able to just start it up and start implementing our own code and algorithms pushed us to keep working.”

 

From concept to cosmos: How Jack Kilby's integrated circuit transformed the electronics industry

Tue, 09/17/2019 - 10:00am


In 1958, as one of the few employees working through summer vacation at our company, electrical engineer Jack Kilby had the lab to himself. And it was during those two solitary weeks that he hit upon an insight that would transform the electronics industry.

Since 1948, transistors had begun to replace large, power-hungry vacuum tubes in electronics manufacturing, but hand-soldering thousands of these individual components into a circuit was expensive, time-consuming and unreliable.

Jack’s insight was that the same semiconductor materials used to make transistors could be tweaked to produce resistors and capacitors, too. This meant an entire circuit could be produced from a single slice of semiconductor material.

Later that year, on Sept. 12, Kilby presented his invention: An electronic oscillator formed from a small slice of the semiconductor material germanium. The first integrated circuit was born, bringing with it the exponential growth of the electronics industry and the spread of electronic devices throughout every aspect of our lives.

 

Learn how Jack Kilby’s integrated circuit and our people helped land man on the moon 50 years ago.



Unleashing creativity in electronics

Our company needed a showcase device, something that could prove the integrated circuit's potential to take large, unwieldy and expensive technology out of specialized computing labs and into the wider public's offices, homes and pockets. The company settled on a hand-held calculator.

At the time, a calculator was a large desktop machine that required a constant AC power supply. When our prototype was unveiled in 1967, it used just four integrated circuits to perform addition, subtraction, multiplication and division. It weighed 45 ounces and could fit in the palm of your hand.

Chip complexity began to grow exponentially, now that the creative energy of electronics engineers was finally unleashed from the constraints of wiring together individual transistors. The most significant result was the creation of the first microprocessor, which packed the workings of an entire central processing unit into less than a square inch – and supported the development of the first portable computers.

Circuits in space

The miniaturization of electronics came in handy for coordinating the first moon mission, since launching the car-sized mainframes used by NASA's ground control on a rocket was both physically and economically impossible.


Instead, the space agency created the world's first integrated circuit-based spaceflight control system, the Apollo Guidance Computer. In July 1969, this 70-pound computer – running around 145,000 lines of code on some 12,300 transistors – successfully coordinated Neil Armstrong and Buzz Aldrin's arrival on the moon and their safe return to Earth eight days later.

Ubiquitous computation

Chips many thousands of times more powerful than those used in the Apollo Guidance system can now be found everywhere from factory robots to car dashboards, cell phones, computers, smart watches and smart speakers. They can even be found inside the ears of millions of people around the world in the form of hearing aids.


Forty-two years after that pivotal couple of weeks in 1958, Jack accepted half of the Nobel Prize in physics for his invention. In his acceptance speech, he reflected on the host of electronics innovations that have been developed since – far beyond what he’d imagined possible at the time: "It's like the beaver told the rabbit as they stared at the Hoover Dam. 'No, I didn't build it myself, but it's based on an idea of mine.'"