Design News
Serving the 21st Century Design Engineer

Tutorial: What are the differences between force, torque, pressure and vacuum?

Mon, 2019-12-09 18:39

Most second-year university engineering students can easily explain the differences between force, torque and pressure. The reason for their confident answers is that engineering schools typically require a term of study in both static and dynamic forces by a student’s sophomore year. From that point on, however, further study in these areas is usually confined to the aerospace, civil and mechanical engineering disciplines. Few electronics engineers need, or will take, advanced courses in force mechanics.

But modern advances in material properties and device miniaturization, as in micro-electro-mechanical systems (MEMS) and sensors, mean that force, torque and pressure are relevant across all of the major disciplines. A quick technical review will help remind everyone of these basic concepts.

Force

Simply put, a force is a push or a pull upon an object. A force can cause an object with mass to change its velocity, i.e., to accelerate. Since a force has both magnitude and direction, it is a vector quantity.

The unit of force in the International System of Units (SI) is the newton. One newton is defined as the force that gives a mass of one kilogram an acceleration of one meter per second, per second. In terms of an equation, force equals mass times acceleration (F = ma).

Strictly speaking, Newton’s Second Law of Motion defines force as the rate of change of momentum over time, not mass times acceleration. For constant mass, however, the momentum form reduces to F = ma, which is sufficient for basic engineering calculations.
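The constant-mass form of the law is easy to check numerically. Here is a minimal sketch in Python (the function name and the figures are illustrative, not from the article):

```python
def force_newtons(mass_kg, accel_m_per_s2):
    """Newton's second law in its constant-mass form: F = m * a."""
    return mass_kg * accel_m_per_s2

# One kilogram accelerated at 1 m/s^2 experiences exactly one newton:
print(force_newtons(1.0, 1.0))     # 1.0
# A 1000 kg car accelerating at 3 m/s^2 needs 3000 N of net force:
print(force_newtons(1000.0, 3.0))  # 3000.0
```

Doubling either the mass or the acceleration doubles the required force, which is the linearity the equation expresses.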

Sometimes the word “load” is used instead of force. Civil and mechanical engineers tend to make calculations based on the load with which a system (e.g., a bridge) resists the force of gravity from both the weight of the bridge and the vehicles driving over it.

Newton’s Laws have been called the basis for space flight. According to NASA, understanding how space travel is possible requires an understanding of the concepts of mass, force, and acceleration as described in Newton’s Three Laws of Motion. Consider a space rocket in which the pressure created by the controlled explosion inside the rocket's engines results in a tremendous force known as thrust. The gas from the explosion escapes through the engine’s nozzles, propelling the rocket in the opposite direction (Law #3); the resulting acceleration follows F = ma (Law #2), which lifts the rocket into space. Assuming the rocket travels beyond Earth’s atmosphere, it will continue to move through space even after the propellant gas is gone (Law #1).

Newton’s Three Laws of Motion

1. Every object in a state of uniform motion will remain in that state of motion unless an external force acts on it.

2. Force equals mass times acceleration [F = ma].

3. For every action there is an equal and opposite reaction.

Torque

The first university course in static forces is usually followed by a course in dynamic forces, in which the idea of rotational force, or torque, is introduced. Torque is the tendency of a force to rotate or twist an object about an axis, fulcrum, or pivot. It is the rotational equivalent of linear force.

Formally, torque (or the moment of force) is the product of the magnitude of the force and the perpendicular distance of the line of action of force from the axis of rotation.  The SI unit for torque is the newton metre (N•m). 

Image Source: Wikipedia by Yawe (Public Domain)
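The definition above can be sketched numerically. In this illustrative Python snippet (the function name and figures are made up for the example), only the force component perpendicular to the lever arm contributes:

```python
import math

def torque_nm(force_n, lever_arm_m, angle_deg=90.0):
    """Torque = F * r * sin(theta): the product of the force and the
    perpendicular distance from the line of action to the axis."""
    return force_n * lever_arm_m * math.sin(math.radians(angle_deg))

# 20 N applied perpendicular to the end of a 0.25 m wrench handle:
print(torque_nm(20.0, 0.25))        # 5.0 N·m
# The same pull at 30 degrees to the handle produces half the torque:
print(torque_nm(20.0, 0.25, 30.0))  # ~2.5 N·m
```

The angle term is why a longer wrench, or a pull squarely perpendicular to the handle, loosens a stubborn bolt more easily.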

Deriving the equation for torque is often done from a purely force-based perspective. But it can also be accomplished by looking at the amount of work required to rotate an object. This was the approach Richard Feynman used in one of his lectures on rotation in two dimensions.

“We shall get to the theory of torques quantitatively by studying the work done in turning an object, for one very nice way of defining a force is to say how much work it does when it acts through a given displacement,” explained Feynman.

Feynman was able to show that, just as force times distance is work, torque times angle is work. This point is highlighted in several avionics and aeronautics examples from NASA’s Glenn Research Center, where NASA designs and develops technologies for aeronautics and space exploration. Force, torque and pressure concepts continue to exert their influence far beyond the Earth’s atmosphere. Consider the release of a large satellite like the Cygnus cargo craft from the International Space Station (ISS). The satellite is connected to a large robotic arm that removes it from the ISS prior to release into space. The robotic arm acts like a huge moment arm, subject to the forces, torques and pressures acting in space.

Image Source: NASA Glenn Research Center
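Feynman’s linear/rotational parallel can be written out in a few lines. This sketch uses illustrative numbers (results in joules):

```python
def linear_work_j(force_n, distance_m):
    """Work done by a constant force acting through a straight-line distance."""
    return force_n * distance_m

def rotational_work_j(torque_nm, angle_rad):
    """Work done by a constant torque turning through an angle: W = tau * theta."""
    return torque_nm * angle_rad

# A 10 N push through 2 m and a 10 N·m torque through 2 radians
# each do the same 20 J of work:
print(linear_work_j(10.0, 2.0))      # 20.0
print(rotational_work_j(10.0, 2.0))  # 20.0
```

The formulas are term-for-term analogues: force maps to torque, and distance maps to angle.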

Pressure

Pressure is the force per unit area applied in a direction perpendicular to the surface of an object. Many of us are familiar with gauge pressure from measuring tire pressures. Gauge pressure is the pressure relative to the local atmospheric or ambient pressure. This is in contrast to absolute pressure, the actual value of the pressure at any point. This will make more sense shortly.

Pressure is the amount of force acting per unit area. The SI unit for pressure is the pascal (Pa), equal to one newton per square meter (N/m2). Pressure is also measured in non-SI units such as bar and psi.

In his lecture on The Kinetic Theory of Gases, Feynman introduced the concept of pressure by thinking about the force needed for a piston plunger to contain a certain volume of gas inside a box. The force required to hold a plunger or lid of area A in place, divided by that area, gives the pressure. In other words, pressure is equal to the force that must be applied on a piston divided by the area of the piston (P = F/A).

Image Source: CalTech – Feynman Lectures
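The piston picture reduces to a one-line calculation. A sketch with made-up numbers:

```python
def pressure_pa(force_n, area_m2):
    """Pressure on the piston face: P = F / A, in pascals (N/m^2)."""
    return force_n / area_m2

# 500 N holding down a 0.005 m^2 (50 cm^2) piston face:
print(pressure_pa(500.0, 0.005))  # 100000.0 Pa, about one atmosphere
```

Note that halving the piston area doubles the pressure for the same force, which is why small contact areas (a nail point, a stiletto heel) produce enormous pressures.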

Applications for pressure technologies exist both on and off the planet. In space, however, pressure is so low that it may almost be considered non-existent. That’s why engineers often talk about vacuum rather than pressure in space applications. A vacuum is any pressure less than the local atmospheric pressure. It is defined as the difference between the local atmospheric pressure and the absolute pressure at the point of measurement.
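The relationships between absolute, gauge, and vacuum readings can be sketched as follows (the standard atmosphere of 101,325 Pa stands in for the local ambient pressure; the tire figure is illustrative):

```python
ATM_PA = 101_325.0  # standard atmosphere, used here as the local ambient

def gauge_pa(absolute_pa, ambient_pa=ATM_PA):
    """Gauge pressure: how far the absolute pressure sits above ambient."""
    return absolute_pa - ambient_pa

def vacuum_pa(absolute_pa, ambient_pa=ATM_PA):
    """Vacuum: how far the absolute pressure sits below ambient."""
    return ambient_pa - absolute_pa

# A tire at 322 kPa absolute reads about 221 kPa on a gauge:
print(gauge_pa(322_000.0))  # 220675.0
# Near-vacuum at 1 Pa absolute is almost a full atmosphere of vacuum:
print(vacuum_pa(1.0))       # 101324.0
```

The two functions are mirror images: a positive gauge reading and a positive vacuum reading describe pressures on opposite sides of ambient.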

While space has a very low pressure, it is not a perfect vacuum; it only approximates one, being a place where the gaseous pressure is much, MUCH less than the Earth’s atmospheric pressure.

The extremely low pressure in the vacuum of space is why humans need space suits to provide a pressurized environment. A space suit provides air pressure to keep the fluids in our body in a liquid state, i.e., to prevent our bodily fluids from boiling due to low pressure (via PV = nRT). Like a tire, a space suit is essentially an inflated balloon that is restricted by some rubberized fabric.
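The PV = nRT relation mentioned above can be sanity-checked with rough numbers (the mole count and volume here are illustrative, not space-suit specifications):

```python
R = 8.314  # universal gas constant, J/(mol*K)

def ideal_gas_pressure_pa(n_mol, temp_k, volume_m3):
    """Ideal gas law rearranged for pressure: P = nRT / V."""
    return n_mol * R * temp_k / volume_m3

# Roughly one mole of gas at body temperature (310 K) held in 25 liters
# sits near one atmosphere of pressure:
print(ideal_gas_pressure_pa(1.0, 310.0, 0.025))  # ~103 kPa
```

The same relation shows why pressure collapses as volume expands: vent that gas into a vastly larger volume and the pressure falls toward the near-vacuum of space.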

Homework question: Why didn’t the wheels on the Space Shuttle burst while in space, i.e., in the presence of a vacuum? Look for the answer in the comments section.

In summary, force, torque, pressure and vacuum are important physical concepts that – thanks to advances in material sciences and MEMS devices – cross all of the major disciplines. Further, these fundamental concepts continue to have relevance in applications like space systems among many others.

The 15 Most Influential Technologies of the Decade

Mon, 2019-12-09 12:00

 


Chris Wiltz is a Senior Editor at  Design News covering emerging technologies including AI, VR/AR, blockchain, and robotics.

 

A climate-change solution: remove carbon dioxide from the air

Mon, 2019-12-09 07:00

Researchers at MIT have designed a specialized battery that can remove carbon dioxide even at small concentrations from air, a device they believe could be used as a tool to fight climate change.

Chemical engineers, including Saha Voskian—an MIT postdoc who developed the work as part of his PhD research—invented the technique, which is based on passing air through a stack of charged electrochemical plates.

In this diagram of a new system invented at MIT, air entering from top right passes to one of two chambers (the gray rectangular structures) containing battery electrodes that attract the carbon dioxide. Then the airflow is switched to the other chamber, while the accumulated carbon dioxide in the first chamber is flushed into a separate storage tank (at right). These alternating flows allow for continuous operation of the two-step process. (Image source: MIT)

The system can process carbon dioxide at any concentration level, even down to the roughly 400 parts per million currently found in the atmosphere. This sets it apart from most current methods of removing carbon dioxide from air, which require a higher concentration of CO2, such as that present in the flue emissions from fossil fuel-based power plants.

There are a few solutions that can work with lower concentrations, but the new device that Voskian and his collaborator, T. Alan Hatton, the Ralph Landau Professor of Chemical Engineering, have invented is less energy-intensive and less expensive. “Our system relies solely on electrical energy input and does not require any thermal or pressure swing,” Voskian told Design News. “It operates at ambient conditions.”

Absorbing carbon from the air

The device is essentially a large battery that absorbs carbon dioxide from the air, or another gas stream, passing over its electrodes as it is being charged up. As it is being discharged, it releases the gas it collected.

In operation, the device would simply alternate between charging and discharging, with fresh air or feed gas being blown through the system during the charging cycle, and then the pure, concentrated carbon dioxide being blown out during the discharging. “The device comprises a stack of electrochemical cells with gas flow channels in between,” said Voskian. “The cells have porous electrodes which are coated with a composite of electro-active polymer and conductive material. The polymer responds to electric stimuli and is activated or de-activated based on the polarity of the applied potential.”

That electro-active polymer and conductive material is a compound called polyanthraquinone, which is composited with carbon nanotubes. This gives the electrodes a natural affinity for carbon dioxide so they can readily react with its molecules in the airstream or feed gas, even when it is present at very low concentrations.  “The electrodes either have strong affinity to carbon in their electrochemically activated mode, or have no affinity whatsoever,” said Voskian.

Binding gas molecules even in low concentrations

This binary nature of the interaction with carbon dioxide lends the system its unique properties, giving it the ability to bind to the gas molecules from very low to very high concentrations.

The reverse reaction takes place when the battery is discharged, ejecting a stream of pure carbon dioxide. During this process, the device can provide part of the power needed for the whole system, which operates at room temperature and normal air pressure.

Researchers published a paper on their work in the journal Energy & Environmental Science.


Given its versatility, the team envisions the solution being used in a wide array of applications—from industrial plants with high carbon-dioxide emissions to medical and even consumer applications. The researchers have set up a company called Verdox to commercialize the process and device, and hope to develop a pilot-scale plant for carbon-dioxide processing within the next few years.

Elizabeth Montalbano is a freelance writer who has written about technology and culture for more than 20 years. She has lived and worked as a professional journalist in Phoenix, San Francisco and New York City. In her free time she enjoys surfing, traveling, music, yoga and cooking. She currently resides in a village on the southwest coast of Portugal.

DesignCon: By Engineers, For Engineers

January 28-30: North America's largest chip, board, and systems event, DesignCon, returns to Silicon Valley for its 25th year! The premier educational conference and technology exhibition, this three-day event brings together the brightest minds across the high-speed communications and semiconductor industries, who are looking to engineer the technology of tomorrow. DesignCon is your rocket to the future. Ready to come aboard? Register to attend!

 

What does every engineer want for the holidays?

Mon, 2019-12-09 06:00

During the holiday season, one tends to think of presents. But today’s designers, manufacturers and sellers tell us the product is but a commodity and what we really want is the experience.

Engineers and scientists are really like most ordinary consumers except in their interest in experiences that deal with great technical achievements, failures and the future – technologies that are yet to be. So, rather than a set of catchy products, this list will focus on unique experiences with particular appeal to engineers and scientists. 

I. Books 

Reading is an experience like no other in that it can be done by any literate person at almost any time and in any place. Here is a very short list of science- and engineering-related books released in 2019:

> Infinite Powers: The Story of Calculus – The Language of the Universe, by Steven Strogatz (Atlantic Books) 

This is the story of mathematics’ greatest ever idea: calculus. Without it, there would be no computers, no microwave ovens, no GPS, and no space travel. But before it gave modern man almost infinite powers, calculus was behind centuries of controversy, competition, and even death.

Professor Steven Strogatz charts the development of this seminal achievement from the days of Archimedes to today’s breakthroughs in chaos theory and artificial intelligence. Filled with idiosyncratic characters from Pythagoras to Fourier, Infinite Powers is a compelling human drama that reveals the legacy of calculus on nearly every aspect of modern civilization, including science, politics, medicine, philosophy, and much besides.

> Six Impossible Things: The ‘Quanta of Solace’ and the Mysteries of the Subatomic World, by John Gribbin (Icon Books Ltd.) 

Quantum physics is strange. It tells us that a particle can be in two places at once. Indeed, that particle is also a wave, and everything in the quantum world can be described entirely in terms of waves, or entirely in terms of particles, whichever you prefer.

All of this was clear by the end of the 1920s. But to the great distress of many physicists, let alone ordinary mortals, nobody has ever been able to come up with a common sense explanation of what is going on. Physicists have sought ‘quanta of solace’ in a variety of more or less convincing interpretations. Popular science master John Gribbin takes us on a tour through the ‘big six’, from the Copenhagen interpretation via the pilot wave and many worlds approaches.

> Hacking Darwin: Genetic Engineering and the Future of Humanity by Jamie Metzl (Sourcebooks) 

At the dawn of the genetics revolution, our DNA is becoming as readable, writable, and hackable as our information technology. But as humanity starts retooling our own genetic code, the choices we make today will be the difference between realizing breathtaking advances in human well-being and descending into a dangerous and potentially deadly genetic arms race.

Enter the laboratories where scientists are turning science fiction into reality. Look towards a future where our deepest beliefs, morals, religions, and politics are challenged like never before and the very essence of what it means to be human is at play. When we can engineer our future children, massively extend our lifespans, build life from scratch, and recreate the plant and animal world, should we?

Image Source: Sourcebooks

II. Engineering Coding Boot Camps 

All engineers need to stay current in their own discipline as well as learn new skills. What better way to accomplish that goal than with an uber-focused boot camp?

> Flatiron School

Flatiron School offers on-campus (throughout the US) and online programs in software engineering, data science, and UX/UI design. The school’s immersive courses aim to launch students into careers as software engineers, data scientists, and UX/UI designers through a rigorous curriculum and the support of seasoned instructors and personal career coaches. Through labs and projects, the school teaches students to think and build like software engineers and data scientists. The UX/UI design program includes a client project to give students client-facing experience.

> Hack Reactor

This 12-week immersive coding school provides software engineering education, career placement services, and a network of professional peers. The school has campuses in major US cities as well as an online program. During the first six weeks at Hack Reactor, students learn the fundamentals of development and full-stack JavaScript, and are introduced to developer tools and technologies. In the final six weeks, students work on personal and group projects, using the skills they have learned. After 800+ hours of curriculum, students graduate as full-stack software engineers and JavaScript programmers.

> Codesmith

This program offers a full-time, 12-week full stack software engineering bootcamp in Los Angeles and New York City. Codesmith is a selective program focusing largely on computer science and full-stack JavaScript, with an emphasis on technologies like React, Redux, Node, build tools, Dev Ops and machine learning. This program enables Codesmith students (known as Residents) to build open-source projects, with the aim of moving into positions as skilled software engineers. Codesmith Residents gain a deep understanding of advanced JavaScript practices, fundamental computer science concepts (such as algorithms and data structures), and object-oriented and functional programming. The program helps residents develop strong problem-solving abilities and technical communication skills.

(Image Source: Kelly Sikkema on Unsplash)

III. Engineer-themed video games

Tired of playing Minecraft, Tetris and other techie games? Add these new challenges to your virtual stocking stuffers.

> Scrap Mechanic

Scrap Mechanic is a multiplayer sandbox game that drops players right into a world where they literally engineer their own adventures! Players choose from the 100+ building parts at their disposal and create anything from crazy transforming vehicles to a house that moves.

> Automachef

Automachef is an indie puzzle game in which players have to build automatic kitchens for a robotic fast food tycoon who believes he's a human. Sounds good, doesn't it?

> Factorio

Factorio is a game in which you build and maintain factories. Players will mine resources, research technologies, build infrastructure, automate production and fight enemies. Players must use their imagination to design their factory, combine simple elements into ingenious structures, apply management skills to keep it working, and protect it from the creatures who don’t like them.

Image Source: Factorio

Image Source: Mind-Field Escapes

IV. Engineer-Themed Escape Rooms

An escape room is a game in which a team of players cooperatively discover clues, solve puzzles, and accomplish tasks in one or more rooms in order to progress and accomplish a specific goal in a limited amount of time. The goal is often to escape from the site of the game.

While such escape rooms have become popular in recent years, few tend to be filled with puzzles based on engineering or science. One that fits the latter category is LabEscape, created by a University of Illinois physicist. There are three separate missions, each dealing with renowned quantum physicist Professor Schrödenberg. Each mission features a unique set of awesome puzzles and challenges, all designed to amaze, delight, and astound!

Another example is the recently opened Mind-Field Escapes. “All Clear” is an engineering-focused mission that takes place in a bomb shelter. The scenario is as follows: It’s been four years and the shelling has stopped. Now it’s time for the survivors to come out. Unfortunately, someone fed several of the instruction manuals to the rats, which means no one really remembers how everything works. All Clear has electrical, mechanical, pneumatic, and hydraulic puzzles and more. It’s fun for any engineer. Other engineering-focused future missions will include Mr. Harvey’s Room and Dr. K. L. Koff’s lab.

V. Tours for Engineers

Here’s a short list of engineering-related adventures to cross off the bucket list.

> Arecibo Observatory

Ever wonder about the radio telescope buried deep in the jungles of Puerto Rico, which has served as a backdrop for TV shows and movies like The X-Files and James Bond, among others? Then maybe a trip to Arecibo is in order.

> Manhattan Project National Historical Park – B Reactor

The B Reactor National Historic Landmark is the world's first full-scale plutonium production reactor and part of the Manhattan Project National Historical Park. Sign up for a tour and learn more about the people, events, science, and engineering that led to the creation of the atomic bombs that helped bring an end to World War II.

> Apollo Mission Control Center

In 2019, NASA finished refurbishing the iconic room where space exploration began. In honor of the 50th anniversary of the Apollo 11 mission to the Moon, the Agency has refurbished the historic mission control center at Johnson Space Center in Houston, where engineers guided astronauts to their one small step.

Image Source: NASA

VI. Movies for the engineer in all of us

Engineers and scientists like a variety of movies and TV shows, especially those that have cool technology or a science fiction theme. Here are three that made the list in 2019.

> Deadly Engineering – 2019 edition, Amazon Original

Engineering failures are Icarus-like moments when our overreaching, greed and desire to conquer the impossible can cost not just reputations, but millions of dollars, the environment and lives. Each episode focuses on one disaster, looking at dramatic archival news footage of the disaster as it occurred and its devastating impact. Check out a few of the recent episode titles: The Chernobyl Conspiracy, NASA’s Challenger Disaster, Doom on the Titanic, and Nightmare in Hell’s Valley.

> Avengers: Endgame

Whether you are a Marvel fan or not, Endgame presents some pretty cool tech – from Tony Stark’s Iron Man suit and Ant-Man’s quantum adventures to the time-travel machine.

> The Current War

The Current War is the latest film to retell the major events of the decade-long battle between Thomas Edison, George Westinghouse and Nikola Tesla to bring electricity to America in the late 1800s. This retelling focuses on the personality differences between these great inventors and entrepreneurs but includes enough technical bits to ensure the film’s interest for electrical, mechanical and manufacturing engineers. It is well worth the price of admission.

Image Source: 101 Studio

 


John Blyler is a Design News senior editor, covering the electronics and advanced manufacturing spaces. With a BS in Engineering Physics and an MS in Electrical Engineering, he has years of hardware-software-network systems experience as an editor and engineer within the advanced manufacturing, IoT and semiconductor industries. John has co-authored books related to system engineering and electronics for IEEE, Wiley, and Elsevier.

Unemployment rate hits historic low, but manufacturing has sluggish year

Fri, 2019-12-06 11:39

The latest reports on the economy show mixed results. The Bureau of Labor Statistics (BLS) reported that 266,000 non-farm payrolls were created in November, pushing the unemployment rate to a historically low 3.5%. Government data released today showed the United States added far more jobs than expected in November, “relieving concerns that one of the brightest spots in the economy may have started to run out of steam,” said Business Insider in its Markets report.

Manufacturing employment also increased in November, noted the Alliance for American Manufacturing (AAM). The sector gained 54,000 jobs, according to the BLS, with the bulk of growth coming from automotive jobs. AAM’s President Scott Paul commented: “With only one month left in 2019, Trump’s promise that manufacturing jobs will boom has sputtered. November’s jobs number was aided by UAW workers securing a new contract and returning to the factory floor.

“Overall, 2019 factory job growth has been incredibly weak, lagging well behind 2018 and underperforming [compared with] the rest of the economy. While there has been periodic bluster about policies to boost infrastructure and stop China’s cheating, no real progress has been made to date. American workers deserve better from the administration and Congress,” said Paul.

Nick Bunker, Research Director at Indeed Hiring Lab, commented to Business Insider that the high number of jobs added in November doesn’t tell the whole story. “You might forget that the story for most of this year was that the economy was slowing down,” he said. “The slowdown did happen, but we can move into 2020 with a bit more optimism.”

Business Insider reported that while wage growth continued to outpace inflation last month, it “remained stubbornly below what would be expected with an unemployment rate at its lowest level in half a century. Average hourly earnings rose 3.1% year-over-year in November, a slight uptick from a month earlier but short of the peak growth levels seen in early 2019.”

November’s Purchasing Managers Index from the Institute for Supply Management (ISM), released on Dec. 2, showed yet another contraction to 48.1 from October’s 48.3. In fact, most of the index measurements were in the “contracting” mode even though the index showed the overall economy “growing.”

New orders for November fell to 47.2 from October’s 49.1. New export orders also fell from 50.4 (growing) in October to 47.9 (contraction) in November. Production’s contraction slowed from October’s 46.2 to 49.1 in November. Inventories contracted faster, from 48.9 in October to 45.5 in November, and customer inventories fell to levels considered “too low,” from 47.8 in October to 45.0 in November. Order backlogs also dropped 1.1% in November to 43.0.

Comments from respondents to ISM’s November survey included this one from a machinery supplier: “Demand has stabilized for the last half of [the fourth quarter], and production will be stable for the rest of this year.”

A respondent from the plastics and rubber products sector commented, “Heading into the holiday season, we are seeing the backlog decrease, as new orders for 2020 seem lighter than in past years.”

A new report from ResearchAndMarkets (Global Plastic Processing Machinery Markets Report 2019: 2017-2018 Data & CAGR Projections 2019-2023), noted that “increasing demand for processed food and beverages, followed by increasing requirements for packaging, is fueling the overall growth in the plastics processing machinery market. The increasing demand for plastics in a variety of applications is expected to fuel growth of the plastics processing machinery global market. Accuracy, reliability, and energy efficiency play an important role in the growth of plastic processing machinery global market.”

Image: Hywards/Adobe Stock

The 2021 Jaguar F-Type is a fresh-faced cat

Fri, 2019-12-06 07:00

Dan Carney is a Design News senior editor, covering automotive technology, engineering and design, especially emerging electric vehicle and autonomous technologies.

 

DoE achieves breakthrough in artificial photosynthesis

Fri, 2019-12-06 06:00

Researchers at the Department of Energy (DoE) have achieved a chemical reaction that drastically improves upon current methods for artificial photosynthesis, providing a new process for the development of cleaner, hydrogen-based fuels.

Brookhaven Lab chemist Javier Concepcion and Lei Wang, a graduate student at Stony Brook University, devised a scheme for assembling light-absorbing molecules and water-splitting catalysts on a nanoparticle-coated electrode. The result: production of hydrogen gas fuel via artificial photosynthesis and a platform for testing different combos to further improve efficiency. (Image source: Brookhaven National Laboratory, Department of Energy)

Specifically, scientists at the DoE’s Brookhaven National Laboratory have doubled the efficiency of a chemical combo that captures light and splits water molecules so the building blocks can be used to produce hydrogen fuel. “Artificial photosynthesis will enable the production of sustainable and renewable fuels from sunlight, water and carbon dioxide,” Javier Concepcion, a Brookhaven chemist who worked on the project, told Design News.

In natural photosynthesis, plants use sunlight to transform water and carbon dioxide into carbohydrates such as sugar and starches. The energy from the sunlight is stored in the chemical bonds holding those molecules together.

Researchers have been seeking artificial ways to mimic this natural process, using light to split water into its constituents--hydrogen and oxygen. The idea is that the hydrogen can later be combined with other elements, such as carbon dioxide, to make fuels.

New platform for cleaner fuel

The Brookhaven team has now developed a platform that integrates two types of materials--chromophores for light absorption, and water-oxidation catalysts to split water into oxygen, electrons, and protons--on the surface of photoanodes. These anodes are electrodes that carry out an oxidation driven by photons.  “The electrons and protons are transported to another electrode (the cathode) where they are combined to produce hydrogen gas, a fuel,” said Concepcion.

The approach uses molecular “tethers”—or simple carbon chains that have a high affinity for one another—to attach the chromophore to the catalyst, researchers said. The tethers hold the particles together so electrons can transfer from the catalyst to the chromophore.

This is an essential step for activating the catalyst—but it also keeps the two elements far enough apart that the electrons don’t jump back to the catalyst, Concepcion said. “This new platform allows us to combine these chromophores and catalysts without the need of complicated synthetic procedures and with precise control of the distance between them,” said Concepcion. “By controlling the distance we can control the rate of electron transfer steps between catalysts and chromophores required for the system to work.”

Researchers published a paper on their work in the Journal of Physical Chemistry C.

Necessity fosters invention

The entire process works like this: Light strikes the chromophore and gives an electron enough of a jolt to send it to the surface of the nanoparticle. From there the electron moves to the nanoparticle core, and then out of the electrode through a wire. Meanwhile, the chromophore, now missing an electron, pulls an electron from the catalyst. As long as there’s light, this process repeats itself, sending electrons flowing from catalyst to chromophore to nanoparticle to wire.

Each time the catalyst loses four electrons, it becomes activated with a positive charge capable of stealing four electrons from two water molecules, which breaks the hydrogen and oxygen apart. The oxygen bubbles out as a gas, while the hydrogen atoms diffuse through a membrane to another electrode. There they recombine with the electrons carried by the wire to produce hydrogen gas, which can be used as fuel.

“This research is fundamental in nature and focused on developing the underlying science that will allow actual real-life applications of artificial photosynthesis,” said Concepcion.

This is key to the future design of cleaner energy, which is “something that mankind has to achieve to ensure the survival of our species,” said Concepcion. “The timeline for this achievement can be argued, but its need is without question.”


Researchers plan to continue their work by studying in detail each of the many processes required for the new system to perform well. “This will include a combination of experimental and computational studies, including machine learning and artificial intelligence to help us to understand the current system and to design better ones,” said Concepcion.

Elizabeth Montalbano is a freelance writer who has written about technology and culture for more than 20 years. She has lived and worked as a professional journalist in Phoenix, San Francisco and New York City. In her free time she enjoys surfing, traveling, music, yoga and cooking. She currently resides in a village on the southwest coast of Portugal.

DesignCon: By Engineers, For Engineers

January 28-30: North America's largest chip, board, and systems event, DesignCon, returns to Silicon Valley for its 25th year! The premier educational conference and technology exhibition, this three-day event brings together the brightest minds across the high-speed communications and semiconductor industries, who are looking to engineer the technology of tomorrow. DesignCon is your rocket to the future. Ready to come aboard? Register to attend!

 

3 painless tips for writing documentation

Fri, 2019-12-06 06:00

Writing documentation is not the most exciting endeavor an engineer can embark on. It’s often boring and time-consuming, and there are so many more interesting things that could be done. It sometimes amazes me that development projects are documented so poorly – if they are documented at all. Documentation is meant to preserve important concepts and information about the development cycle. This information could be used to get up to speed on the product, make decisions about updates and changes, or even to prove that proper procedures were followed to create the product. Here is a set of tips for developing documentation that decreases the pain factor for developers and improves documentation quality.

Tip #1 – Write the documentation as you develop

The problem with a lot of documentation (which includes code comments) is that it is written after development is completed. Engineers are often in a hurry due to delivery constraints, so they focus on getting things working first and document second. The problem with this is that there is a good chance the documentation is never written, and if it is, the developer may be writing it weeks or months later, after they have forgotten the design decisions that were made. The resulting documentation is often better than nothing, but it lacks the critical steps or thought processes that make it easy to pick up right where the developer left off.

Following through on these tips can speed up the time it takes to develop documentation. (Image source: Samsung Know)

I’ve found that the best documentation, and the quickest way to develop it, is to document as you go. For example, when I am writing documentation that describes how to set up and run a test, I don’t set it up and then go back and try to remember all the steps I took. I literally create the document and, with each step, write down what I did and, more importantly, why I did it. When I make a misstep and have to go back and adjust, it’s the perfect opportunity to include a few comments on how to recover the system or what mistakes to avoid.

I’ve also found that by creating the documentation as I go, I can use it to outline what I’m about to do, which helps guide my efforts. Taking the time to think through what I’m going to do gathers my thoughts and seems to make me more productive. This works far better than trying to do something on the fly.

Tip #2 – Pictures are worth 1,000 words

It’s quite amazing to me that, in a world driven by video, rich images and photographs, the documentation engineers create is almost entirely text. I can’t tell you how often I come across documentation that includes almost no pictures whatsoever. I was recently working on a project where an engineer sent me a procedure for setting up a toolchain and deploying production code. The entire document was two pages that were not only difficult to follow but were missing steps and contained no pictures or diagrams! The engineer even assumed that the reader would know how to wire the development board up to a sensor without a wiring diagram!

While the text-based version could be used to repeat the original procedure, anyone following it would have to find several external references, review schematics, and make several leaps of faith in order to complete it successfully. What should have been a one-hour process ended up requiring about four hours. If you are following the first tip and writing your documentation as you go, taking screenshots of important steps in a procedure or snapping a picture with a smartphone only takes about 30 seconds. The result can be documentation that is much clearer and saves the user (which could be the future you) a lot of grief.

Tip #3 – Have a colleague review the documentation

The last tip for us to discuss today, and one that should not be overlooked, is to have a colleague go through your documentation when you are done with it. As engineers, we often make assumptions that someone who comes after us will be thinking the same way that we are or that some information tidbit is obvious. Giving your documentation to a colleague to review will help to ensure that all the required information is included in the document so that if someone comes along later, they will be able to understand the process and reproduce or maintain the system.

A colleague can act as a great sounding board to ensure that everything required is included. For example, I mentioned that I was provided a procedure that didn’t have any images. As I reviewed that procedure, I was able to point out screenshots, diagrams and images that should be added to make it easier for someone to understand what the procedure was and how to replicate it. Having no prior knowledge of the procedure and being forced to repeat it helped me provide critical feedback that resulted in a well-established procedure that is now easy to replicate.

Conclusions

These simple documentation steps might seem obvious, but I know for a fact that many engineers do not follow them. I come across lots of projects that are sparsely documented or not documented at all. It may seem obvious to the developer what needs to be done to use a code base, set up an experiment, or whatever the task may be. The fact, though, is that it’s often not obvious, and the same developer coming back a year later will often find it takes time to figure out what they were thinking.


Following through on these tips can speed up the time it takes to develop documentation, and that documentation will also be of higher quality. Take the time this month to start putting these into practice and you’ll find that, over time, developing documentation will become painless (or at least a little less painful).

Jacob Beningo is an embedded software consultant who currently works with clients in more than a dozen countries to dramatically transform their businesses by improving product quality, cost and time to market. He has published more than 200 articles on embedded software development techniques, is a sought-after speaker and technical trainer, and holds three degrees, including a Master of Engineering from the University of Michigan. Feel free to contact him at jacob@beningo.com or at his website, and sign up for his monthly Embedded Bytes Newsletter.


 

No quartz needed: The world’s first crystal-less, wireless MCU improves IoT designs

Fri, 2019-12-06 05:30
BAW can enable the next generation of industrial and telecommunication applications by changing how we approach system designs today. (Image source: Texas Instruments)

From blood glucose, blood pressure, and oxygen saturation monitors in the medical sector, to temperature and smoke detectors used in building automation, to e-locks used in building security, wireless microcontrollers (MCUs) play a vital role in monitoring and connecting the world around us.

With this in mind, one of the most important innovations in business and commerce remains the ability to move and analyze massive amounts of data. Wireless MCUs and wireless networking are essential to data migration. And the ability to bridge the last mile of data through connected Internet of Things (IoT) devices is a vital part of that journey.

IoT building blocks include sensors that use quartz crystals. Discrete clocking and quartz crystal devices used to achieve wireless connections can be costly, time-consuming, and complicated to develop, however, and are often susceptible to environmental stress in factory automation or automotive applications.

A new technology called bulk acoustic wave (BAW) makes it possible to create simpler, smaller MCU designs while increasing overall performance and lowering costs. BAW can enable the next generation of industrial and telecommunication applications by changing how we approach system designs today.

As shown in Figure 1 (below), a BAW resonator consists of a piezoelectric material sandwiched between two electrodes that converts electrical energy to mechanical-acoustical energy, and vice versa. The mechanical resonance of the piezoelectric material generates the clock for the system.

Figure 1: A BAW piezoelectric material. (Image source: Texas Instruments)

The SimpleLink CC2652RB MCU from Texas Instruments integrates BAW technology within a wireless MCU package, eliminating the need for an external quartz crystal, which can be costly, bulky, and time-consuming to design. The space savings enabled by crystal-less solutions is crucial in many emerging applications, such as medical IoT devices.

Compared to external-crystal MCU solutions, the SimpleLink CC2652RB also shows significant resistance to a variety of acceleration forces and mechanical shock.

How BAW technology resists mechanical shock and vibration

Two important parameters for measuring vibration and shock are the acceleration force and vibration frequency applied to IoT-connected devices. You’ll find sources of vibration anywhere: inside a moving vehicle; a cooling fan in equipment; or even a handheld wireless device. It is important that clock solutions provide a stable clock with strong resistance against acceleration forces, vibration, and shock, as this assures stability throughout product life cycles under process and temperature variations.

Vibrations and mechanical shock affect resonators by inducing noise and frequency drift, degrading system performance over time. In reference oscillators, vibration and shock are common causes of elevated phase noise and jitter, frequency shifts and spikes, or even physical damage to the resonator and its package. Generally, external disturbances can couple into the microresonator through the package and degrade overall clocking performance.

One of the most critical performance metrics for any wireless device is to maintain a link between the transmitter and receiver and prevent data loss. Without the need for a crystal, BAW technology provides significant performance benefits for IoT products operating in harsh environments. Because BAW technology ensures stable data transmission, data syncing over wired and wireless signals is more precise and makes continuous transmission possible, which means that data can be processed quickly and seamlessly to maximize efficiency.

Evaluating BAW technology with high industry standards

TI has tested the CC2652RB thoroughly against relevant military standards because many MCUs operate in environments susceptible to shock and vibration, such as factories and automotive vehicles. Military standard (MIL)-STD-883H, Method 2002 is designed to test the survivability of quartz crystal oscillators. This standard subjects semiconductor devices to moderate or severe mechanical shock (with an acceleration peak of 1500 g) caused by sudden forces or abrupt changes in motion from rough handling, transportation, or field operation. Shocks of this type could disturb operating characteristics or cause damage similar to what could result from excessive vibration, particularly if the shock pulses are repetitive.

Figure 2 shows a mechanical shock test setup for MIL-STD-883H, while Figure 3 shows the frequency variation of the CC2652RB compared to an external crystal solution. You can see that the maximum frequency deviation is about 2 ppm, while the external crystal solution is about 7 ppm at 2.44 GHz.
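For a sense of scale, those parts-per-million figures can be converted to absolute frequency error at the 2.44 GHz carrier. The 2 ppm (BAW) and 7 ppm (external crystal) deviations are the article's figures; the rest is plain unit arithmetic.

```python
# Convert the reported frequency deviations from parts per million to
# absolute hertz at the 2.44 GHz radio frequency reported above.

def ppm_to_hz(freq_hz: float, deviation_ppm: float) -> float:
    """Absolute frequency deviation in Hz for a given ppm error."""
    return freq_hz * deviation_ppm / 1e6

f_radio = 2.44e9                 # 2.44 GHz carrier
print(ppm_to_hz(f_radio, 2.0))   # BAW:     4880.0 Hz
print(ppm_to_hz(f_radio, 7.0))   # crystal: 17080.0 Hz
```

In other words, the BAW device stays within about 4.9 kHz of the carrier under shock, versus roughly 17 kHz for the external crystal.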

Figure 2: Mechanical shock test setup and test setup block diagram. (Image source: Texas Instruments)

 

Figure 3: Comparing the maximum radio (2.44 GHz) frequency deviation (parts per million) induced by mechanical shock on both BAW and crystal devices. (Image source: Texas Instruments)

Conclusion

BAW technology represents real progress within the evolution of IoT by reducing the amount of space required in some critical devices, like those in the medical field, and enabling the use of IoT in places characterized by frequent shocks or vibrations. BAW technology will be one of the catalysts in the connected world of the future across a vast array of sectors.

RELATED ARTICLES:

Habeeb Ur Rahman Mohammed is a validation manager at Texas Instruments within the connected microcontrollers business unit. He has held many different roles at TI, including design, application and validation engineer and graduated with a Master of Science and Ph.D. in electrical engineering from New Mexico State University.

Bill Xiobing Wu is a validation engineer at Texas Instruments. Bill graduated with a Ph.D. from University of Houston. He previously worked as a system application and characterization engineer for TI devices with Bluetooth Low Energy, Wi-Fi, GPS and FM technology.


 

Kraiburg kicks off program to develop renewables-based TPEs

Thu, 2019-12-05 07:50

Kraiburg TPE is embarking on what it describes as an ambitious campaign to develop custom-engineered thermoplastic elastomers containing variable proportions of renewable raw materials. By developing customer-specific and application-specific compounds using renewable raw materials, Kraiburg TPE is aiming to meet the growing demand for environmentally friendly and sustainable thermoplastic elastomers.

Kraiburg TPE sees tremendous potential for custom-engineered thermoplastic elastomers with adjustable proportions of renewable raw materials of up to 90%, in the consumer market as well as in the industrial and automotive markets.

Kraiburg points out that “bio” is a broad term that is by no means synonymous with “sustainable” in the sense of a strategy for saving resources and protecting the environment. Even renewable raw materials have carbon footprints, as well as water footprints, that can affect the environmental balance, depending on their provenance and the way they are grown. Factors that play a decisive role here include irrigation, fertilizers, transport energy and the energy consumed for reprocessing.

“Part of the challenge involves taking into account the environmental balance of the materials’ whole life cycles, including their impact on ecosystems and people’s health,” emphasizes CEO Franz Hinterecker from Kraiburg TPE. “It has also become apparent that what our customers expect from the properties of ‘biomaterials’ varies widely depending on the application – while at the same time we have to meet strict criteria regarding the materials’ conformity and performance.”

Kraiburg TPE’s modular system makes it possible to develop customer-specific materials with different proportions of renewable raw materials. Typical performance characteristics that are also relevant here include mechanical properties such as tensile strength and elongation, as well as processability, heat resistance and adhesion to ABS/PC or PP and PE, for example. The requirements are determined in close collaboration with each customer and translated into a sustainable and cost-effective solution by the company’s developers.

In classical approaches, it is technically possible to produce bio-based materials with very high proportions of renewable raw materials. However, materials of this kind usually suffer from very high raw-material costs while providing only limited mechanical properties. The modular system has now enabled Kraiburg TPE to resolve this contradiction almost completely by following a new, innovative approach alongside the classical one.

The initial pilot projects based on the classical approach are showing a trend towards bio-based, certifiable proportions of 20% and more. Their potential use extends to all TPE applications in the consumer, industry and automotive markets. Examples range from toothbrushes and hypoallergenic elastic watch straps to fender gaskets.


Qualcomm has big plans for 5G in 2020

Thu, 2019-12-05 07:00
The Snapdragon 865 (shown) can handle 5G and boasts an AI engine twice as powerful as the previous Snapdragon model and the ability to support 8K video and up to a 200-megapixel camera. (Image source: Qualcomm) 

Qualcomm's latest Snapdragon platforms are aimed squarely at bringing 5G devices to consumers next year.

This week, at its annual Snapdragon Tech Summit, the chipmaker unveiled two new mobile computing platforms – the Snapdragon 765 and 865 – both targeted at 5G speeds and artificial intelligence processing for Android-based devices.

“We need systems that put 5G and AI together,” Alex Katouzian, senior vice president and general manager, mobile at Qualcomm, told the Tech Summit audience. He outlined Qualcomm's roadmap for 2020, in which the company plans to be part of 5G devices released at all tiers, with AI integrated ubiquitously into them.

The Snapdragon 765 looks to be the more consumer-focused platform. With Qualcomm's X52 5G modem integrated, the 765 supports both millimeter wave (mmWave) and sub-6 frequencies for 5G and is capable of download speeds of up to 3.7 gigabits per second (Gbit/s), according to Qualcomm. It also supports 5G SA and NSA modes, TDD and FDD with dynamic spectrum sharing (DSS), global 5G roaming, and support for multi-SIM.

Katouzian said the 765 is targeted at serving three major pillars – photo and video, AI, and multiplayer gaming. The platform is equipped with the fifth generation of Qualcomm's proprietary AI Engine for handling various tasks such as creating better photos. The engine itself is capable of speeds of up to 5 tera (trillion) operations per second (TOPS). The ISP can capture 4K video and can support up to a 192-megapixel camera. Another version of the 765 – the 765G – will be specially optimized for online gaming experiences (the “g” stands for “gaming”). Snapdragon 765G offers a bit more performance. It's capable of up to 5.5 TOPS and has a boosted GPU for faster graphics rendering.

On the higher end, the Snapdragon 865, which will be packaged with Qualcomm's X55 modem-RFs (the X55 modem is not integrated in the 865 as the X52 is with the 765), kicks things up in terms of horsepower. Aimed at more premium applications, the 865 is targeting download speeds exceeding 5 Gbit/s, again using mmWave and sub-6. The 865's AI Engine's processing speed reaches up to 15 TOPs – double the performance of the previous Snapdragon, the 855.

Qualcomm is touting the 865 as the “world’s first 2-gigabit-per-second camera capable ISP.” The platform can capture 8K video at 30 frames per second, and when filming 4K video, each frame can be captured at 64 megapixels. It also supports up to a 200-megapixel camera.

Overall, Katouzian said the 865 offers a 25% increase in graphics performance over the 855 – meaning desktop features, like high-quality gaming, can be brought into the mobile space.

The 865 and 765/G will also be available as modular platforms.


5G hardware is coming

Qualcomm has already actively secured partnerships around the 765 and 865 and has been very active in pushing for 5G-enabled hardware for both mobile and desktop applications. Earlier this year Qualcomm and long-time collaborator Lenovo unveiled Project Limitless – a concept for a 5G-enabled PC. Based on Qualcomm's 7-nanometer 8cx 5G compute platform, Project Limitless demonstrated the idea of an “always on, always connected” PC that draws on 5G connectivity for cloud-based applications and storage as well as distributed computing.

According to Sergio Buniac, president of Lenovo subsidiary Motorola, the next generation of the newly released Razr will be based on Qualcomm's latest platforms. Motorola turned heads (and sparked some heavy early 2000's nostalgia) earlier this year when it announced the 2020 re-release of its once popular flagship phone, the Razr.

The new Razr is a foldable, clamshell phone based on the Snapdragon 710 platform. It'll be the latest device in a new wave of foldable screen devices coming to market such as Samsung's new Galaxy Fold, and Microsoft's Surface Neo foldable, dual-screen laptop.

Chinese electronics company Xiaomi has also made a big commitment to 5G. Speaking at the Snapdragon Tech Summit, Xiaomi's president, Bin Lin, said the company is committed to launching more than 10 5G smartphones in 2020. Among these will be the Mi 10, a Snapdragon 865-based phone that will feature a 108-megapixel camera.

Lin said Xiaomi also believes 5G will usher in new form factors for phones – enabling concept phones like Xiaomi's Mi Mix Alpha, a smartphone with a 180-degree wraparound touchscreen display, to become reality.

Qualcomm expects the first 765- and 865-based devices to hit markets as soon as the first quarter of 2020.

Chris Wiltz is a Senior Editor at Design News covering emerging technologies including AI, VR/AR, blockchain, and robotics.


The 2020 Ford Mustang Shelby GT500 is a towering achievement for Ford's engineers

Thu, 2019-12-05 06:30
The 2020 Ford Mustang Shelby GT500 poses with its 1968 Shelby GT500 forebear. (Image source: Ford Motor Co.)

“It was a lot of blood, sweat and tears from the team,” for the new 2020 Ford Mustang Shelby GT500 to achieve its astonishing levels of power, performance and capability, stated chief program engineer Ed Krenz.

The 760-horsepower, 625-lb.-ft. supercharged 5.2-liter double-overhead-cam V8 engine transfers power seamlessly through a 7-speed Tremec TR-9070 dual-clutch transmission to accelerate the car to 60 mph in 3.3 seconds and through the quarter mile in 10.7 seconds. Our own runs at the LVMS drag strip, testing the Shelby’s impressive launch control system, produced a pair of 11.4-second passes, so we didn’t spend any more time in pursuit of a few more tenths of a second.
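As a quick sanity check on those numbers, the 0-60 mph time implies an average acceleration. Constant acceleration is assumed for the average; a real launch is anything but constant.

```python
# Average acceleration implied by the article's 0-60 mph in 3.3 s figure.
MPH_TO_MS = 0.44704          # exact mph-to-m/s conversion factor
G = 9.80665                  # standard gravity, m/s^2

v = 60 * MPH_TO_MS           # 60 mph in m/s (~26.8 m/s)
a = v / 3.3                  # average acceleration over the run, m/s^2
print(round(a, 2), "m/s^2")  # ~8.13 m/s^2
print(round(a / G, 2), "g")  # ~0.83 g average
```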

The car boasts an incredible list of top-drawer components, because nothing less would have a chance to produce these results. Those impeccable parts contribute to the car’s astonishing performance numbers. But more important is the integration of those parts into a functional whole that is easy to drive fast thanks to the aforementioned engineers’ blood, sweat and tears.

This is in contrast to similarly powerful ground pounders, most especially the previous-generation 662-hp GT500 and the famously powerful 707-hp Dodge Challenger Hellcat and its 797-hp Redeye and 808-hp Demon iterations.

Those are cars that boasted impressive power, but struggled to make use of their muscle. Ford cast the 2014 GT500 as a drag racer, taking journalists to a drag strip to test its mettle but avoiding road racing courses.

Dodge might have followed the example, because while the different versions of the Challenger are all amazingly fast at the strip – the Demon famously finished the quarter mile in less than 10 seconds, earning it a letter banning it from NHRA-sanctioned races for being too fast for a car without a protective roll cage – none of them stop or turn in a way that inspires any kind of confidence. 

Image source: Ford Motor Co.

The previous GT500 and the Challenger variants all feature some impressive hardware, but it was seemingly not the right impressive hardware, or the pieces weren’t correctly integrated to produce the desired result. The 2020 GT500 is properly sorted and does produce the desired result, thanks to hard engineering work on the details.

It accomplishes that with a track wing aero package that produces as much as 550 lbs. of downforce at the rear of the car at 180 mph. Careful aerodynamic tuning at the Windshear rolling road wind tunnel in North Carolina ensured stable aerodynamic characteristics, while PowerFLOW computational fluid dynamics software from Dassault Systèmes modeled the cooling necessary for an engine producing this much power.
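Because aerodynamic downforce scales with the square of airspeed, the single published data point (550 lb at 180 mph) lets us estimate rear downforce at lower speeds. These scaled values are estimates under that idealized quadratic model, not Ford figures.

```python
# Estimate rear downforce at other speeds by quadratic scaling from the
# article's 550 lb at 180 mph figure. Idealized model: F proportional to v^2.

def downforce_lb(speed_mph: float, ref_lb: float = 550.0, ref_mph: float = 180.0) -> float:
    """Downforce in pounds, scaled quadratically from the reference point."""
    return ref_lb * (speed_mph / ref_mph) ** 2

for mph in (60, 90, 120, 180):
    print(f"{mph:3d} mph -> {downforce_lb(mph):6.1f} lb")
```

The takeaway from the scaling is that at typical road-course corner speeds the wing produces only a fraction of its headline figure; at 90 mph the model gives 137.5 lb.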

Image source: Ford Motor Co.

Indeed, Krenz reminds us that while the GT500’s net power rating is 760 hp, at redline, its 2.65-liter Eaton Roots-type supercharger is consuming about 100 hp, so the engine is actually creating 860 horsepower, and needs to shed a corresponding amount of heat.
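A quick unit check ties those power and heat figures together; the only inputs are the article's 760 hp net, roughly 100 hp of supercharger drive loss, and roughly 230 kW of rejected heat.

```python
# Unit check on the power figures above: net output plus supercharger
# drive loss gives gross output, and the rejected heat converts from kW
# to hp with the mechanical-horsepower definition.
HP_TO_KW = 0.7456999  # 1 mechanical hp = 745.6999 W

gross_hp = 760 + 100                             # net + parasitic loss
print(gross_hp, "hp gross")                      # 860 hp gross
print(round(100 * HP_TO_KW, 1), "kW parasitic")  # ~74.6 kW
print(round(230 / HP_TO_KW), "hp of heat")       # ~308 hp equivalent
```

That is, the cooling system sheds roughly 308 hp worth of heat, consistent with an 860-hp gross engine running well below 100% thermal efficiency.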

That’s what the car does, with a front opening that is twice the size of that in the Shelby GT350, which lets the GT500 expel more than 230 kilowatts of heat. Much of that vents through a central hood vent measuring 6 square feet. Even with that ventilation, air pressure beneath the hood is so high that the hood visibly lifted an inch during high-speed runs, so engineers added locking hood pins to secure it separately from the regular latch.

This cutaway version of Eaton's supercharger reveals the inverted layout, with the rotors at the bottom and the intercooler's air-to-water heat exchanger at the top. (Image source: Ford Motor Co.)

That supercharger is Eaton’s latest, and it provides a maximum boost of 12 psi. It is an inverted design, placing the spinning rotors at the bottom of the device and the air-to-water heat exchanger for the intercooling system at the top. This helps tuck the blower more compactly into the valley of the engine’s vee, and it puts the heavy rotor shafts lower, contributing to a lower center of gravity.

Down at the bottom of the engine, oil drains into a conventional wet sump oil pan rather than a racing-style dry sump, as is used by the new C8 Corvette. With more space available beneath the engine in the taller Mustang body, Ford engineers simply upgraded the oil pan with side-saddle reservoirs featuring active hinged doors that trap oil to ensure a consistent supply during hard cornering maneuvers. 

This all requires a huge amount of fuel, and Krenz tells us that while the GT500 was held to all of Ford’s regular durability tests, it got special dispensation for failing to complete the mandatory 30 minutes of full-power track testing. That was because the car emptied its 16-gallon fuel tank just 25 minutes into the test! 

We similarly found that during track testing, the car’s fuel light warning of only 50 miles of remaining driving range illuminated very shortly after filling the tank, which translates to something in the neighborhood of 3.5 miles per gallon during track testing. The EPA says the Shelby is good for 12 mpg in city driving and 18 mpg on the highway.
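The rough arithmetic behind that mileage figure can be reproduced if we assume an average track speed; the 135 mph used below is a hypothetical illustration value, not a number from Ford or from our testing.

```python
# Rough track fuel-economy arithmetic behind the "neighborhood of 3.5 mpg"
# figure: the car emptied its 16-gallon tank in about 25 minutes of
# full-power running. The average lap speed is an assumed value.
tank_gal = 16.0
minutes = 25.0
avg_mph = 135.0                      # hypothetical average track speed

miles = avg_mph * minutes / 60.0     # distance covered in 25 minutes
mpg = miles / tank_gal
print(round(miles, 1), "miles")      # ~56 miles
print(round(mpg, 2), "mpg")          # ~3.5 mpg
```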

The GT500's cast oil pan contributes to the engine's structural rigidity and employs a maze of baffles and hinged doors to preserve a continuous oil supply even in high-g corners. (Image source: Ford Motor Co.)

To bring all this velocity to a halt, Brembo supplied monstrous six-piston front calipers that have 20 percent more swept area than even the huge Brembos on the GT350. With the enlarged 420mm (16.53-inch) rotors, the GT500’s front brakes possess 30 percent more thermal mass than the GT350’s capable front brakes. The rear brakes feature an interesting production application for 3D printing: an unglamorous bracket for the wire running to the car’s electric parking caliper.

With stopping thoroughly sorted, Ford turned its attention to making a car this big and heavy handle like a sports car. Incredibly, they achieved that goal. A primary contributor to that achievement is a magnificent magnesium front strut tower brace that reinforces the front suspension’s stability despite the enormous forces at work.

Image source: Ford Motor Co.

Designing and casting the magnesium brace took the expected amount of effort, and was straightforward, Krenz reported. What was unexpected was the subsequent interaction between the magnesium brace and the steel strut towers to which it is bolted. The dissimilar metals caused corrosion to erupt in test cars, and not just in wet rainy climates. Just regular summer humidity was enough to cause corrosion. The Ford team found a solution with coated washers that isolate the two metals from one another, Krenz said.

The cornering force that is trying to twist the GT500’s body out of shape and is being resisted by that brace is generated through the car’s 20 x 11-inch wheels wrapped in 305/30R20 front and 315/30R20 rear Michelin Pilot Sport Cup 2 tires for maximum grip. The BWI Magnaride actively adjustable dampers and larger 36mm front and rear anti-roll bars help preserve an even keel when cornering or braking, without the pitch and roll exhibited by the previous edition or the various Challenger models.

Just as the braking is stable and confidence-inspiring at the very limit, the steering is remarkably precise, letting the driver place the car exactly where it should be. With the stunning grip of the Michelins, the GT500 can carry unexpected speed into corners and maintain its composure.

Engineers attribute a portion of this precision to the carbon fiber wheels of the carbon option package. When wheels get as large as these 20-inchers, there is a substantial amount of deflection at the rim under hard cornering loads, which contributes to vague steering response in most such cars. The primary advantage of the carbon fiber wheels over the base car's forged aluminum wheels is this added stiffness, rather than the reduced unsprung mass and rotational inertia commonly cited as the benefits of such costly wheels.

Image source: Ford Motor Co.

Another problem normally found with tires as wide as the 305mm-wide fronts on the GT500 is a tendency for such cars to follow road surface imperfections rather than tracking straight down the road. This is often called “tramlining” after the effect of a tram following its tracks, though Krenz refers to it as “rut wander” as the affected cars follow the usually unnoticed ruts that occur in the wheel tracks of many highways.

Ford avoided this problem in the GT500 by dialing additional trail into the steering uprights, also known as spindles, knuckles or kingpins. The added trail comes from increasing the caster angle, the fore/aft tilt of the upright from a vertical axis, which strengthens the car’s inclination to self-center its steering. This increases both steering effort and the feel fed back to the driver’s hands for what the tires are doing, at the potential expense of heavy steering at parking-lot speeds. But with power-assisted steering, that side effect isn’t noticeable.
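As a rough illustration of the geometry, mechanical trail grows with the tangent of the caster angle times the tire radius. The sketch below uses purely illustrative numbers, not Ford's specifications:

```python
import math

def mechanical_trail(tire_radius_mm, caster_deg):
    """Approximate mechanical trail created by tilting the steering axis
    rearward by the caster angle (flat-road, small-offset geometry)."""
    return tire_radius_mm * math.tan(math.radians(caster_deg))

# Illustrative values only (not Ford's specs): a ~350 mm tire radius
# with 7 degrees of caster yields roughly 43 mm of trail.
print(round(mechanical_trail(350, 7.0), 1))
```

More trail means the contact patch sits farther behind the steering axis, so cornering forces generate a larger self-centering moment, which is exactly the self-centering inclination described above.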

Image source: Ford Motor Co.

What is noticeable is that while cars with fat front tires like top of the line Porsche 911s and Dodge Challengers can wander in their lanes like distracted dogs on their leashes, the GT500 tracks true to its intended course, which is much less fatiguing for the driver.

This solution worked so well that Ford decided to apply the new steering geometry to the GT350 too. Some of the other hardware bits seem like they might also be candidates to filter down to the lighter, naturally aspirated Shelby.

They will help improve that car too, but only if they are applied with the same thoughtful engineering that has tuned the GT500’s assemblage of high-performance components so perfectly. It is unusual for a car with such superb componentry to deliver more than the sum of its excellent parts, but that’s exactly what the GT500 has done. This is the poster child for the value of properly engineering a car rather than simply throwing check-list hardware pieces at a performance model and expecting good results.

Image source: Ford Motor Co.

Dan Carney is a Design News senior editor, covering automotive technology, engineering and design, especially emerging electric vehicle and autonomous technologies.

  DesignCon: By Engineers, For Engineers

January 28-30: North America's largest chip, board, and systems event, DesignCon, returns to Silicon Valley for its 25th year! The premier educational conference and technology exhibition, this three-day event brings together the brightest minds across the high-speed communications and semiconductor industries, who are looking to engineer the technology of tomorrow. DesignCon is your rocket to the future. Ready to come aboard? 
Register to attend!

 

Developing an embedded software build pipeline

Thu, 2019-12-05 06:15

One interesting fact that I’ve noticed about embedded software development is that development processes and techniques tend to lag the general software industry. When I first started to write embedded software back in the late 1990s, the focus seemed to be on moving away from writing assembly language and adopting C along with the best practices that went with it.

Applications were monolithic beasts with little to no organization. If you look at the goals for the software industry at that time, there was a big push into object-oriented design, reuse and scalability for applications. Even today, the general software industry has adopted build pipelines, continuous integration and test harnesses while the general embedded industry seems to barely realize that these processes exist (at least among companies with market caps well below a billion dollars). Let’s examine how embedded developers can create their own build pipelines.

A build pipeline overview

Developing a more sophisticated build pipeline can have dramatic effects on the embedded software development life cycle. For example, over the lifetime of a software product, a well-thought-out build pipeline can:

  • Improve software quality
  • Decrease the time spent debugging
  • Decrease project costs
  • Enhance the ability to meet deadlines
  • Simplify the software deployment process

A software build pipeline is nothing more than the process and tools that are used to manage, test and deploy an application. For example, embedded software developers will typically commit their software to a revision control system, manually test their code and then issue an application image that can be manually deployed to their system. This build pipeline is very traditional, but it lacks the sophistication and automation that a modern build pipeline can offer.

A more modern build pipeline that embedded developers can leverage consists of four stages: a manual trigger followed by stages that can be completely automated. These stages include:

  • Committing software (The manual trigger)
  • Build and Analysis (Automated)
  • Test and Reporting (Automated)
  • Deployment (Automated or manual)

Each stage has its own process and tools associated with it, but the last three stages can all be done as part of a continuous integration / continuous deployment (CI/CD) process. The idea behind CI/CD is that a developer commits their code to the repository at the end of the day, which kicks off a series of automated builds, metrics and tests that provide the developer feedback the next day or, if everything goes well, automatically deploy the firmware to devices in the field. An overview of this process and the general tools involved can be seen in the figure below:

This diagram consists of two halves, an upper half that describes the process that is being followed and bottom half that describes the tools that are involved. (Image source: notafactoryanymore.com)

Notice that this diagram consists of two halves: an upper half that describes the process being followed and a bottom half that describes the tools involved. There is also a barrier between the test-and-report stage and the deployment phase. Software should only be deployed if it has passed all the build and analysis criteria in addition to all the test cases. If any of the builds or test cases produce warnings or errors, that feedback is reported to the developer, which triggers updates to the software and a new commit, which then kicks off the automated stages again.
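The gating logic described above can be sketched as a small orchestration script. This is a hypothetical, minimal example; the `make` targets are placeholders for a real project's actual build, analysis and test tools:

```python
import subprocess

# Hypothetical stage commands; a real pipeline would invoke the project's
# actual build, static-analysis, and test tooling.
STAGES = [
    ("build", ["make", "all"]),
    ("static-analysis", ["make", "analyze"]),
    ("test", ["make", "test"]),
]

def run_pipeline(run_cmd=None, deploy=False):
    """Run each automated stage in order, stopping at the first failure
    so feedback goes back to the developer instead of into the field."""
    if run_cmd is None:
        # Default runner shells out and reports the exit code.
        run_cmd = lambda cmd: subprocess.run(cmd).returncode
    for name, cmd in STAGES:
        if run_cmd(cmd) != 0:
            print(f"stage '{name}' failed; deployment blocked")
            return False
    if deploy:
        run_cmd(["make", "deploy"])  # deploy only after every gate passes
    return True
```

Injecting `run_cmd` keeps the orchestration itself testable without a real toolchain; a CI server such as Jenkins plays the same role at a larger scale.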

Enhancing the embedded software build pipeline

Creating an automated build pipeline is not going to happen overnight. It takes time to research the right tools, implement them, test the process and then train the engineers on how to use it properly. As I mentioned before, the benefits can easily outweigh those costs. So how does one go about enhancing their software build pipeline?

First, it’s important to make sure that you have a robust revision control process in place. Most teams that I talk with nowadays use version control. This is a great improvement from just a few years ago, but many teams mention that they only commit code once a week or even less often. I believe that software should be developed in small enough chunks that code is committed at least once a day, if not several times a day. Doing so allows the pipeline to provide feedback much more frequently.

Second, you need to implement a continuous integration server. One of the most popular ones out there that can also be used by embedded software developers is Jenkins (although there are others out there). A continuous integration server is designed to automate building and deploying software and you’ll find that there are often many integrations that can be used to automate nearly anything you might want.

Third, you’ll want to make sure that your compiler and static code analysis tools can be executed through a command-line interface or that they include plug-ins for your continuous integration server. What if you don’t have a static code analysis tool? Static code analysis is an important step in the software development process, and now is a great time to find one.
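One practical way to gate a pipeline on those command-line tools is to tally their diagnostics. The sketch below is a hypothetical helper that assumes GCC-style `file:line: warning:` output; other tools would need their own patterns:

```python
import re

def count_diagnostics(tool_output):
    """Tally warnings and errors from GCC-style console output, where
    diagnostics look like "file.c:12: warning: ..." or ": error: ..."."""
    counts = {"warning": 0, "error": 0}
    for line in tool_output.splitlines():
        match = re.search(r":\s*(warning|error)\s*:", line)
        if match:
            counts[match.group(1)] += 1
    return counts

sample = ("main.c:12: warning: unused variable 'x'\n"
          "main.c:40: error: 'foo' undeclared")
print(count_diagnostics(sample))  # {'warning': 1, 'error': 1}
```

A CI job can then fail the build whenever the error count is nonzero, or when the warning count grows beyond an agreed baseline.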

Fourth, and perhaps most difficult, is selecting a test harness and integrating it into the development process. Automated tests are great for regression testing and verifying software, but they do require that the tests be designed and implemented as part of the development process. Automating tests that don’t exercise the software fully can leave holes in the software and result in a deployment that is lower quality than one would expect. For this reason, test harnesses should be adopted from the beginning if possible and integrated into a process like test-driven development (TDD).
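Embedded test harnesses are usually C frameworks such as Unity or CppUTest, but the TDD shape is the same in any language. Here is a minimal Python sketch, with `clamp_adc` standing in as a hypothetical function under test:

```python
def clamp_adc(raw, lo=0, hi=4095):
    """Hypothetical function under test: clamp a 12-bit ADC reading."""
    return max(lo, min(hi, raw))

# In TDD, the cases below are written first; the implementation above is
# then grown until every case passes, and the cases stay on as the
# regression suite that the CI server reruns on every commit.
def test_in_range_passes_through():
    assert clamp_adc(1000) == 1000

def test_negative_clamps_to_zero():
    assert clamp_adc(-5) == 0

def test_overflow_clamps_to_full_scale():
    assert clamp_adc(5000) == 4095

for case in (test_in_range_passes_through,
             test_negative_clamps_to_zero,
             test_overflow_clamps_to_full_scale):
    case()  # a CI server would run these via a harness such as pytest
```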

Finally, you don’t have to do all of this at once. Enhancing the build pipeline can be done in steps. Work through the above steps one at a time and build out your toolchains and processes until they are rock solid. Then add to each phase until you eventually have a modern build pipeline that fully automates your builds, testing and deployment.

Conclusions

Modernizing the embedded software build pipeline can generate a lot of benefits to the development team and the business in general. Just like with any process though, properly building up a modern build pipeline for embedded systems development requires a time and budget investment in order to architect and implement the pipeline. Given the volatile environment that I see so many development teams in, a modern build pipeline can help to illuminate the path forward, monitor software quality and even simplify software updates.

Jacob Beningo is an embedded software consultant who currently works with clients in more than a dozen countries to dramatically transform their businesses by improving product quality, cost and time to market. He has published more than 200 articles on embedded software development techniques, is a sought-after speaker and technical trainer, and holds three degrees which include a Masters of Engineering from the University of Michigan. Feel free to contact him at jacob@beningo.com, at his website, and sign-up for his monthly Embedded Bytes Newsletter.


Insects inspire design of metal that’s impossible to sink

Thu, 2019-12-05 06:00

Insects that can survive in water are the inspiration behind a new type of metal developed by researchers at the University of Rochester that is so water-resistant it doesn’t sink.

A team in the lab of Chunlei Guo, a university professor of optics and physics, developed the metal, which in tests proved so water-repellent that it would not go under the surface even after being punctured and damaged.

The superhydrophobic metallic structure developed by researchers at the University of Rochester remains afloat even after significant structural damage—punctured with six 3-millimeter diameter holes and one 6-millimeter hole. (Image source: University of Rochester)

Diving bell spiders and rafts of fire ants inspired the design of the structure—in particular, the way these creatures can survive long periods under water or on its surface. These creatures manage by trapping air in enclosed areas in their bodies.

For example, the diving bell spider creates a dome-shaped web that is filled with air. The spider carries the air between its hydrophobic legs and abdomen, Guo said. In a similar way, fire ants can form a raft in the water by trapping air in their bodies.

Guo and his team developed a way to use femtosecond bursts of lasers to “etch” the surfaces of metals with intricate micro- and nanoscale patterns. Like the insect behavior, these trap air to make the surfaces superhydrophobic, or water repellent. “The key insight is that multifaceted superhydrophobic (SH) surfaces can trap a large air volume, which points towards the possibility of using SH surfaces to create buoyant devices,” researchers wrote in a paper in ACS Applied Materials and Interfaces.

Creating the ‘unsinkable’ factor

However, the etching alone wasn’t enough to make the structure permanently unsinkable; researchers found that after long periods of immersion in water, the etched metal surfaces began to lose their hydrophobic properties. So the team went one step further, etching two parallel aluminum plates and facing them inward rather than outward, so the treated surfaces are enclosed and protected from external wear and abrasion.

Researchers also separated the plates by just the right distance to trap and hold enough air to keep the structure floating, creating a waterproof compartment. The superhydrophobic surfaces keep water from entering the compartment even when the structure is forced underwater.
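The underlying physics is Archimedes' principle: the assembly floats as long as the combined density of the plates plus trapped air stays below that of water. A quick sketch with purely illustrative numbers (not figures from the paper):

```python
def floats_in_water(structure_mass_g, enclosed_volume_cm3,
                    water_density_g_cm3=1.0):
    """Archimedes' principle: the structure (plates plus trapped air)
    floats when its overall density is below that of water."""
    return structure_mass_g / enclosed_volume_cm3 < water_density_g_cm3

# Illustrative numbers only: 30 g of aluminum plates enclosing 40 cm^3
# of trapped air gives an average density of 0.75 g/cm^3, so it floats.
print(floats_in_water(30.0, 40.0))  # True
```

This is also why the punctured structure keeps floating: as long as enough air volume remains trapped, the average density stays below water's.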

Though the team used aluminum here, the etching process “could be used for literally any metals, or other materials,” Guo said. They tested the metallic structures by holding them underwater for two months; even after that time, they immediately bounced back to the surface when released.

The team found the structure didn’t sink even after being punctured multiple times, because air remains trapped in the other parts of the compartment or adjoining structures.

The team expects its work can be used to inform the design of metals for ships that will be nearly impossible to sink. It can also be used for wearable floatation devices that remain afloat even after being punctured.

Elizabeth Montalbano is a freelance writer who has written about technology and culture for more than 20 years. She has lived and worked as a professional journalist in Phoenix, San Francisco and New York City. In her free time she enjoys surfing, traveling, music, yoga and cooking. She currently resides in a village on the southwest coast of Portugal.


How to think about plastics in 2020

Wed, 2019-12-04 16:29

Since 1950, approximately 8.3 billion metric tons of virgin plastics have been produced worldwide, the equivalent of 176 million big rigs.

Less than 20% of that plastic has been recycled or incinerated, leaving nearly 80% to accumulate in landfills or as litter in our natural environment. Despite its significant contributions to innovation, the plastics industry has garnered increasing criticism over the years for its environmental impact. In a poll conducted by market research firm Morning Consult in 2018, a majority of people (55%) reported that they did not believe corporations were doing enough to reduce waste that could make it into the environment, and two-thirds of individuals (66%) reported that they would view companies more favorably if they implemented policies to reduce plastic waste.

So, why do we continue to use plastics in the first place?

The argument to remove plastics from our way of life entirely is not a feasible option for Alex Hoffer, Vice President of Sales and Operations at Hoffer Plastics Corp.

The technical answer is that plastic has a high strength-to-weight ratio and can be easily shaped into a wide variety of forms that are impermeable to liquids and are highly resistant to physical and chemical degradation. These materials can be produced at a relatively low cost, making it easier for companies to sell, scale, save and so forth. The primary challenge is that the proliferation of plastics in everyday use in combination with poor end-of-life waste management has resulted in widespread and persistent plastic pollution. Plastic pollution is present in all of the world’s major ocean basins, including remote islands, the poles and the deep seas. An additional 5 to 13 million metric tons of plastic waste enter the oceans every year.

However, consider for a moment the possibility that the plastics industry is doing more good than harm, and that the environmental issues the industry faces have more to do with recycling than production.

Here is how we should be thinking about plastics in 2020.

Plastics and the environment

Austrian environmental consultancy Denkstatt recently conducted a study to determine the impact of farmers, retailers and consumers using recyclable products (wood, tins, glass bottles and jars, and cardboard) to package their goods rather than plastic. What they found was that the mass of packaging would increase by a whopping 3.6 times and would take more than double the energy to make, thereby increasing greenhouse gas emissions by an astounding 2.7 times.

One common proposal for replacing plastics with different materials is to replace plastic bags with paper ones in grocery stores. While this may sound like a more sustainable solution, the data does not support it. By volume, paper takes up more room in landfills and does not disintegrate as rapidly as plastic. Because of this, plastic bags leave half the carbon footprint of cotton and paper bags.

Plastics and hunger

In my visits to the Northern Illinois Food Bank, I’ve had the honor to serve those in need of access to nutritious food. While helping stock the pantry or pass out holiday baskets, I couldn’t help but notice how food packaging alone impacts visitors’ perceptions. Most of the food at the food bank is canned or jarred, yet it is the plastic-wrapped food that always looks fresher and a little less dangerous.

Now, consider the properties of plastic that make it so attractive: It is durable, flexible, does not shatter, can breathe (or not) and is extremely lightweight. As a result, food and drink are protected from damage and preserved for previously unimaginable lengths of time.

The European Packaging and Film Association (PAFA) says that the average spoilage of food between harvest and table is 3% in the developed world, compared to 50% in developing countries where plastic pallets, crates, trays, film and bags are not as commonly available. This data point shows us that plastics play an integral role in the preservation of food. In a world where many go hungry, it is advantageous to continue to support an industry that helps to keep food on tables and families fed, while reducing food waste.

Plastics and cars

Turning our attention to plastics’ relationship with the automotive industry, let's start with safety. The National Highway Traffic Safety Administration estimates that today’s seat belts, which are made with industrial-strength plastics, have the potential to reduce auto fatalities by as much as 45% and serious injury by 50%, compared with not being buckled in.

Beyond the seat belt and other accessories, modern plastics can be made to be resilient and flexible, soft and cushioned, or tough and shatter-resistant. This allows them to contribute to vehicle safety in a substantial way.

Car manufacturers rely on plastic to make lightweight materials that reduce the weight of automobiles so they can meet the Corporate Average Fuel Economy (CAFE) standard, which is set to increase to 54.5 miles per gallon by 2025. I predict that the use of plastics to minimize the weight of cars will be an integral part of car manufacturers’ efforts to meet these new standards. Therefore, the plastics industry will be contributing to improvements in fuel efficiency that will ultimately reduce the environmental footprint of vehicles.

Plastics and healthcare

Did you know that plastic materials increase the efficiency and hygiene of your physician’s office? Plastic syringes and tubing are disposable to reduce disease transmission. Plastic intravenous (IV) bags and tubing that store and deliver blood, fluid, and medicine let healthcare workers more easily view dosages and replacement needs. Plastic heart valves and knee and hip joints save lives and make patients’ lives more comfortable. Plastic prostheses help amputees regain function and improve their quality of life.

Plastics and jobs

Consider a world in which the plastics industry in America suddenly came to an end. While some would celebrate this, I imagine that the cheers from those who are “anti-plastic” would very quickly be drowned out by the 989,000 individuals in the United States who collect their paychecks and support their families thanks to job opportunities within the plastics industry. 

In 2020, the argument to remove plastics from our way of life entirely is not a feasible option. Plastics’ contribution to the health of our environment, the safety and durability of our healthcare products, the fuel efficiency on our roads and the growth of the economy—and so much more—tells us that it is worth putting our best efforts toward understanding this debate further.

About the author

Alex Hoffer is Vice President of Sales and Operations at Hoffer Plastics Corp., a leading global supplier of tight-tolerance, custom injection molded parts. He leads the company’s sales growth strategy across a diverse set of markets, including flexible and rigid packaging, automotive, appliances and consumer industrial. Alex Hoffer’s leadership in developing the Trust-T-Lok product line for spouted pouches has helped to supply more than one billion Trust-T-Lok fitments to the international marketplace. Today, his focus is on launching a fully recyclable pouch, and utilizing spouted pouch technology to address food waste and other human impact challenges.

10 tiny satellite technologies

Wed, 2019-12-04 07:15

Tiny satellites have made space accessible to a new generation of university students and private companies, and have even helped cash-strapped government agencies like NASA. Generally known as nano-satellites (nanosats) or cube-satellites (cubesats), this technology has been made possible by the semiconductor-driven miniaturization of electronic and electro-mechanical systems. In recognition of the trend, the IEEE has even launched a new journal on “Miniaturization for Air and Space Systems” (J-MASS).

Mass is a premium consideration when placing anything into space. That’s why the names of tiny satellites depend upon their mass. Nanosats are the general category for any satellite with a mass from 1 kg to 10 kg. Nanosats include the categories of well-known cubesats and the perhaps less well-known PocketQubes, TubeSats, SunCubes, ThinSats and non-standard picosatellites. Chipsats - cracker-size, gram-scale wafer miniprobes - are not considered nanosats but have been called attosats by some.
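These mass bands can be captured in a small helper. The 1 kg to 10 kg nanosat range is from the text above; the picosat and femtosat bands below are commonly cited conventions, not definitions from this article:

```python
def satellite_class(mass_kg):
    """Classify a small satellite by mass. The 1-10 kg nanosat band is
    from the article; the other bands are commonly cited conventions."""
    if mass_kg < 0.1:
        return "femtosatellite"   # includes gram-scale chipsats
    if mass_kg < 1:
        return "picosatellite"
    if mass_kg <= 10:
        return "nanosatellite"    # cubesats, PocketQubes, TubeSats, ...
    return "microsatellite or larger"

print(satellite_class(1.33))  # a 1U cubesat is roughly 1.33 kg
```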

Cubesats (cubesatellites, cube satellites) are a type of nanosatellite defined by the CubeSat Design Specification, unofficially called the cubesat standard.

The original goal of all these tiny, miniature satellites was to provide affordable access to space for the university science community. Many major universities now have a space program, as do several private company startups and even government agencies like NASA and the DoD.

The focus of this slideshow is to show nanosat technologies, from the carriers and launch mechanisms to several NASA cubesats performing a variety of missions. We’ll end with an example of a chipsat. Let’s begin!

 

John Blyler is a Design News senior editor, covering the electronics and advanced manufacturing spaces. With a BS in Engineering Physics and an MS in Electrical Engineering, he has years of hardware-software-network systems experience as an editor and engineer within the advanced manufacturing, IoT and semiconductor industries. John has co-authored books related to system engineering and electronics for IEEE, Wiley, and Elsevier.

 

Here’s how energy consumption has changed since 1850

Wed, 2019-12-04 07:00

Did you ever wonder how much electrical energy people consumed in the past? Ongoing surveys by the US Energy Information Administration (EIA) provide answers to that question while comparing consumption by country, population size, energy source, and more. For example, the average annual electricity consumption for a US residential utility customer in 2018 was 10,972 kilowatthours (kWh), an average of about 914 kWh per month or about 30 kWh per day (see below). If the numbers seem confusing, just remember that a kilowatthour is equivalent to a power consumption of 1,000 watts for 1 hour.

On average, the typical American uses 41% of their electric energy on space heating, and 35% on appliances, electronics, and lighting.

In the chart, equivalent energy consumption before electrical power was widely available (e.g., 1850 through 1930) has been converted from wood, coal, and gas sources using the standard conversion 1 BTU = 0.000293071 kWh.
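The arithmetic behind these figures is straightforward:

```python
BTU_TO_KWH = 0.000293071  # the standard conversion used for the chart

def btu_to_kwh(btu):
    """Convert British thermal units to kilowatthours."""
    return btu * BTU_TO_KWH

# The EIA figures quoted above: 10,972 kWh per year for the average US
# residential customer works out to roughly 914 kWh/month and 30 kWh/day.
annual_kwh = 10972
print(round(annual_kwh / 12))   # 914 kWh per month
print(round(annual_kwh / 365))  # 30 kWh per day
```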

According to the EIA, the three major fossil fuels – petroleum, natural gas, and coal, which together provided an average of 87% of total US primary energy use over the past decade (2000 to 2010) – have dominated the US fuel mix for well over 100 years. The agency predicts that the continuation of current laws, regulations, and policies will result in a continued heavy reliance on the three major fossil fuels through at least 2035. Still, the total fossil fuel share of energy consumption has been decreasing as renewable energy and nuclear electric power experience modest growth, and non-hydroelectric renewable energy is predicted to more than double between 2009 and 2035.

Take a look at the rest of the survey results in the infographic below:

 

John Blyler is a Design News senior editor, covering the electronics and advanced manufacturing spaces. With a BS in Engineering Physics and an MS in Electrical Engineering, he has years of hardware-software-network systems experience as an editor and engineer within the advanced manufacturing, IoT and semiconductor industries. John has co-authored books related to system engineering and electronics for IEEE, Wiley, and Elsevier.

 

How Wi-Fi 6 and 5G will transform factory automation

Wed, 2019-12-04 06:30

A key technology trend for automation and control in 2020 and beyond is the emergence of wireless communications including 5G, Wi-Fi 6, LoRaWAN and more. An obvious benefit for factory automation is the use of wireless communication for remote monitoring and remote operation of physical assets, but an equally important benefit is the ability to replace cables, unreliable Wi-Fi and the many industrial standards in use today.

Many experts predict that 5G will make an outsized impact on Internet of Things (IoT) applications, driven by higher performance, increased reliability and robustness along with lower latency, but other wireless technologies and the new Wi-Fi 6 are also bringing new capabilities expected to make an impact as well.

New WiFi 6 CERTIFIED mark will denote products implementing this new technology. (Image source: WiFi Alliance)

Certification of Wi-Fi 6

One major step forward for wireless technologies in industrial communications is the recent certification of Wi-Fi 6. The announcement by the Wi-Fi Alliance moves this technology ahead by enabling vendors to release certified products in advance of the IEEE 802.11ax ratification process, expected to be completed in 2020.

Wi-Fi CERTIFIED 6 delivers advanced security protocols and requires the latest generation of Wi-Fi security, Wi-Fi CERTIFIED WPA3. 

Here is a short listing of new advanced capabilities:

  • Orthogonal frequency division multiple access (OFDMA): shared channels increase network efficiency and lower latency for uplink and downlink traffic in high-demand environments
  • Multi-user multiple input, multiple output (MU-MIMO): allows more downlink data to be transferred at once and enables an access point to transmit data to a larger number of devices concurrently
  • 160 MHz channels: increased bandwidth delivers greater performance with low latency
  • Target wake time (TWT): improves battery life in Wi-Fi and IoT devices
  • 1024 quadrature amplitude modulation (1024-QAM): increases throughput by encoding more data in the same amount of spectrum
  • Transmit beamforming: higher data rates at a given range produce greater network capacity
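The 1024-QAM gain in that list is easy to quantify: each M-QAM symbol encodes log2(M) bits, so 1024-QAM carries 10 bits per symbol against 8 for the 256-QAM of the previous Wi-Fi generation, a 25% raw per-symbol gain in the same spectrum (all else being equal):

```python
import math

def bits_per_symbol(qam_order):
    """An M-QAM constellation encodes log2(M) bits in each symbol."""
    return int(math.log2(qam_order))

wifi6, wifi5 = bits_per_symbol(1024), bits_per_symbol(256)
print(wifi6, wifi5)  # 10 bits vs. 8 bits: a 25% raw per-symbol gain
```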

Synergy of 5G and Wi-Fi 6

Wireless vendors are anticipating that 5G and Wi-Fi 6 will be deployed together in smart manufacturing applications. The two share technology that makes wireless solutions more deterministic, which is especially important for mission-critical IoT devices used in factory automation. The anticipated tiered release and extended timeline for 5G deployment is expected to result in Wi-Fi 6 rolling out more quickly than 5G.

A Cisco blog article,  “Comparing Wi-Fi 6 and 5G—it's more than a good connection” provides more information on the synergy between these two technologies.

Wi-Fi and LoRaWAN

Another interesting development is potential new IoT use cases incorporating WiFi and LoRaWAN technologies. According to a new white paper from the LoRa Alliance, new opportunities are being created when Wi-Fi networks that are traditionally built to support critical IoT are merged with LoRaWAN networks that are traditionally built to support low data rate massive IoT applications.

Massive IoT versus critical IoT applications illustrates the wide variety of potential wireless solutions, and specific technology requirements for each area. (Image source: LoRa Alliance)

The argument is that there is a growing set of IoT use cases that rely on connectivity spanning large areas that are also able to handle a large number of connections. LoRaWAN as a technology covers long-range use cases at low data rates. This includes hard-to-reach locations such as temperature sensors in a manufacturing setting or vibration sensors in concrete.

Application areas include smart buildings, residential connectivity along with automotive and smart transportation. Hybrid use cases identified in the paper include location and video streaming.

The full white paper on this topic is available at the LoRa Alliance website.

Al Presher is a veteran contributing writer for Design News, covering automation and control, motion control, power transmission, robotics, and fluid power.


Flexible robot grows like a plant to perform in tight spaces

Wed, 2019-12-04 06:00

Scientists increasingly are designing robots that can perform tasks alongside humans and do things that are too dangerous or impossible for humans. One of those tasks is to navigate tight spaces that humans have a hard time navigating.

The new “growing robot” developed by researchers at MIT can be programmed to grow, or extend, in different directions, based on the sequence of chain units that are locked and fed out from the “growing tip,” or gearbox. (Image source: MIT)

MIT engineers have designed a robot with an appendage flexible enough to twist and turn in any necessary configuration that also can support heavy loads and apply enough power to put together parts in tight spaces. Once one task is complete, the robot can pull back the arm and extend it again at a different length or shape depending on what the next task demands.

While robots currently being designed to perform similar tasks are typically made of soft materials, the MIT robot is different, said Harry Asada, MIT professor of mechanical engineering in whose lab the robot was designed. Instead, the team designed its robot by making clever use of rigid materials.

Inspired by plants

That team--led by Tongxi Yan, a former graduate student in Asada’s lab--took inspiration from the way plants grow to inform the design of the robotic arm using these materials. Plants transport fluid nutrients to their tips as they grow, then convert them into a solid material to produce a supportive stem. The MIT team mimicked this by designing the appendage in a similar way.

They designed a gearbox to represent the robot’s growing tip, similar to the bud of a plant, which is the part where nutrients flow up and feed a more rigid stem. In the box, researchers built a system of gears and motors which pulls a fluidized material. The material comprises 3D-printed plastic units interlocked with each other, similar to a bicycle chain.

The chain feeds into a box and turns around a winch which then sends it through a second set of motors programmed to lock certain units in the chain to neighboring units. This action creates a rigid appendage as it is fed out of the box.
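The idea that a locked sequence of chain units determines the appendage's final shape can be illustrated with a short sketch. This is a hypothetical simulation, not MIT's actual control code; the unit length and the `tip_position` function are assumptions made purely for illustration:

```python
import math

# Hypothetical illustration of the growing-tip mechanism: each chain
# unit leaving the gearbox is locked to its neighbor at a chosen
# relative angle. The sequence of locked angles determines the shape
# of the rigid appendage, so the tip position follows by dead reckoning.

UNIT_LENGTH = 0.05  # metres per chain unit (assumed value)

def tip_position(lock_angles_deg):
    """Return (x, y) of the appendage tip for a sequence of lock angles.

    Each entry is the relative bend, in degrees, locked between one
    unit and the next as it is fed out of the growing tip.
    """
    x = y = 0.0
    heading = 0.0  # radians; appendage initially extends along +x
    for bend in lock_angles_deg:
        heading += math.radians(bend)
        x += UNIT_LENGTH * math.cos(heading)
        y += UNIT_LENGTH * math.sin(heading)
    return x, y

# Grow straight for four units, then lock a 90-degree turn and grow two more;
# the tip lands near (0.2, 0.1).
print(tip_position([0, 0, 0, 0, 90, 0]))
```

In this toy model, "retracting and growing again to a different shape" is simply feeding out the same units with a different angle sequence, which mirrors how the robot reconfigures between tasks.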

Achieving new functionality

This design and movement allows the robot to function with a combination of flexibility and strength, which is what is needed for some tasks that robots currently aren’t designed to perform, Asada said. “Think about changing the oil in your car,” said Asada. “After you open the engine roof, you have to be flexible enough to make sharp turns, left and right, to get to the oil filter, and then you have to be strong enough to twist the oil filter cap to remove it.”

The team designed the robot to perform such tasks, Yan said. “It can grow, retract, and grow again to a different shape, to adapt to its environment,” he said.


Researchers presented their work recently at the IEEE International Conference on Intelligent Robots and Systems in Macau. They envision myriad uses for the machine by mounting grippers, cameras, and other sensors onto the robot’s gearbox so it can make repairs in tight spaces on airplanes or find objects in a warehouse on a high shelf without disturbing other objects.

Elizabeth Montalbano is a freelance writer who has written about technology and culture for more than 20 years. She has lived and worked as a professional journalist in Phoenix, San Francisco and New York City. In her free time she enjoys surfing, traveling, music, yoga and cooking. She currently resides in a village on the southwest coast of Portugal.


Albis Plastic develops compounds for fuel-cell applications

Tue, 2019-12-03 14:04

Albis Plastic (Hamburg, Germany) announced the development of a plastic solution for fuel-cell applications, which is currently being validated in projects with well-known OEMs. The validation process includes Albis’ technical compounds Altech, Alfater SL TPV, Tedur L PPS and Alcom, all of which can be adapted to customer-specific requirements.

Battery-powered cars are currently being introduced to the market on a large scale, such as the VW ID, Audi e-tron, BMW i3, Opel Ampera-e and Mercedes EQC models. Albis stated that it has no doubt that CO2 emissions can be reduced while driving these vehicles, provided the energy comes from renewable sources.

However, this technology poses a number of challenges that need to be addressed, added the company, including the procurement of resources, the maximum range per charge and the associated charging times.

Fuel-cell systems require the use of numerous materials, including metals, plastics and sealing materials, in the fuel-cell core itself as well as the hydrogen, oxygen, air supply and cooling circuit. Image courtesy Albis Plastic.

“Hybrid solutions that combine battery and fuel cells are a promising solution here,” said Ian Mills, a member of the Albis Management Board and head of the Compounding business.

Fuel-cell systems require the use of numerous materials, including metals, plastics and sealing materials. These are used both for the fuel cell core itself, the so-called “stack,” and the hydrogen, oxygen, air supply and cooling circuit. They are also used in components such as pumps, valves, compressors, pipes and connectors.

Pollutants, such as volatile components or ions, can contribute to the degradation of the fuel cell through emissions and, thus, reduce its service life and performance by changing the surfaces of the “bipolar plates,” for example. These volatile components can migrate from the materials used in the individual assemblies of the fuel cell.

“The production of a fuel-cell system from completely emission-free components is almost impossible because of the large number of individual parts and attachments,” explained Thies Wrobel, Business Development Manager—Automotive. “Therefore, the materials used must be carefully examined for emissions.”

Another important factor is production of the materials in a consistent, reproducible process using the same raw materials in a clean production environment. Given these considerations and in cooperation with OEMs, Albis has developed materials that have been tested in cooling and air supply systems.

The materials include polypropylene compounds from the Altech PP portfolio with 20%, 40% and 50% glass fibers; PPS compounds from the Tedur L portfolio with 30% and 40% glass fibers plus 15% PTFE (for bearing applications); and Alfater TPV, a peroxidically cross-linked thermoplastic vulcanizate with properties comparable to elastomer/rubber in Shore A 60 and 70 hardness (for sealing applications).

Additional compounds will be tested in the future at Albis’ laboratory on a specially installed test rig.