Although
not a new concept, networked embedded systems are being used in novel
ways in a variety of research initiatives. From civil structures
and defense systems to environmental and health monitoring, networked
embedded systems represent the next generation of computing and communications
technologies, in which individual sensors react to, communicate
with, organize, and maintain themselves in relation to each other,
to the entire system, and to the environment in which the system
is placed.
The American
poet James Russell Lowell once said, “There is no good arguing
with the inevitable. The only argument available with an east wind is
to put on your overcoat.” Change is inevitable, but it can also
be exciting. One of the most exciting changes occurring today is the
proliferation of embedded systems and the development of large-scale
distributed systems which include real-time routing, independent data
collection, and autonomous behavior.
“There are two very important notions about embedded systems,” says Panos
J. Antsaklis, the H.C. and E.A. Brosey Professor of Electrical Engineering. “Most
obvious is the fact that they are embedded. You cannot access an embedded system
and change its programming as easily as you could that of a computer. As important
is that the mission of these little computers -- because that’s what a
microprocessor is, and embedded systems are made up of microprocessors --
is not to ‘compute.’ It is to improve the function of the device
in which it is embedded.”
Perhaps the most widely publicized embedded system in consumer products today
is the OnStar® service, which is available in a variety of new cars, trucks,
and recreational vehicles. OnStar tracks vehicles and assists drivers as needed
in real time, providing services such as air bag deployment notification, roadside
assistance, stolen vehicle tracking, remote door unlock, and remote diagnostics.
But it is just one example of an embedded system.
Embedded systems are prevalent in households around the world. Washing machines,
dryers, microwaves, and cell phones all feature embedded systems. They were developed
by engineers who embraced the change that has been steadily progressing since
Jack Kilby and Robert Noyce first introduced the microchip in the late 1950s.
In essence OnStar and other embedded systems take a microprocessor, originally
used to analyze data or interact with a desktop user according to a series of
commands, and instead program it to interact with the real world. The benefits
of using embedded systems in consumer products are obvious; they raise the quality
of life by making products more functional and more efficient.
Equally positive are the benefits derived when embedded systems technology
is applied to a variety of research projects, such as the work being accomplished
at the University of Notre Dame. Following Lowell’s analogy, faculty in
the College of Engineering are not “putting on their overcoats” in
an effort to shield themselves from the change but to embrace it. Donning their
boots and hats and running headlong into the “east wind,” they are
leading the way in developing networked embedded systems for research in disciplines
not previously employing this type of sophisticated technology.
For example,
as part of a National Science Foundation study, Ahsan
Kareem, the Robert Moran Professor of Civil Engineering and Geological
Sciences, and Rooney Family Assistant Professor Tracy
Kijewski-Correa are using networked embedded devices to monitor
the structural performance of several tall buildings in Chicago. They
are working with Skidmore, Owings & Merrill LLP (SOM), one of the
world’s premier architecture and engineering firms and the company
responsible for the design of structures such as the Sears Tower, the
Lever House in New York City, and the Bank of America World Headquarters
in San Francisco. Another partner in the study is the Boundary Layer
Wind Tunnel Laboratory of the University of Western Ontario, a world
leader in commercial wind tunnel testing.
“We’ve been interested in how wind affects the performance of tall
buildings for several years,” says Kareem. “This particular study
focuses on some of the signature structures in the world, which were designed
and built at a time when scale-model testing and computer modeling were not as
advanced as they are today. We want to determine if the structures are behaving
in the manner for which they were designed.”
Questions Kareem and the research team, known as Team Chicago, are
asking include:
Were the procedures used at the time of the structures’ design representative
of realistic loadings and response? Are the structures performing as expected?
And, if they are not, how does that impact design criteria for the next generation
of urban structures?
Modeling technologies have changed over the years, but cityscapes have also changed.
The urban landscape of Chicago, for instance, is much more developed than it
was a few decades ago, when buildings like the Sears Tower and the Aon (Amoco)
Building were designed. Thus, the wind travels through cities and buffets buildings
in a much different manner than it did in the early 1970s.
Kareem
and Kijewski-Correa are using traditional monitoring devices, such as anemometers
and accelerometers, in conjunction with cutting-edge technology such as the Leica
MC500 Global Positioning System (GPS) with Real-Time Kinematic capability. Four
accelerometers have been mounted in pairs in opposite corners on the highest
floor of each building in the study. This positioning enables detection of a
building’s motion along its two lateral perpendicular axes, as well as
twisting movements.
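The pairing described above lends itself to a simple calculation: the average of the two corner readings gives the floor's lateral acceleration, while their difference, divided by the distance between the sensors, gives its torsional (twisting) acceleration. A minimal sketch of that decomposition, with invented readings and spacing rather than data from the study:

```python
# Hypothetical sketch: recovering translational and torsional motion
# from a pair of accelerometers mounted at opposite corners of a floor.
# Readings and sensor spacing are illustrative, not from the Chicago study.

def decompose_pair(a_corner1, a_corner2, spacing_m):
    """Average of the pair gives translational acceleration along the
    measurement axis; their difference divided by the sensor spacing
    gives angular (torsional) acceleration about the vertical axis."""
    translational = (a_corner1 + a_corner2) / 2.0     # m/s^2
    torsional = (a_corner2 - a_corner1) / spacing_m   # rad/s^2
    return translational, torsional

# Example: corners 40 m apart; unequal readings imply some twist.
trans, tors = decompose_pair(0.012, 0.018, 40.0)
print(trans)  # ≈ 0.015 m/s^2 of lateral motion
print(tors)   # ≈ 0.00015 rad/s^2 of twist
```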
“We use high-precision servo-force balance accelerometers,” says
Kijewski-Correa, “because these buildings move at very low amplitudes and
with long periods. It’s not like measuring a seismic event, where you see
much larger levels of motion.”
According to Kareem, stand-alone implementation of this technology does not provide
sufficient accuracy to monitor building displacements as indicators of performance.
In order to make corrections for atmospheric conditions that affect the GPS signal,
he and Kijewski-Correa use a low-rise structure in the city as a base station.
This differential monitoring reduces errors to as little as five millimeters.
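The differential idea can be illustrated in rough form: because the base station's true position is precisely surveyed, the difference between its GPS fix and that known position estimates the shared atmospheric error at each instant, which is then subtracted from the rooftop receiver's fix. The coordinates below are invented for illustration, and real RTK processing works on carrier-phase measurements and is far more involved:

```python
# Hypothetical sketch of differential GPS correction: a base station at
# a surveyed location observes the instantaneous GPS error, which is
# subtracted from the rooftop (rover) fix taken at the same time.
# Atmospheric errors are strongly correlated over the short baseline
# between the two receivers, so most of the error cancels.
# All coordinates are illustrative.

def differential_correct(rover_fix, base_fix, base_surveyed):
    """Subtract the base station's instantaneous error (measured fix
    minus surveyed position) from the rover fix, coordinate by coordinate."""
    error = tuple(m - s for m, s in zip(base_fix, base_surveyed))
    return tuple(r - e for r, e in zip(rover_fix, error))

# Base reads 0.9 m east / 0.4 m north of its known position, so the
# same bias is removed from the rooftop antenna's fix.
corrected = differential_correct(
    rover_fix=(445120.3, 4636018.7),
    base_fix=(444890.9, 4635500.4),
    base_surveyed=(444890.0, 4635500.0),
)
print(corrected)  # ≈ (445119.4, 4636018.3)
```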
Using this measurement protocol, the Notre Dame team can monitor a building’s
movements every one-tenth of a second. (A real-time feed of the data can also
be used by owners in the daily management of the buildings in the study, including
the operation of elevator systems and
skydecks.)
“What’s important to remember is that even before we installed the hardware
and began collecting data, we spent two years calibrating the equipment in relation
to the GPS system,” says Kareem. “Because of this, we are very confident
in our data.”
Information from the sensors is transmitted to a communications hub in the SOM
building in Chicago and then relayed, via the Internet, to Notre Dame, where
it is archived in a web-assisted database and analyzed. Scale models of the structures
and the surrounding built environment are then developed in the Boundary Layer
Wind Tunnel to compare the predicted response to full-scale data.
“In essence we’re tracking the vital signs of individual structures
in order
to give us a better indication of in-situ building performance,” says Kijewski-Correa. “By
using conventional and advanced sensors, Notre Dame is taking the lead in the
integrated monitoring of tall structures. We are not designing the sensors themselves,
but we have adapted and prototyped a networked configuration of these devices
for capturing signals peculiar to long-period civil structures. Our findings
could directly impact the architectural and structural communities for years
to come.”
The Adaptive-Optic Challenge
Aero-optics is the study of the interaction of light with a turbulent
flow. The light could emanate from distant space objects or celestial
bodies, or it could be a laser beam. In general, the interaction
of these optical signals with turbulent air has a degrading effect,
which is why stars appear to twinkle. This effect is particularly
devastating to the quality of a laser beam projected from an aircraft,
where the thin layer of turbulent air surrounding the aircraft can
reduce the intensity of a laser focused on a distant target to less
than 1 percent of its undistorted value.
Airborne imaging faces a similar challenge;
for example, an airborne camera might be able to image a vehicle
from 60,000 ft. with sufficient resolution to identify it as a car,
but it may not be able to read the license plate. Using high-speed
wavefront sensors developed at Notre Dame; multiple dedicated, embedded
processors; deformable mirror technology; and the Notre Dame Shear-Layer
Facility, a team of researchers led by Eric J. Jumper, professor
of aerospace and mechanical engineering, is preparing to measure
the distortion of the laser beam, develop the conjugate of the distortion,
adjust a deformable mirror -- which will be part of the embedded
system -- and restore the laser beam’s quality
by bending the mirror up to 15,000 times per second. In short,
the team is developing the technology that will allow an aircraft
flying at high Mach numbers to project correctly configured laser
beams, a feat thought impossible only a few years ago. “This
is a very dynamic process,” says Jumper, “so a traditional
approach to an adaptive-optic correction was not feasible. We have
incorporated flow control, high-frequency non-real-time wavefront
sensing, and a new approach to controlling adaptive optics into
making this correction. We could not have achieved our successes
to date without embedded, dedicated processors. There are too many
calculations that need to be made in order to determine the mirror’s
configuration and compensate for the wavefront aberration efficiently
and effectively.”
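The conjugate correction Jumper describes can be sketched in miniature. Assuming the wavefront sensor reports optical path differences (OPD) across the aperture, the mirror is commanded to the negative of half that error (a reflection changes the optical path by twice the surface displacement), so aberration plus correction sums to a flat wavefront. All values here are illustrative, not measurements from the Notre Dame facility:

```python
# Hypothetical sketch of conjugate wavefront correction: negate and
# halve the sensed optical path difference (reflection doubles the
# path change) to command a deformable mirror. Values are illustrative.

def mirror_commands(measured_opd_microns):
    """Per-actuator surface displacements (microns) that apply the
    conjugate of the measured wavefront error."""
    return [-opd / 2.0 for opd in measured_opd_microns]

def residual(measured_opd_microns, commands):
    """Wavefront after reflection: each mirror displacement changes
    the optical path by twice its amount."""
    return [opd + 2.0 * c for opd, c in zip(measured_opd_microns, commands)]

opd = [0.30, -0.12, 0.05, -0.40]   # sensed distortion across aperture, microns
cmds = mirror_commands(opd)
print(residual(opd, cmds))         # corrected wavefront is flat: [0.0, 0.0, 0.0, 0.0]
```

In the real system this loop must close up to 15,000 times per second, which is why the calculations are parceled out to dedicated embedded processors.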
According to Martin
Haenggi, assistant professor of electrical engineering, networked
embedded systems can also be placed in natural environments, enabling
researchers to observe any kind of habitat at a scale and level of
detail never before possible. Haenggi and a
team of researchers from throughout the College of Engineering are
developing an embedded sensor network for monitoring the hydrology
and ecology of freshwater lakes and streams: the Naiades project.
Named for the nymphs of rivers, lakes, and streams of Greek mythology, Naiades
represents what will be a five-year collaborative effort between researchers
in the Department of Electrical Engineering and the University’s Center
for Environmental Science and Technology (CEST), including team leaders Patricia
A. Maurice, professor of civil engineering and geological sciences and
director
of CEST, and Michael D. Lemmon, associate professor
of electrical engineering. Other faculty currently involved in the project are
Antsaklis; Haenggi; Sharon
Hu, associate professor of computer science and engineering; J.
Nicholas Laneman, assistant professor of electrical engineering; Agnes
E. Ostafin, assistant professor
of chemical and biomolecular
engineering; Jeffrey W. Talley, assistant professor
of civil engineering and geological sciences; and George Hornberger, the Ernest
H. Ern Professor of Environmental
Sciences at the University of Virginia.
“The Naiades project,” says Maurice, “has the potential to
greatly enhance our knowledge of the hydrologic cycle, water quality, pollution,
the
potential effects of microorganisms, and even biological warfare. It’s
an innovative solution to building better environmental models so we can better
understand our world and what impacts it.”
Current technology dictates that a researcher seeking to understand the physicochemical
reactions that occur in a lake or stream has to either collect samples -- physically
go to the lake or stream, gather water, and take it back to a lab for testing
-- or set up a commercial sensor in the water to record variables such as
pH or conductivity. The trouble has been that the real world involves a variety
of spatial and temporal scales not addressed by these testing methods. Although
researchers gather samples under a variety of conditions, they do not normally
collect data during sub-zero temperatures or thunderstorms. In addition, even
the most accurate commercial sensors have been limited in the number of samples
or amount of information they could record or process.
Naiades will
differ from current technologies in two very important ways. First, the system
will be an internet of controller area networks connected through wireless gateways
that link simple sensors -- measuring things like temperature, conductivity,
turbidity, flow, and ambient light -- to bacterial sensors and bulk water samplers,
which will measure major cations, anions, metals, and pesticides. Second, the
system will feature underwater nodes and surface base stations, each with an
embedded computer. The wireless ad hoc network formed by the base stations, the
Naiades subnet, will be able to automatically reconfigure routing pathways based
upon the local analysis performed by the sensors, individually and collectively.
Information gathered by the system could be used for immediate needs, such as
issuing alerts to the appropriate agencies of increased E. coli levels in beach
areas or for long-term research projects. Field tests, scheduled to begin in
year three of the project, will focus on detecting, forecasting, and monitoring
storm events and diel (day/night) fluctuations.
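A toy sketch of the node-level analysis described above: each reading is screened against alert thresholds before being forwarded, so that an exceedance (such as an elevated E. coli count at a beach) can trigger an immediate alert while routine data flow into the archive. The field names and threshold values here are invented for illustration and are not from the Naiades design:

```python
# Hypothetical sketch of local analysis on a sensor node: compare each
# measurement against a configured alert threshold before forwarding.
# Field names and limits are invented for illustration.

ALERT_THRESHOLDS = {
    "e_coli_cfu_per_100ml": 235,   # illustrative swim-advisory level
    "turbidity_ntu": 50,
}

def analyze_reading(reading):
    """Return a list of (quantity, value, threshold) tuples for any
    measurement exceeding its configured threshold."""
    return [
        (key, reading[key], limit)
        for key, limit in ALERT_THRESHOLDS.items()
        if key in reading and reading[key] > limit
    ]

sample = {"e_coli_cfu_per_100ml": 410, "turbidity_ntu": 12, "ph": 7.8}
print(analyze_reading(sample))   # only the E. coli level trips an alert
```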
The Naiades project also includes several educational objectives. A learning
module will be developed for the University’s first-year engineering course
sequence. Information from the project will also be incorporated into the curriculum
of CE 498/598: Introduction to Environmental Engineering and Science and
a graduate course on water-rock interactions. Graduate students involved in the
project will participate in a one-credit-hour interdisciplinary special topics
course to be taught by project faculty. And, an interdisciplinary workshop on
environmental sensors
will be held on campus during the final year of the project.
Perhaps one of the most attractive elements of this interdisciplinary effort
is that researchers will not have to travel far to find a natural laboratory
in which to test the system they are creating. The Naiades system will first
be tested in the two lakes on campus, St. Mary’s and St. Joseph’s,
in order to develop accurate predictive models of algal blooms, an important
environmental problem that would benefit from the high-resolution, real-time
data collection offered via the Naiades system.
Throughout the course of their studies, undergraduates
in the Department of Aerospace and Mechanical Engineering learn how
to design aircraft. As important, they learn how to design and build
a series of microcontrollers — tiny embedded systems operated
by rechargeable batteries — that feature a global positioning
system, accelerometers, pressure transducers, thermocouples, an analog-to-digital
converter, and a transmitter. The purpose of designing these microcontrollers
is two-fold: to introduce undergraduates to the interdisciplinary
nature of engineering today via the building block of all mechatronic
systems and to address real-world applications. This is particularly
important, says graduate student Thomas R. Szarek, “because
digital processors are finding their way into more and more, and
smaller and smaller, technologies.”
According to Thomas C. Corke, the Clark Equipment Professor
of Aerospace and Mechanical Engineering, there is an increasing need for remote
controlled aircraft, particularly for data collection. “The obvious need
is a military one for reconnaissance and tracking, such as the drone planes that
flew over Iraq. By using remote piloted aircraft for these types of missions,
human lives were not put at risk,” says Corke.
“But there’s also a lot of interest in using these autonomous aircraft
as environmental monitors,” he says. In fact, one of Corke’s students
is conversing with the forestry service in Florida about the possibility of using
a remote piloted plane to follow migratory animals. The embedded system in such
a vehicle could trace the paths of animals tagged with radio transmitters, but
it could also track them visually via an embedded pattern recognition program.
In addition, these aircraft could be used to measure air and water quality. And,
using infrared sensors, they could monitor thermal pollution. “The idea,” says
Corke, “is that all the information is gathered by the embedded system
and then transmitted to a receiver on the ground. It’s less expensive than
sending up manned flights, and, because of that, it would be possible to operate
more aircraft, cover larger areas, and collect more data.”
Thomas R. Szarek, a graduate student in the Department
of Aerospace and Mechanical Engineering, loads a student-designed
microcontroller-based system featuring two sensors into a model rocket.
Using the microcontroller, undergraduates in the department can measure
the acceleration and velocity of the rocket as it is launched and
determine its final height. Szarek, working with Professor Patrick
F. Dunn and Thomas C. Corke, the Clark Equipment Professor of Aerospace
and Mechanical Engineering, has developed the rocket project in order
to focus on the use of embedded systems for data acquisition. Undergraduates
build on this project and the concept of using microcontroller-based
systems throughout their studies with an effort culminating in AME
441: Senior Design, when they design and build an airplane and program
it to fly autonomously.
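The data reduction the caption describes can be sketched as a pair of numerical integrations: sampled net vertical acceleration is integrated once for velocity and again for altitude. A minimal trapezoid-rule sketch with invented sample values, not actual flight data:

```python
# Hypothetical sketch of rocket data reduction: integrate sampled net
# vertical acceleration (gravity already removed) once for velocity
# and again for altitude. Sample values are invented.

def integrate(samples, dt):
    """Cumulative trapezoid-rule integral of evenly spaced samples,
    starting from zero."""
    out = [0.0]
    for a, b in zip(samples, samples[1:]):
        out.append(out[-1] + 0.5 * (a + b) * dt)
    return out

dt = 0.01                               # 100 Hz sampling, seconds
accel = [50.0, 50.0, 50.0, 0.0, 0.0]    # net vertical acceleration, m/s^2
velocity = integrate(accel, dt)         # m/s at each sample
altitude = integrate(velocity, dt)      # m at each sample
print(velocity[-1], altitude[-1])       # ≈ 1.25 m/s, ≈ 0.034 m for this toy trace
```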
Unfortunately, the solution
-- using embedded systems to better monitor the real world -- is not
as straightforward as it seems. “Embedded processors and their
proliferating use, as in the development of the Naiades project,” says
Antsaklis, “is driven by the fact that we are able to cheaply manufacture
these devices. But, you cannot simply set out a group of processors and
expect them to act together in a coherent fashion in relationship to
the real world. It simply will not happen without a tremendous amount
of planning and a detailed
understanding of hybrid dynamical systems.”
According to Antsaklis, when a system is distributed, so is the information.
No single node contains all the information, and no single node acts as the command
center. “The traditional notions of communications are challenged,” he
says. “One of the first considerations in a network is to establish a path
along which the nodes communicate. In addition, there is a lot of protocol software
that needs to be written or refined to ensure that the processors are synchronized
with one another and with the real world. And, finally, because they are out
in the real world ... some of them miles away from a power source ... they need
to be able to operate on small batteries or solar power.” These are some
of the issues being addressed by the Naiades team.
When the researchers succeed in developing these intelligent sensors and flexible
embedded systems, they will have made a quantum leap in environmental monitoring.
This knowledge can then be applied to defense systems, to health monitoring,
to the coordination of satellites or traffic systems ... the list is endless.
But change is inevitable. The novel ways University researchers are employing
networked embedded systems to collect data will usher in improvements to the
way skyscrapers are designed, aircraft are built, and the environment is monitored.
These changes may not inspire a 21st-century Sandburg or Thoreau to wax eloquent
about the nodes, motes, or actuators, but the changes made possible by the
information gained will help build a better world.
For more information on networked embedded systems
technology and research in
the College of Engineering, visit:
Center for Environmental Science and Technology
http://www.nd.edu/~cest/
Department of Aerospace and Mechanical Engineering
http://www.nd.edu/~ame/
Department of Electrical Engineering
http://xml.ee.nd.edu
NATHAZ Modeling Laboratory
http://www.nd.edu/~nathaz/
http://windycity.ce.nd.edu/