
At the UW, our sci-fi future has arrived

Current projects in ECE show that some sci-fi literary fantasies will soon be reality.


UW-led philosophy team receives $1.5M grant to study the ethics of neurotechnology research

Do brain-computer interfaces affect whether patients feel they are in charge of their own actions?


Professor Fazel to lead NSF TRIPODS+X in data science

Project will enhance the successful "hack week" model


Scientists engineer a functional optical lens out of 2D materials

Lenses are as thin as 190 nanometers — less than 1/100,000th of an inch thick


Repairing damaged nerves

Professor Chet Moritz is leading an NSF-funded study to reconnect severed nerves


ECE Professor Brian Johnson leads Department of Energy-funded research

The multi-institutional team includes UW ECE professor Daniel Kirschen


News + Events

https://www.ece.uw.edu/spotlight/at-the-uw-our-sci-fi-future-has-arrived/
https://www.ece.uw.edu/spotlight/ostendorf-wins-2018-flanagan-award/
https://www.ece.uw.edu/spotlight/uw-led-philosophy-team-receives-1-5m-grant-to-study-the-ethics-of-neurotechnology-research/
https://www.ece.uw.edu/spotlight/fazels-research-in-data-science-helps-secure-nsf-tripodsx-grants/
https://www.ece.uw.edu/spotlight/scientists-engineer-a-functional-optical-lens-out-of-2d-materials/
https://www.ece.uw.edu/spotlight/repairing-damaged-nerves/
At the UW, our sci-fi future has arrived

By Hannelore Sudermann, from Columns

Late this summer a group of astronomers from around the country, including assistant professor Rory Barnes, discovered what could be Vulcan, Mr. Spock’s home planet.

It is right where “Star Trek” creator Gene Roddenberry said it could be—in a solar system surrounding 40 Eridani A, a star 16 light years across the final frontier. “I hadn’t even realized the planet might be Vulcan until someone brought it up after the paper about the planet’s discovery was published,” says Barnes, who was part of a team working on the Dharma Planet Survey to detect potentially habitable super-Earth planets in other solar systems.

It’s findings like this, as well as the fast-changing and increasing role of technology in our contemporary lives—from smartphones to personal drones—that can make us feel like we’re living in a science-fictional future. A sci-fi world is no longer something we have to imagine—it’s as close as a stroll on a UW campus. In the Forestry Building, a scientist is trying to figure out how to grow broccoli on Mars. Just down the road, a team is storing massive amounts of data in molecules of DNA. And over in Mechanical Engineering, students on the UW Hyperloop team are building a pod that can travel several hundred miles per hour in a vacuum tube.

In the Paul G. Allen Center for Computer Science & Engineering, robotics professor Siddhartha Srinivasa and his lab are working on a Home Exploring Robot Butler (HERB). The service robot, which can perform a range of chores, has already been featured in National Geographic and Wired. Now HERB is bringing the worlds of science and sci-fi even closer with a recent appearance on an episode of “The X-Files.”

In a nearby classroom, Howard Chizeck, an electrical engineering professor whose research includes electronically stimulating the human brain to manage movement disorders, is driving the focus of a freshman class straight to the crossroads of fiction and science. Titled “Indistinguishable from Magic: New Technologies, Science Fiction, and Us,” the course covers writers like Isaac Asimov and Alice B. Sheldon as well as current technology and the potential cultural changes it may trigger.
Chizeck sets the table for a feast of science that owes a debt to science fiction.

Lots and lots of scientists and engineers read science fiction and if they see it and like it, they try to do it.

HOWARD CHIZECK, ELECTRICAL ENGINEERING PROFESSOR

“This is not a normal course,” Chizeck tells the room of new students. “This is an electrical engineering course, and I will offer some of that. But it is also a science fiction course.” He goes on to explain that most of us now use amazingly powerful electronic objects, yet we have little understanding of how they work or how they’ve changed us. How do science fiction, technology and society influence and impact one another, he asks the students. What are the profound ramifications of the technologies we’re embracing?

Having been in computer science since the beginning of computers, Chizeck has had a close-up view of how technology has evolved and how it has, in turn, changed society. He has also kept an eye on how fiction writers have responded, followed and sometimes led. “When writers are writing science fiction, they’re writing for the society they’re in,” he says. “At the same time, the science fiction they write has changed society.”

That first class touches on the birth of the internet as well as the invention of the first cellphone in 1973 and its evolution to a product for everyone by the 1990s. They also talk about Bisphenol A (BPA) plastics used in water bottles and food cans being linked to infertility. Chizeck then points to the prescient, futuristic focus of the 1985 book “The Handmaid’s Tale,” which is built around a fundamentalist government, fertility challenges, and women being treated as property of the state. He circles that point around to the current TV show based on the book and the handmaids’ costumes worn by protesters wanting to draw attention to the government’s neglect and abuse of women’s rights.

“Good science fiction has an understanding of the real,” he tells the class. It also helps us imagine the future so we can explore the benefits and the harm that our inventions and discoveries can bring. Fiction lives within our culture, he says. For example, writer Philip K. Dick (“Blade Runner” and “Total Recall”) imagined autonomous vehicles, virtual reality, and insects outfitted with sensors. “It was like he could see into the future,” says Chizeck. “Lots and lots of scientists and engineers read science fiction and if they see it and like it, they try to do it,” he says. He points to the now-classic Motorola flip phone. “That’s based on the communicator straight out of ‘Star Trek.’”

A few weeks after that first class, Chizeck invites Hugo and Nebula award-winning sci-fi writer Nancy Kress to speak to the class. She is one of a group of stellar sci-fi writers, including Octavia Butler, William Gibson, Ted Chiang, Cat Rambo and Ursula K. Le Guin, who made the Northwest their home. Perched on a table at the front of the room, Kress spells out a range of ideas, from machine learning, genetic engineering and conversational AI to fire-resistant wallpaper with thermal-sensor nanowires that allow it to serve as a fire alarm. “Any technology is a tool,” Kress tells the students. “It is going to have good outcomes and bad outcomes.” What are some of the downsides? The students are ready with answers: people losing their jobs to machines, plastic guns that can now be made by anyone with plans and a 3D printer, loss of privacy and hacking.

Kress is one of a group of sci-fi writers invited to Microsoft’s research labs to see the projects and then write science fiction stories featuring technology we may be using in the near future. Her story “Machine Learning” plays with a human working with holographic projections (think Princess Leia in “Star Wars”). The talk with the class then turns to her short story “Nano Comes to Clifford Falls,” the tale of a small town and the arrival of nanomachines that make whatever anyone desires—food, clothes, even cars and homes. The story is told from the point of view of a skeptical single mother reluctant to bring the “made” products into her home. Because the tech brings them nearly everything, most of the townspeople stop working, the community stops functioning and ultimately, it collapses. “I wanted to focus on the human element,” Kress says. “What can go wrong, what can go right?”

The uses of technology might not be exactly what the designers and engineers were imagining. As William Gibson explains, “The street finds its own uses for things—uses the manufacturers never even imagined.” Facebook, for example, was not built for Russian bots, says Kress.

It’s an incredibly hard problem, but I’d like the robot to be able to pick up a zucchini as well as twirl spaghetti.

SIDDHARTHA SRINIVASA, ROBOTICS PROFESSOR

While sci-fi writers are inspired by the possibilities that come with new scientific discoveries and technological inventions, inventors, scientists and engineers—including many at the UW—are inspired by those stories.

“I grew up reading science fiction,” says Srinivasa, a nationally known robotics expert who moved his entire lab from Carnegie Mellon University to the UW in 2017 and who just this fall joined Amazon as the company’s robotics director. “When I was six years old, I spent the summer in my grandparents’ house. My granddad’s bookshelf had Ray Bradbury’s ‘The Martian Chronicles.’ While I struggled with a lot of the words, it was the most fascinating book that I had ever read. Every year I would go back and pull it out and read it again. I still read it.”

That 1950 book of short stories focuses on the efforts of humans to reach and colonize Mars, which is already home to another race of beings. Each time Srinivasa reads it, he finds more meaning and new fodder for his imagination. “That’s a hallmark of great science fiction,” he says. “It envisions a future and challenges us into thinking how to achieve that. Everything that I have done has really stemmed from that day I discovered the book. I think science fiction has played a really integral part in robotics and computer science.”

Different cultures have accepted robots and technology in different ways. “Japan, for example—after the Second World War, Astro Boy was this superhuman robot boy. He came out and did great things. I think perception of robots and technology in Japan has been forever colored by that cartoon.” By contrast, he notes, in the United States many of us grew up watching “Terminator” (where a cyborg assassin comes from the future to kill). “Here, the future is much more dystopian than pleasant,” Srinivasa says. “I think the truth of our future lies somewhere in between.”

Srinivasa has been building robots for 19 years. Early on he started thinking about how and where he would want his robots to work. If it’s on a factory floor, the robot can do its task in isolation. “In that setting it should just be efficient,” he says. “But if you want it to work around people, you can either treat people as obstacles that should be avoided or you can make it a collaborative approach.” The robots he’s building are to be programmed to respond to human behavior, predict people’s intentions and harmonize with their actions.

He offers the example of two people putting together a chair from Ikea. “It is a delicate social dance of deciding what goes where and who does what and what to pick up,” he says. “I want robots to participate in this dance, this discourse. Robots need to understand how humans work.” He helps them understand through mathematics—the mathematics of humans as dynamical systems. Latent feelings leak through our actions and interactions, he explains. These are easy things for humans to read, but very challenging for a robot. He’s working on mathematical models that will help a robot understand “the things we understand about others so intuitively. And that’s not easy.”

His goal is to develop robots for assistive care. “I want to build robots that can actually help people,” he says. Wanting to assist people with limited mobility, he and his students are refining a robot that helps you eat. “It’s an incredibly hard problem, but I’d like the robot to be able to pick up a zucchini as well as twirl spaghetti,” he says.

So how long until we have robots in our homes? “I have a few answers to that,” Srinivasa says. “We seem to think it’s nothing, nothing, nothing and suddenly we have Rosie the Robot [the robot maid from the 1960s animated sitcom ‘The Jetsons’] in our home. But really, we’ll have technologies along the way: the thermostat, the smart refrigerator, the vacuum cleaner. All these technologies are building the ecosystem for the robot. It will talk to your microwave. It will talk to your fridge. It’s not just one robot, but an ecosystem of robots.

“I think that oftentimes when we say we want to build Rosie the Robot, we forget why that robot was created. We focus on the technology and say it would be cool to have arms and eyes. But we should remember that robot was created for caregiving. We need to go back and think about why we create the technology. How can we build caregiving technology that actually gives people care?”

We venture down a few floors from his office to his lab, where dozens of computer screens stare out from tables that pack the room. At one end, a black-curtained area houses ADA (Assistive Dexterous Arm), a robotic arm designed to attach to a wheelchair and assist with tasks like eating. Nearby, a student reviews a video of ADA feeding celery to a person. The robot picks up the food item and holds it for the woman to eat. The woman laughs because ADA is holding the celery upright. In order to take a bite, she has to turn her head 45 degrees. It’s not ideal, but it’s helping ADA learn there might be a better way to hold the food.

HERB, another of Srinivasa’s robots, is not quite as tall as most human adults. He has two long multi-jointed arms with three-fingered hands and he rolls around on wheels. One day he could be unloading a dishwasher or cracking open an Oreo, and another he may be performing in some sci-fi movie or TV show. In March, the robot butler made an appearance on “The X-Files” as a worker at a sushi restaurant. The episode’s theme was technology turning on people. When Mulder decides not to leave a tip, the automated restaurant and the tech outside—smartphones, drones and even a home security system—retaliate.

Either we become aware that Earth is the only home for life in the universe or that there’s some galactic society out there and we probably want to be a part of it.

RORY BARNES, ASTRONOMY ASSISTANT PROFESSOR

As a kid in Tucson, where he discovered a love for the night sky, Barnes watched “Star Trek.” He found the show more entertaining than inspiring. He admits that though he has spent his academic life steeped in science, he isn’t much one for science fiction. “My tastes tended toward real science,” he admits. But he has so much in common with the show—which is all about exploring “strange, new worlds” and seeking out “new life.”

Barnes’ specialty is exoplanets—planets in faraway solar systems that might support life, long the subject of sci-fi writers like Edgar Rice Burroughs, whose Mars (also known as Barsoom) was home to a Martian race, and Frank Herbert (a Tacoma native who attended the UW for a time), whose “Dune” planet of Arrakis was covered in desert. Only for Barnes, life out there in the universe is now far more likely than those writers could have known.

Twenty-five years ago, we didn’t really know anything of planets outside our solar system, says Barnes. But then in 1995, the first exoplanet was discovered. It is now called a “hot Jupiter,” a gas giant with a short orbital period. “The planet, it turns out, was somewhat unusual. It was just that our technology finally allowed us to see it,” Barnes says. After that, the floodgates opened, and thousands of potential exoplanets—including the one in orbit around 40 Eridani A—have been identified.

“We still don’t know if there are really any habitable worlds out there at all,” Barnes says. “But as Isaac Asimov said, either way, the answer to the question ‘Are we alone?’ is staggering,” he says. “Either we become aware that Earth is the only home for life in the universe or that there’s some galactic society out there and we probably want to be a part of it.”

Our interview over, I left the Physics and Astronomy Building and started down a long flight of stairs to 15th Avenue N.E. I chanced to look up at the view in front of me. There was the Space Needle. It was a trick of perspective, but it appeared to hover over the north end of Capitol Hill. I descended a few more steps, and it landed, disappearing into the neighborhood.

Ostendorf wins 2018 Flanagan award

Professor Mari Ostendorf won the 2018 IEEE James L. Flanagan Speech and Audio Processing Award for her innovative research improving spoken language technology. "Dr. Ostendorf's extensive contributions to natural language processing and speech technology have been playing a major part in conquering the limitations of spoken language technology," said Radha Poovendran, professor and chair of electrical & computer engineering. "She is a pioneer in interactive conversational AI, a multi-billion dollar industry."

UW-led philosophy team receives $1.5M grant to study the ethics of neurotechnology research

By Sarah McQuate, UW News

[Photo: UW postdoc Ivana Milovanovic (left) works with Center for Neurotechnology Young Scholars Program participant Emily Boeschoten on a sensory device. Mark Stone/University of Washington]

Brain-computer interfaces have the potential to give patients better and more natural control over their prosthetic devices. Through this method, a chip in a patient’s brain picks up a thought — neural activity triggered by focusing on specific visual imagery — to move a joint, and then transmits that signal to the prosthetic. This technology is not widely available yet, but as it progresses through research trials, ethical questions are emerging about users’ sense of control over their own actions. For example: Who is responsible if a prosthetic limb malfunctions and strikes someone in a crowd — the patient or the device?

To address these types of questions, University of Washington researchers in the Center for Neurotechnology, including ECE professor Howard Chizeck, are studying how brain-computer interfaces affect whether patients feel they are in charge of their own actions. For this research, the team will receive $1.5 million from the National Institutes of Health over the next four years.
“Neuroscience offers a deeper understanding of the brain and gives us the prospect of new ways to treat diseases or affect how the brain functions,” said Sara Goering, a UW associate professor of philosophy and the team lead for the project. “Given how closely we associate the function of our brains with who we are as individuals, it is valuable to explore the implications of this research. Then people can better understand their options before enrolling in a study, and researchers can design devices that better suit users’ needs and values.”

[Photo: Tim Brown, a doctoral student in the UW’s philosophy department and a researcher involved in the Center for Neurotechnology project, is already embedded in the UW’s BioRobotics Lab. He studies autonomy issues that arise for people with Parkinson’s disease or essential tremor when they use deep brain stimulation to manage their symptoms. Mark Stone/University of Washington]

The team aims to examine multiple types of brain-computer interfaces that currently are being tested in clinical studies, not just those that control prosthetics. In using deep brain stimulation to treat Parkinson’s disease or depression, a patient might wonder: Is my action the result of something I did, or something the stimulator did? And with devices that help patients sense touch, a patient might ask: Is this interface correctly telling me how hard I am squeezing someone’s hand? By looking at how these devices affect the degree to which patients sense they are in control of their own actions and emotions, the researchers hope to develop tools that can help future patients feel better equipped to remain in control.

“We will start this process by ‘embedding’ ethicists within neural engineering labs that are studying different brain-computer interfaces,” said Dr. Eran Klein, an affiliate assistant professor of philosophy at the UW and an assistant professor of neurology at Oregon Health & Science University who is the co-leader on this project. “These ethicists will work side by side with the researchers in the lab, as well as interview both researchers and research participants about their perspectives on brain-computer interfaces.”

The grant will fund one or two new researchers who will work in different labs over the course of the project. Some of these labs are part of the Center for Neurotechnology, which is based at the UW, while others are at Massachusetts General Hospital, University of Freiburg in Germany, University of Utrecht in the Netherlands, Brown University and Caltech.

After compiling perspectives across all labs, the team will develop a series of questions to give to future patients who are considering enrolling in a study to receive a brain-computer interface. They can use these questions to better prepare for informed consent discussions with researchers, Goering said. “We’re hoping that the close attention we pay to users’ experiences operating brain-computer interfaces will help us understand how to help prospective users be informed about the tradeoffs they might be making,” she said.
“In addition, we also want to help researchers in the field think carefully about next-generation device design, so that this technology will maintain or enhance a user’s sense of control.”

Professor Fazel to lead NSF TRIPODS+X in data science

The National Science Foundation announced that it is awarding grants totaling $8.5 million to 19 collaborative projects at 23 universities for the study of complex and entrenched problems in data science. Three of these projects will be based at the University of Washington.

The grants build on a 2017 award in the Transdisciplinary Research in Principles of Data Science (TRIPODS) program, which led to the founding of the Algorithmic Foundations of Data Science Institute (ADSI) at the UW. ADSI focuses on developing theoretical and algorithmic tools bridging mathematics, computer science, and statistics to address contemporary data science challenges. The new grants make up the TRIPODS+X program, which expands these big-data projects into broader areas of science, engineering, and mathematics.
Electrical & computer engineering associate professor Maryam Fazel, who co-directs ADSI, is the lead researcher in “Foundational Training in Neuroscience and Geoscience via Hack Weeks.”

[Photo: Students at the NeuroHackademy.]

“Our training will be designed to bridge the gap between state-of-the-art foundational research in mathematical and algorithmic tools with scientific applications, which is much in the spirit of the TRIPODS+X program,” Fazel said.

Her TRIPODS+X project will enhance the successful “hack week” model: hack weeks blend elements of traditional lecture-style pedagogy and participant-driven projects. “In the new grant, we will leverage a new and exciting model for education and collaboration, namely the hack-weeks model, a format that UW’s own eScience Institute has done pioneering work on developing,” Fazel said. “The project is exciting because we get to connect our theory and algorithms to two important scientific domains, neuroscience and geoscience, which in recent years have had access to massive data sets and where data science tools are poised to help make breakthroughs.”

Fazel will lead a multi-disciplinary team of UW researchers on this project, which includes Ariel Rokem at the eScience Institute; Anthony Arendt at the Applied Physics Laboratory; Aleksandr Aravkin, assistant professor of applied mathematics; and Zaid Harchaoui, assistant professor of statistics and eScience Institute fellow.

Fazel is also a researcher on another TRIPODS+X project, “Safe Imitation Learning for Robotics,” which is led by Harchaoui. This project will focus on developing mathematically rigorous algorithms for imitation learning in robotics, a form of learning in which a robotic system learns through demonstration.

Scientists engineer a functional optical lens out of 2D materials

In optics, the era of glass lenses may be waning. In recent years, physicists and engineers have been designing, constructing and testing different types of ultrathin materials that could replace the thick glass lenses used today in cameras and imaging systems.

Critically, these engineered lenses — known as metalenses — are not made of glass. Instead, they consist of materials constructed at the nanoscale into arrays of columns or fin-like structures. These formations can interact with incoming light, directing it toward a single focal point for imaging purposes. But even though metalenses are much thinner than glass lenses, they still rely on “high aspect ratio” structures, in which the column or fin-like structures are much taller than they are wide, making them prone to collapsing and falling over. Furthermore, these structures have always been near the wavelength of light they’re interacting with in thickness — until now.

In a paper published Oct. 8 in the journal Nano Letters, a team from the University of Washington and the National Tsing Hua University in Taiwan announced that it has constructed functional metalenses that are one-tenth to one-half the thickness of the wavelengths of light that they focus.
Their metalenses, which were constructed out of layered 2D materials, were as thin as 190 nanometers — less than 1/100,000th of an inch thick.

[Photo: Four ultrathin metalenses developed by University of Washington researchers and visualized under a microscope.]

“This is the first time that someone has shown that it is possible to create a metalens out of 2D materials,” said senior and co-corresponding author Arka Majumdar, a UW assistant professor of physics and of electrical and computer engineering. Their design principles can be used for the creation of metalenses with more complex, tunable features, added Majumdar, who is also a faculty researcher with the UW’s Molecular Engineering & Sciences Institute and Institute for Nano-Engineered Systems.

Majumdar’s team has been studying the design principles of metalenses for years, and previously constructed metalenses for full-color imaging. But the challenge in this project was to overcome an inherent design limitation in metalenses: in order for a metalens material to interact with light and achieve optimal imaging quality, the material had to be roughly the same thickness as the light’s wavelength in that material. In mathematical terms, this restriction ensures that a full zero-to-two-pi phase shift range is achievable, which guarantees that any optical element can be designed. For example, a metalens for a 500-nanometer lightwave — which in the visual spectrum is green light — would need to be about 500 nanometers in thickness, though this thickness can decrease as the refractive index of the material increases.

Majumdar and his team were able to synthesize functional metalenses that were much thinner than this theoretical limit — one-tenth to one-half the wavelength. First, they constructed the metalens out of sheets of layered 2D materials.
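The conventional thickness limit described above lends itself to a quick numerical sketch. This is an illustrative back-of-the-envelope calculation only; the refractive index value used here is an assumption for the example, not a figure from the Nano Letters paper.

```python
def conventional_thickness_nm(wavelength_nm, refractive_index):
    """Approximate conventional metalens thickness limit.

    Phase accumulated through a slab is phi = 2*pi * n * t / lambda0,
    so a full 2*pi phase range needs t = lambda0 / n: roughly one
    wavelength as measured inside the material, shrinking as n grows.
    """
    return wavelength_nm / refractive_index

# Green light (500 nm) in a material with an assumed index n = 2.0
# (hypothetical value chosen for illustration):
print(conventional_thickness_nm(500, 2.0))  # 250.0

# Sanity check on the headline figure: 190 nm, expressed in inches
# (1 inch = 0.0254 m), is indeed below 1/100,000th of an inch.
print(190e-9 / 0.0254 < 1 / 100_000)  # True
```

Under this assumed index, the conventional limit for green light would be about 250 nanometers, so lenses as thin as 190 nanometers (one-tenth to one-half the free-space wavelength) sit well below it.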
The team used widely studied 2D materials such as hexagonal boron nitride and molybdenum disulfide. A single atomic layer of these materials provides a very small phase shift, unsuitable for efficient lensing. So the team used multiple layers to increase the thickness, although the thickness remained too small to reach a full two-pi phase shift.

“We had to start by figuring out what type of design would yield the best performance given the incomplete phase,” said co-author Jiajiu Zheng, a doctoral student in electrical and computer engineering. To make up for the shortfall, the team employed mathematical models that were originally formulated for liquid-crystal optics. These, in conjunction with the metalens structural elements, allowed the researchers to achieve high efficiency even if the whole phase shift is not covered.

They tested the metalens’ efficacy by using it to capture different test images, including of the Mona Lisa and a block letter W. The team also demonstrated how stretching the metalens could tune the focal length of the lens.

In addition to achieving a wholly new approach to metalens design at record-thin levels, the team believes that its experiments show the promise of making new devices for imaging and optics entirely out of 2D materials. “These results open up an entirely new platform for studying the properties of 2D materials, as well as constructing fully functional nanophotonic devices made entirely from these materials,” said Majumdar. Additionally, these materials can be easily transferred onto any substrate, including flexible materials, paving a way toward flexible photonics.

The lead and co-corresponding author on the paper is Chang-Hua Liu, who began this work as a UW postdoctoral researcher and is now a faculty member at the National Tsing Hua University in Taiwan.
Additional co-authors are doctoral students Shane Colburn, Taylor Fryett and Yueyang Chen in the Department of Electrical and Computer Engineering; and Xiaodong Xu, a UW professor of physics and of materials science and engineering. The team’s prototype metalenses were all built at the Washington Nanofabrication Facility, a National Nanotechnology Coordinated Infrastructure site on the UW campus. The research was funded by the U.S. Air Force Office of Scientific Research, the National Science Foundation, the Washington Research Foundation, the M.J. Murdock Charitable Trust, GCE Market, Class One Technologies and Google.

Thanks to the UW News Staff for this article.

Repairing damaged nerves

ECE professor Chet Moritz, whose research is based in the Restorative Technology Laboratory, is working to repair brain and spinal cord injuries. His research, funded by the National Science Foundation, has produced several breakthroughs in applying electrical stimulation to damaged nerves.
At the UW, our sci-fi future has arrived
By Hannelore Sudermann, from Columns

Late this summer a group of astronomers from around the country, including assistant professor Rory Barnes, discovered what could be Vulcan, Mr. Spock’s home planet.

It is right where “Star Trek” creator Gene Roddenberry said it could be—in a solar system surrounding 40 Eridani A, a star 16 light years across the final frontier. “I hadn’t even realized the planet might be Vulcan until someone brought it up after the paper about the planet’s discovery was published,” says Barnes, who was part of a team working on the Dharma Planet Survey to detect potentially habitable super-Earth planets in other solar systems.

It’s findings like this, as well as the fast-changing and increasing role of technology in our contemporary lives—from smartphones to personal drones—that can make us feel like we’re living in a science-fictional future. A sci-fi world is no longer something we have to imagine—it’s as close as a stroll on a UW campus. In the Forestry Building, a scientist is trying to figure out how to grow broccoli on Mars. Just down the road, a team is storing massive amounts of data in molecules of DNA. And over in Mechanical Engineering, students on the UW Hyperloop team are building a pod that can travel several hundred miles per hour in a vacuum tube.

In the Paul G. Allen Center for Computer Science & Engineering, robotics professor Siddhartha Srinivasa and his lab are working on a Home Exploring Robot Butler (HERB). The service robot, which can perform a range of chores, has already been featured in National Geographic and Wired. Now HERB is bringing the worlds of science and sci-fi even closer with a recent appearance on an episode of “The X-Files.”

In a nearby classroom, Howard Chizeck, an electrical engineering professor whose research includes electronically stimulating the human brain to manage movement disorders, is driving the focus of a freshman class straight to the crossroads of fiction and science. Titled “Indistinguishable from Magic: New Technologies, Science Fiction, and Us,” the course covers writers like Isaac Asimov and Alice B. Sheldon as well as current technology and the potential cultural changes it may trigger.
Chizeck sets the table for a feast of science that owes a debt to science fiction.

Lots and lots of scientists and engineers read science fiction and if they see it and like it, they try to do it.

HOWARD CHIZECK, ELECTRICAL ENGINEERING PROFESSOR

“This is not a normal course,” Chizeck tells the room of new students. “This is an electrical engineering course, and I will offer some of that. But it is also a science fiction course.” He goes on to explain that most of us now use amazingly powerful electronic objects, yet we have little understanding of how they work or how they’ve changed us. How do science fiction, technology and society influence one another, he asks the students. What are the profound ramifications of the technologies we’re embracing?

Having been in computer science since the beginning of computers, Chizeck has had a close-up view of how technology has evolved and how it has, in turn, changed society. He has also kept an eye on how fiction writers have responded, followed and sometimes led. “When writers are writing science fiction, they’re writing for the society they’re in,” he says. “At the same time, the science fiction they write has changed society.”

That first class touches on the birth of the internet as well as the invention of the first cellphone in 1973 and its evolution into a product for everyone by the 1990s. The students also talk about Bisphenol A (BPA) plastics, used in water bottles and food cans, being linked to infertility. Chizeck then points to the prescient, futuristic focus of the 1985 book “The Handmaid’s Tale,” which is built around a fundamentalist government, fertility challenges, and women being treated as property of the state. He circles that point around to the current TV show based on the book and the handmaids’ costumes worn by protesters wanting to draw attention to the government’s neglect and abuse of women’s rights.

“Good science fiction has an understanding of the real,” he tells the class. It also helps us imagine the future so we can explore the benefits and the harm that our inventions and discoveries can bring. Fiction lives within our culture, he says. For example, writer Philip K.
Dick (“Blade Runner” and “Total Recall”) imagined autonomous vehicles, virtual reality, and insects outfitted with sensors. “It was like he could see into the future,” says Chizeck. “Lots and lots of scientists and engineers read science fiction, and if they see it and like it, they try to do it,” he says. He points to the now-classic Motorola flip phone. “That’s based on the communicator straight out of ‘Star Trek.’”

A few weeks after that first class, Chizeck invites Hugo and Nebula award-winning sci-fi writer Nancy Kress to speak to the class. She is one of a group of stellar sci-fi writers, including Octavia Butler, William Gibson, Ted Chiang, Cat Rambo and Ursula K. Le Guin, who made the Northwest their home. Perched on a table at the front of the room, Kress spells out a range of ideas, from machine learning, genetic engineering and conversational AI to fire-resistant wallpaper with thermal-sensor nanowires that allow it to serve as a fire alarm.

“Any technology is a tool,” Kress tells the students. “It is going to have good outcomes and bad outcomes.” What are some of the downsides? The students are ready with answers: people losing their jobs to machines, plastic guns that can now be made by anyone with plans and a 3D printer, and loss of privacy and hacking.

Kress is one of a group of sci-fi writers invited to Microsoft’s research labs to see the projects and then write science fiction stories featuring technology we may be using in the near future. Her story “Machine Learning” plays with a human working with holographic projections (think Princess Leia in “Star Wars”). The talk with the class then turns to her short story “Nano Comes to Clifford Falls,” the tale of a small town and the arrival of nanomachines that make whatever anyone desires—food, clothes, even cars and homes. The story is told from the point of view of a skeptical single mother reluctant to bring the “made” products into her home.
Because the tech brings them nearly everything, most of the townspeople stop working, the community stops functioning and, ultimately, it collapses. “I wanted to focus on the human element,” Kress says. “What can go wrong, what can go right?” The uses of technology might not be exactly what the designers and engineers were imagining. As William Gibson explains, “The street finds its own uses for things—uses the manufacturers never even imagined.” Facebook, for example, was not built for Russian bots, says Kress.

It’s an incredibly hard problem, but I’d like the robot to be able to pick up a zucchini as well as twirl spaghetti.

SIDDHARTHA SRINIVASA, ROBOTICS PROFESSOR

While sci-fi writers are inspired by the possibilities that come with new scientific discoveries and technological inventions, inventors, scientists and engineers—including many at the UW—are inspired by those stories.

“I grew up reading science fiction,” says Srinivasa, a nationally known robotics expert who moved his entire lab from Carnegie Mellon University to the UW in 2017 and who just this fall joined Amazon as the company’s robotics director. “When I was six years old, I spent the summer in my grandparents’ house. My granddad’s bookshelf had Ray Bradbury’s ‘The Martian Chronicles.’ While I struggled with a lot of the words, it was the most fascinating book that I had ever read. Every year I would go back and pull it out and read it again. I still read it.”

That 1950 book of short stories focuses on the efforts of humans to reach and colonize Mars, which is already home to another race of beings. Each time Srinivasa reads it, he finds more meaning, and new fodder for his imagination. “That’s a hallmark of great science fiction,” he says. “It envisions a future and challenges us into thinking how to achieve that. Everything that I have done has really stemmed from that day I discovered the book. I think science fiction has played a really integral part in robotics and computer science.”

Different cultures have accepted robots and technology in different ways. “Japan, for example—after the Second World War, Astro Boy was this superhuman robot boy. He came out and did great things. I think perception of robots and technology in Japan has been forever colored by that cartoon.” By contrast, he notes, in the United States many of us grew up watching “Terminator” (in which a cyborg assassin comes from the future to kill). “Here, the future is much more dystopian than pleasant,” Srinivasa says. “I think the truth of our future lies somewhere in between.”

Srinivasa has been building robots for 19 years.
Early on, he started thinking about how and where he would want his robots to work. If it’s on a factory floor, the robot can do its task in isolation. “In that setting it should just be efficient,” he says. “But if you want it to work around people, you can either treat people as obstacles that should be avoided or you can make it a collaborative approach.”

The robots he’s building are programmed to respond to human behavior, predict people’s intentions and harmonize with their actions. He offers the example of two people putting together a chair from Ikea. “It is a delicate social dance of deciding what goes where and who does what and what to pick up,” he says. “I want robots to participate in this dance, this discourse. Robots need to understand how humans work.” He helps them understand through mathematics: the mathematics of humans as dynamical systems. Latent feelings leak through our actions and interactions, he explains. These are easy things for humans to read, but very challenging for a robot. He’s working on mathematical models that will help a robot grasp “the things we understand about others so intuitively. And that’s not easy.”

His goal is to develop robots for assistive care. “I want to build robots that can actually help people,” he says. Wanting to assist people with limited mobility, he and his students are refining a robot that helps you eat. “It’s an incredibly hard problem, but I’d like the robot to be able to pick up a zucchini as well as twirl spaghetti,” he says.

So how long until we have robots in our homes? “I have a few answers to that,” Srinivasa says. “We seem to think it’s nothing, nothing, nothing and suddenly we have Rosie the Robot [the robot maid from the 1960s animated sitcom ‘The Jetsons’] in our home. But really, we’ll have technologies along the way: the thermostat, the smart refrigerator, the vacuum cleaner. All these technologies are building the ecosystem for the robot. It will talk to your microwave.
It will talk to your fridge. It’s not just one robot, but an ecosystem of robots.

“I think that oftentimes when we say we want to build Rosie the Robot, we forget why that robot was created. We focus on the technology and say it would be cool to have arms and eyes. But we should remember that robot was created for caregiving. We need to go back and think about why we create the technology. How can we build caregiving technology that actually gives people care?”

We venture down a few floors from his office to his lab, where dozens of computer screens stare out from tables that pack the room. At one end, a black-curtained area houses ADA (Assistive Dexterous Arm), a robotic arm designed to attach to a wheelchair and assist with tasks like eating. Nearby, a student reviews a video of ADA feeding celery to a person. The robot picks up the food item and holds it for the woman to eat. The woman laughs because ADA is holding the celery upright; in order to take a bite, she has to turn her head 45 degrees. It’s not ideal, but it’s helping ADA learn there might be a better way to hold the food.

HERB, another of Srinivasa’s robots, is not quite as tall as most human adults. He has two long multi-jointed arms with three-fingered hands, and he rolls around on wheels. One day he could be unloading a dishwasher or cracking open an Oreo, and on another he may be performing in a sci-fi movie or TV show. In March, the robot butler made an appearance on “The X-Files” as a worker at a sushi restaurant. The episode’s theme was technology turning on people: when Mulder decides not to leave a tip, the automated restaurant and the tech outside—smartphones, drones and even a home security system—retaliate.
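The idea of treating “humans as dynamical systems” can be illustrated with a toy state model. The example below is an invented minimal sketch, not a model from Srinivasa’s lab: it predicts where a person will be by propagating a constant-velocity state forward in time.

```python
# Toy illustration of "humans as dynamical systems": predict a person's
# future positions with a constant-velocity state model. This is an
# invented minimal example, not a model from Srinivasa's lab.

def predict_positions(state, dt, steps):
    """state = (x, y, vx, vy); propagate x' = x + vx*dt, y' = y + vy*dt."""
    x, y, vx, vy = state
    trajectory = []
    for _ in range(steps):
        x, y = x + vx * dt, y + vy * dt
        trajectory.append((round(x, 2), round(y, 2)))
    return trajectory

# A person at the origin walking 1 m/s along x, predicted over one second:
print(predict_positions((0.0, 0.0, 1.0, 0.0), dt=0.25, steps=4))
```

Real models of human motion are far richer — they must capture intent and the “latent feelings” Srinivasa describes — but the prediction-by-propagation structure is the same.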

Either we become aware that Earth is the only home for life in the universe or that there’s some galactic society out there and we probably want to be a part of it.

RORY BARNES, ASTRONOMY ASSISTANT PROFESSOR

As a kid in Tucson, where he discovered a love for the night sky, Barnes watched “Star Trek.” He found the show more entertaining than inspiring, and he admits that though he has spent his academic life steeped in science, he isn’t much one for science fiction. “My tastes tended toward real science,” he says. But he has much in common with the show—which is all about exploring “strange new worlds” and seeking out “new life.”

Barnes’ specialty is exoplanets: planets in faraway solar systems that might support life, long the subject of sci-fi writers like Edgar Rice Burroughs, whose Mars (also known as Barsoom) was home to a Martian race, and Frank Herbert (a Tacoma native who attended the UW for a time), whose “Dune” planet of Arrakis was covered in desert. Only for Barnes, life out there in the universe is now far more likely than those writers could have known.

Twenty-five years ago, we didn’t really know anything of planets outside our solar system, says Barnes. But then, in 1995, the first exoplanet was discovered. It is now called a “hot Jupiter,” a gas giant with a short orbital period. “The planet, it turns out, was somewhat unusual. It was just that our technology finally allowed us to see it,” Barnes says. After that, the floodgates opened, and thousands of potential exoplanets—including the one in orbit around 40 Eridani A—have been identified.

“We still don’t know if there are really any habitable worlds out there at all,” Barnes says. “But as Isaac Asimov said, either way, the answer to the question ‘Are we alone?’ is staggering,” he says. “Either we become aware that Earth is the only home for life in the universe or that there’s some galactic society out there and we probably want to be a part of it.”

Our interview over, I left the Physics and Astronomy Building and started down a long flight of stairs to 15th Avenue N.E. I chanced to look up at the view in front of me. There was the Space Needle.
It was a trick of perspective, but it appeared to hover over the north end of Capitol Hill. I descended a few more steps, and it landed, disappearing into the neighborhood.
Ostendorf wins 2018 Flanagan award

Professor Mari Ostendorf won the 2018 IEEE James L. Flanagan Speech and Audio Processing Award for her innovative research in spoken language technology. “Dr. Ostendorf’s extensive contributions to natural language processing and speech technology have played a major part in conquering the limitations of spoken language technology,” said Radha Poovendran, professor and chair of electrical & computer engineering. “She is a pioneer in interactive conversational AI, a multi-billion-dollar industry.”
UW-led philosophy team receives $1.5M grant to study the ethics of neurotechnology research

By Sarah McQuate, UW News

(Photo: UW postdoc Ivana Milovanovic, left, works with Center for Neurotechnology Young Scholars Program participant Emily Boeschoten on a sensory device. Mark Stone/University of Washington)

Brain-computer interfaces have the potential to give patients better and more natural control over their prosthetic devices. Through this method, a chip in a patient’s brain picks up a thought — neural activity triggered by focusing on specific visual imagery — to move a joint, and then transmits that signal to the prosthetic. This technology is not widely available yet, but as it progresses through research trials, ethical questions are emerging about users’ sense of control over their own actions. For example: Who is responsible if a prosthetic limb malfunctions and strikes someone in a crowd — the patient or the device?

To address these types of questions, University of Washington researchers in the Center for Neurotechnology, including ECE professor Howard Chizeck, are studying how brain-computer interfaces affect whether patients feel they are in charge of their own actions. For this research, the team will receive $1.5 million from the National Institutes of Health over the next four years.
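The decoding step described above — turning recorded neural activity into a movement command — is commonly modeled as a linear map from firing rates to velocity. The sketch below is purely illustrative: the channels, weights and rates are invented, and real interfaces fit decoders of this general shape to training data.

```python
# Toy linear decoder mapping recorded firing rates to a 2D velocity command
# for a prosthetic. All channels, weights and rates here are invented for
# illustration; real interfaces fit decoders of this general shape to data.

def decode_velocity(rates, weights):
    """Linear readout: v_j = sum_i rates[i] * weights[i][j]."""
    vx = sum(r * w[0] for r, w in zip(rates, weights))
    vy = sum(r * w[1] for r, w in zip(rates, weights))
    return vx, vy

weights = [(0.02, 0.00), (0.00, 0.03), (-0.01, 0.01)]  # per-channel weights
rates = [40.0, 20.0, 10.0]                             # spikes/s, 3 channels
print(decode_velocity(rates, weights))
```

Ethics questions like the ones the team studies arise precisely because the decoded command, not the patient’s muscles, ultimately drives the limb.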
“Neuroscience offers a deeper understanding of the brain and gives us the prospect of new ways to treat diseases or affect how the brain functions,” said Sara Goering, a UW associate professor of philosophy and the team lead for the project. “Given how closely we associate the function of our brains with who we are as individuals, it is valuable to explore the implications of this research. Then people can better understand their options before enrolling in a study, and researchers can design devices that better suit users’ needs and values.”

(Photo: Tim Brown, a doctoral student in the UW’s philosophy department and a researcher involved in the Center for Neurotechnology project, is already embedded in the UW’s BioRobotics Lab. He studies autonomy issues that arise for people with Parkinson’s disease or essential tremor when they use deep brain stimulation to manage their symptoms. Mark Stone/University of Washington)

The team aims to examine multiple types of brain-computer interfaces currently being tested in clinical studies, not just those that control prosthetics. In using deep brain stimulation to treat Parkinson’s disease or depression, a patient might wonder: Is my action the result of something I did, or something the stimulator did? And with devices that help patients sense touch, a patient might ask: Is this interface correctly telling me how hard I am squeezing someone’s hand?

By looking at how these devices affect the degree to which patients sense they are in control of their own actions and emotions, the researchers hope to develop tools that can help future patients feel better equipped to remain in control. “We will start this process by ’embedding’ ethicists within neural engineering labs that are studying different brain-computer interfaces,” said Dr.
Eran Klein, an affiliate assistant professor of philosophy at the UW and an assistant professor of neurology at Oregon Health & Science University, who is the co-leader of this project. “These ethicists will work side by side with the researchers in the lab, as well as interview both researchers and research participants about their perspectives on brain-computer interfaces.”

The grant will fund one or two new researchers who will work in different labs over the course of the project. Some of these labs are part of the Center for Neurotechnology, which is based at the UW, while others are at Massachusetts General Hospital, the University of Freiburg in Germany, the University of Utrecht in the Netherlands, Brown University and Caltech.

After compiling perspectives across all the labs, the team will develop a series of questions for future patients who are considering enrolling in a study to receive a brain-computer interface. Patients can use these questions to better prepare for informed-consent discussions with researchers, Goering said. “We’re hoping that the close attention we pay to users’ experiences operating brain-computer interfaces will help us understand how to help prospective users be informed about the tradeoffs they might be making,” she said.
“In addition, we also want to help researchers in the field think carefully about next-generation device design, so that this technology will maintain or enhance a user’s sense of control.”

Professor Fazel to lead NSF TRIPODS+X in data science

The National Science Foundation announced that it is awarding grants totaling $8.5 million to 19 collaborative projects at 23 universities for the study of complex and entrenched problems in data science. Three of these projects will be based at the University of Washington. The grants build on a 2017 award in the Transdisciplinary Research in Principles of Data (TRIPODS) program, which led to the founding of the Algorithmic Foundations of Data Science Institute (ADSI) at the UW. ADSI focuses on developing theoretical and algorithmic tools bridging mathematics, computer science and statistics to address contemporary data science challenges. The new grants make up the TRIPODS+X program, which expands these big-data projects into broader areas of science, engineering and mathematics.
Electrical & computer engineering associate professor Maryam Fazel, who co-directs ADSI, is the lead researcher on “Foundational Training in Neuroscience and Geoscience via Hack Weeks.”

(Photo: Students at the NeuroHackademy.)

“Our training will be designed to bridge the gap between state-of-the-art foundational research in mathematical and algorithmic tools and scientific applications, which is very much in the spirit of the TRIPODS+X program,” Fazel said. Her TRIPODS+X project will enhance the successful “hack week” model: hack weeks blend elements of traditional lecture-style pedagogy with participant-driven projects.

“In the new grant, we will leverage a new and exciting model for education and collaboration, namely the hack-weeks model, a format that the UW’s own eScience Institute has done pioneering work on developing,” Fazel said. “The project is exciting because we get to connect our theory and algorithms to two important scientific domains, neuroscience and geoscience, which in recent years have had access to massive data sets and where data science tools are poised to help make breakthroughs.”

Fazel will lead a multidisciplinary team of UW researchers on this project, which includes Ariel Rokem at the eScience Institute; Anthony Arendt at the Applied Physics Laboratory; Aleksandr Aravkin, assistant professor of applied mathematics; and Zaid Harchaoui, assistant professor of statistics and eScience Institute fellow. Fazel is also a researcher on another TRIPODS+X project, “Safe Imitation Learning for Robotics,” which is led by Harchaoui. This project will focus on developing mathematically rigorous algorithms for imitation learning in robotics, a form of learning in which a robotic system learns through demonstration.
Scientists engineer a functional optical lens out of 2D materials

In optics, the era of glass lenses may be waning. In recent years, physicists and engineers have been designing, constructing and testing different types of ultrathin materials that could replace the thick glass lenses used today in cameras and imaging systems. Critically, these engineered lenses — known as metalenses — are not made of glass. Instead, they consist of materials constructed at the nanoscale into arrays of columns or fin-like structures. These formations can interact with incoming light, directing it toward a single focal point for imaging purposes.

But even though metalenses are much thinner than glass lenses, they still rely on “high aspect ratio” structures, in which the column or fin-like structures are much taller than they are wide, making them prone to collapsing and falling over. Furthermore, these structures have always been near the wavelength of light they’re interacting with in thickness — until now. In a paper published Oct. 8 in the journal Nano Letters, a team from the University of Washington and the National Tsing Hua University in Taiwan announced that it has constructed functional metalenses that are one-tenth to one-half the thickness of the wavelengths of light that they focus.
Their metalenses, which were constructed out of layered 2D materials, were as thin as 190 nanometers — less than 1/100,000th of an inch thick.

[Image: Four ultrathin metalenses developed by University of Washington researchers, visualized under a microscope.]

“This is the first time that someone has shown that it is possible to create a metalens out of 2D materials,” said senior and co-corresponding author Arka Majumdar, a UW assistant professor of physics and of electrical and computer engineering. Their design principles can be used for the creation of metalenses with more complex, tunable features, added Majumdar, who is also a faculty researcher with the UW’s Molecular Engineering & Sciences Institute and Institute for Nano-Engineered Systems.

Majumdar’s team has been studying the design principles of metalenses for years, and previously constructed metalenses for full-color imaging. But the challenge in this project was to overcome an inherent design limitation in metalenses: in order for a metalens material to interact with light and achieve optimal imaging quality, the material had to be roughly the same thickness as the light’s wavelength in that material. In mathematical terms, this restriction ensures that a full zero-to-two-pi phase shift range is achievable, which guarantees that any optical element can be designed. For example, a metalens for a 500-nanometer lightwave — which in the visual spectrum is green light — would need to be about 500 nanometers in thickness, though this thickness can decrease as the refractive index of the material increases.

Majumdar and his team were able to synthesize functional metalenses that were much thinner than this theoretical limit — one-tenth to one-half the wavelength. First, they constructed the metalens out of sheets of layered 2D materials.
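The thickness rule quoted above can be made concrete with a back-of-the-envelope calculation. A nanostructure of height t and effective refractive index n imparts a phase of roughly 2πnt/λ, so sweeping the effective index between that of the surroundings (n ≈ 1) and the material maximum covers a full 2π range only when t is at least λ divided by the index contrast. The sketch below illustrates this scaling; the function and the index values are illustrative assumptions, not numbers from the paper:

```python
def min_thickness_nm(wavelength_nm, n_max, n_min=1.0):
    """Smallest structure height that yields a full 0..2*pi phase range,
    assuming the phase is tuned by varying the effective refractive index
    between n_min (the surroundings) and n_max (the structured material)."""
    return wavelength_nm / (n_max - n_min)

# Green light (500 nm): with a modest index (n_max = 2.0) the structure must be
# about one wavelength tall, matching the rule of thumb in the article; a
# higher-index material lets the structure shrink.
t_low_index = min_thickness_nm(500, 2.0)   # 500.0 nm
t_high_index = min_thickness_nm(500, 3.5)  # 200.0 nm
```

This is why the 190-nanometer lenses reported in the paper sit well below the conventional limit: their layered 2D materials cannot reach the full 2π range at that thickness, which is the shortfall the liquid-crystal-inspired design compensates for.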
The team used widely studied 2D materials such as hexagonal boron nitride and molybdenum disulfide. A single atomic layer of these materials provides only a very small phase shift, unsuitable for efficient lensing, so the team stacked multiple layers to increase the thickness, though the total remained too small to reach a full two-pi phase shift.

“We had to start by figuring out what type of design would yield the best performance given the incomplete phase,” said co-author Jiajiu Zheng, a doctoral student in electrical and computer engineering. To make up for the shortfall, the team employed mathematical models that were originally formulated for liquid-crystal optics. These, in conjunction with the metalens structural elements, allowed the researchers to achieve high efficiency even though the whole phase shift range is not covered.

They tested the metalens’ efficacy by using it to capture different test images, including of the Mona Lisa and a block letter W. The team also demonstrated how stretching the metalens could tune its focal length.

In addition to achieving a wholly new approach to metalens design at record-thin levels, the team believes that its experiments show the promise of making new devices for imaging and optics entirely out of 2D materials. “These results open up an entirely new platform for studying the properties of 2D materials, as well as constructing fully functional nanophotonic devices made entirely from these materials,” said Majumdar. Additionally, these materials can be easily transferred onto any substrate, including flexible materials, paving the way toward flexible photonics.

The lead and co-corresponding author on the paper is Chang-Hua Liu, who began this work as a UW postdoctoral researcher and is now a faculty member at the National Tsing Hua University in Taiwan.
Additional co-authors are doctoral students Shane Colburn, Taylor Fryett and Yueyang Chen in the Department of Electrical and Computer Engineering; and Xiaodong Xu, a UW professor of physics and of materials science and engineering. The team’s prototype metalenses were all built at the Washington Nanofabrication Facility, a National Nanotechnology Coordinated Infrastructure site on the UW campus. The research was funded by the U.S. Air Force Office of Scientific Research, the National Science Foundation, the Washington Research Foundation, the M.J. Murdock Charitable Trust, GCE Market, Class One Technologies and Google.

Thanks to the UW News staff for this article.

Repairing damaged nerves

ECE professor Chet Moritz, whose research is based in the Restorative Technology Laboratory, is working to repair brain and spinal cord injuries. His research, funded by the National Science Foundation, has produced several breakthroughs in applying electrical stimulation to damaged nerves.
At the UW, our sci-fi future has arrived

By Hannelore Sudermann, from Columns

Late this summer a group of astronomers from around the country, including assistant professor Rory Barnes, discovered what could be Vulcan, Mr. Spock’s home planet.

It is right where “Star Trek” creator Gene Roddenberry said it could be—in a solar system surrounding 40 Eridani A, a star 16 light years across the final frontier. “I hadn’t even realized the planet might be Vulcan until someone brought it up after the paper about the planet’s discovery was published,” says Barnes, who was part of a team working on the Dharma Planet Survey to detect potentially habitable super-Earth planets in other solar systems.

It’s findings like this, as well as the fast-changing and increasing role of technology in our contemporary lives—from smartphones to personal drones—that can make us feel like we’re living in a science-fiction future. A sci-fi world is no longer something we have to imagine—it’s as close as a stroll on a UW campus.

In the Forestry Building, a scientist is trying to figure out how to grow broccoli on Mars. Just down the road, a team is storing massive amounts of data in molecules of DNA. And over in Mechanical Engineering, students on the UW Hyperloop team are building a pod that can travel several hundred miles per hour in a vacuum tube.

In the Paul G. Allen Center for Computer Science & Engineering, robotics professor Siddhartha Srinivasa and his lab are working on a Home Exploring Robot Butler (HERB). The service robot, which can perform a range of chores, has already been featured in National Geographic and Wired. Now HERB is bringing the worlds of science and sci-fi even closer with a recent appearance on an episode of “The X-Files.”

In a nearby classroom, Howard Chizeck, an electrical engineering professor whose research includes electronically stimulating the human brain to manage movement disorders, is driving the focus of a freshman class straight to the crossroads of fiction and science. Titled “Indistinguishable from Magic: New Technologies, Science Fiction, and Us,” the course covers writers like Isaac Asimov and Alice B. Sheldon as well as current technology and the potential cultural changes it may trigger.
Chizeck sets the table for a feast of science that owes a debt to science fiction.

Lots and lots of scientists and engineers read science fiction and if they see it and like it, they try to do it.

HOWARD CHIZECK, ELECTRICAL ENGINEERING PROFESSOR

“This is not a normal course,” Chizeck tells the room of new students. “This is an electrical engineering course, and I will offer some of that. But it is also a science fiction course.” He goes on to explain that most of us now use amazingly powerful electronic objects, yet we have little understanding of how they work or how they’ve changed us. How do science fiction, technology and society influence one another, he asks the students. What are the profound ramifications of the technologies we’re embracing?

Having been in computer science since the beginning of computers, Chizeck has had a close-up view of how technology has evolved and how it has, in turn, changed society. He has also kept an eye on how fiction writers have responded, followed and sometimes led. “When writers are writing science fiction, they’re writing for the society they’re in,” he says. “At the same time, the science fiction they write has changed society.”

That first class touches on the birth of the internet as well as the invention of the first cellphone in 1973 and its evolution into a product for everyone by the 1990s. The students also talk about Bisphenol A (BPA) plastics used in water bottles and food cans being linked to infertility. Chizeck then points to the prescient, futuristic focus of the 1985 book “The Handmaid’s Tale,” which is built around a fundamentalist government, fertility challenges, and women being treated as property of the state. He circles that point around to the current TV show based on the book and the handmaids’ costumes worn by protesters wanting to draw attention to the government’s neglect and abuse of women’s rights.

“Good science fiction has an understanding of the real,” he tells the class. It also helps us imagine the future so we can explore the benefits and the harm that our inventions and discoveries can bring. Fiction lives within our culture, he says. For example, writer Philip K.
Dick (“Blade Runner” and “Total Recall”) imagined autonomous vehicles, virtual reality, and insects outfitted with sensors. “It was like he could see into the future,” says Chizeck. “Lots and lots of scientists and engineers read science fiction and if they see it and like it, they try to do it,” he says. He points to the now-classic Motorola flip phone. “That’s based on the communicator straight out of ‘Star Trek.’”

A few weeks after that first class, Chizeck invites Hugo and Nebula award-winning sci-fi writer Nancy Kress to speak to the class. She is one of a group of stellar sci-fi writers, including Octavia Butler, William Gibson, Ted Chiang, Cat Rambo and Ursula K. Le Guin, who made the Northwest their home. Perched on a table at the front of the room, Kress spells out a range of ideas, from machine learning, genetic engineering and conversational AI to fire-resistant wallpaper with thermal-sensor nanowires that let it serve as a fire alarm.

“Any technology is a tool,” Kress tells the students. “It is going to have good outcomes and bad outcomes.” What are some of the downsides? The students are ready with answers: people losing their jobs to machines, plastic guns that can now be made by anyone with plans and a 3D printer, loss of privacy, and hacking.

Kress is one of a group of sci-fi writers invited to Microsoft’s research labs to see the projects and then write science fiction stories featuring technology we may be using in the near future. Her story “Machine Learning” plays with a human working with holographic projections (think Princess Leia in “Star Wars”). The talk with the class then turns to her short story “Nano Comes to Clifford Falls,” the tale of a small town and the arrival of nanomachines that make whatever anyone desires—food, clothes, even cars and homes. The story is told from the point of view of a skeptical single mother reluctant to bring the “made” products into her home.
Because the tech brings them nearly everything, most of the townspeople stop working, the community stops functioning and ultimately, it collapses. “I wanted to focus on the human element,” Kress says. “What can go wrong, what can go right?”

The uses of technology might not be exactly what the designers and engineers were imagining. As William Gibson explains, “The street finds its own uses for things—uses the manufacturers never even imagined.” Facebook, for example, was not built for Russian bots, says Kress.

It’s an incredibly hard problem, but I’d like the robot to be able to pick up a zucchini as well as twirl spaghetti.

SIDDHARTHA SRINIVASA, ROBOTICS PROFESSOR

While sci-fi writers are inspired by the possibilities that come with new scientific discoveries and technological inventions, inventors, scientists and engineers—including many at the UW—are inspired by those stories.

“I grew up reading science fiction,” says Srinivasa, a nationally known robotics expert who moved his entire lab from Carnegie Mellon University to the UW in 2017 and who just this fall joined Amazon as the company’s robotics director. “When I was six years old, I spent the summer in my grandparents’ house. My granddad’s bookshelf had Ray Bradbury’s ‘The Martian Chronicles.’ While I struggled with a lot of the words, it was the most fascinating book that I had ever read. Every year I would go back and pull it out and read it again. I still read it.”

That 1950 book of short stories focuses on the efforts of humans to reach and colonize Mars, which is already home to another race of beings. Each time Srinivasa reads it, he finds more meaning, and new fodder for his imagination. “That’s a hallmark of great science fiction,” he says. “It envisions a future and challenges us into thinking how to achieve that. Everything that I have done has really stemmed from that day I discovered the book. I think science fiction has played a really integral part in robotics and computer science.”

Different cultures have accepted robots and technology in different ways. “Japan for example—after the Second World War, Astro Boy was this superhuman robot boy. He came out and did great things. I think perception of robots and technology in Japan has been forever colored by that cartoon.” By contrast, he notes, in the United States, many of us grew up watching “Terminator” (where a cyborg assassin comes from the future to kill). “Here, the future is much more dystopian than pleasant,” Srinivasa says. “I think the truth of our future lies somewhere in between.”

Srinivasa has been building robots for 19 years.
Early on, he started thinking about how and where he would want his robots to work. If it’s on a factory floor, the robot can do its task in isolation. “In that setting it should just be efficient,” he says. “But if you want it to work around people, you can either treat people as obstacles that should be avoided or you can make it a collaborative approach.”

The robots he’s building are to be programmed to respond to human behavior, predict people’s intentions and harmonize with their actions. He offers the example of two people putting together a chair from Ikea. “It is a delicate social dance of deciding what goes where and who does what and what to pick up,” he says. “I want robots to participate in this dance, this discourse. Robots need to understand how humans work.”

He helps them understand through mathematics, the mathematics of humans as dynamical systems. Latent feelings leak through our actions and interactions, he explains. These are easy things for humans to read, but very challenging for a robot. He’s working on mathematical models that will help a robot understand “the things we understand about others so intuitively. And that’s not easy.”

His goal is to develop robots for assistive care. “I want to build robots that can actually help people,” he says. Wanting to assist people with limited mobility, he and his students are refining a robot that helps you eat. “It’s an incredibly hard problem, but I’d like the robot to be able to pick up a zucchini as well as twirl spaghetti,” he says.

So how long until we have robots in our homes? “I have a few answers to that,” Srinivasa says. “We seem to think it’s nothing, nothing, nothing and suddenly we have Rosie the Robot [the robot maid from 1960s animated sitcom “The Jetsons”] in our home. But really, we’ll have technologies along the way: the thermostat, the smart refrigerator, the vacuum cleaner. All these technologies are building the ecosystem for the robot. It will talk to your microwave.
It will talk to your fridge. It’s not just one robot, but an ecosystem of robots.

“I think that oftentimes when we say we want to build Rosie the Robot, we forget why that robot was created. We focus on the technology and say it would be cool to have arms and eyes. But we should remember that robot was created for caregiving. We need to go back and think about why we create the technology. How can we build caregiving technology that actually gives people care?”

We venture down a few floors from his office to his lab, where dozens of computer screens stare out from tables that pack the room. At one end, a black-curtained area houses ADA (Assistive Dexterous Arm), a robotic arm designed to attach to a wheelchair and assist with tasks like eating. Nearby, a student reviews a video of ADA feeding celery to a person. The robot picks up the food item and holds it for the woman to eat. The woman laughs because ADA is holding the celery upright. In order to take a bite, she has to turn her head 45 degrees. It’s not ideal, but it’s helping ADA learn there might be a better way to hold the food.

HERB, another of Srinivasa’s robots, is not quite as tall as most human adults. He has two long multi-jointed arms with three-fingered hands, and he rolls around on wheels. One day he could be unloading a dishwasher or cracking open an Oreo, and another he may be performing in some sci-fi movie or TV show. In March, the robot butler made an appearance on “The X-Files” as a worker at a sushi restaurant. The episode’s theme was technology turning on people. When Mulder decides not to leave a tip, the automated restaurant and the tech outside—smartphones, drones and even a home security system—retaliate.

Either we become aware that Earth is the only home for life in the universe or that there’s some galactic society out there and we probably want to be a part of it.

RORY BARNES, ASTRONOMY ASSISTANT PROFESSOR

As a kid in Tucson, where he discovered a love for the night sky, Barnes watched “Star Trek.” He found the show more entertaining than inspiring. He admits that though he has spent his academic life steeped in science, he isn’t much one for science fiction. “My tastes tended toward real science,” he says. But he has so much in common with the show—which is all about exploring “strange new worlds” and seeking out “new life.”

Barnes’ specialty is exoplanets—planets in faraway solar systems that might support life, long the subject of sci-fi writers like Edgar Rice Burroughs, whose Mars (also known as Barsoom) was home to a Martian race, and Frank Herbert (a Tacoma native who attended the UW for a time), whose “Dune” planet of Arrakis was covered in desert. Only for Barnes, life out there in the universe is now far more likely than those writers could have known.

Twenty-five years ago, we didn’t really know anything of planets outside our solar system, says Barnes. But then in 1995, the first exoplanet was discovered. It is now called a “hot Jupiter,” a gas giant with a short orbital period. “The planet, it turns out, was somewhat unusual. It was just that our technology finally allowed us to see it,” Barnes says. After that, the floodgates opened, and thousands of potential exoplanets—including the one in orbit around 40 Eridani A—have been identified.

“We still don’t know if there are really any habitable worlds out there at all,” Barnes says. “But as Isaac Asimov said, either way, the answer to the question ‘Are we alone?’ is staggering,” he says. “Either we become aware that Earth is the only home for life in the universe or that there’s some galactic society out there and we probably want to be a part of it.”

Our interview over, I left the Physics and Astronomy Building and started down a long flight of stairs to 15th Avenue N.E. I chanced to look up at the view in front of me. There was the Space Needle.
It was a trick of perspective, but it appeared to hover over the north end of Capitol Hill. I descended a few more steps, and it landed, disappearing into the neighborhood.