Posts tagged with "Robots":


Could buildings be evolved instead of designed?

What if we could “breed” buildings to be more efficient? That’s the provocation from artist, designer, and programmer Joel Simon, who drew on his background in computer science and biology, and on the potential of 3D printing and other emergent digital manufacturing technologies, to test a system of automated planning. Using two types of algorithms—“graph-contraction and ant-colony pathing”—Simon is able to “evolve” optimized floor plans based on different constraints, using a genetic method derived from existing neural network techniques. The results are, according to a white paper he put out, “biological in appearance, intriguing in character, and wildly irrational in practice.”

The example he gives is based on an elementary school in Maine. Most schools are long corridors with classrooms coming off the sides—a highly linear design. By setting different parameters, like minimizing traffic flow and material usage, or making the building easier to exit in an emergency, the algorithms output different floor plans, developed on a genetic logic. But this optimization is done “without regard for convention [or] constructability,” and adding other characteristics, like maximizing windows for classrooms, led to complicated designs with numerous interior courtyards. For projects like schools, he suggests, class schedules and school layouts could be evolved side by side, creating a building optimized around traffic flow.

While perhaps currently impractical (there’s no getting rid of architects—or rectangles—yet!), Simon hopes the project will push people to think about how building with emergent technologies—like on-site 3D printing, CNC, self-assembling structures, and robotic construction—can be integrated into the design process. These technologies hold promise for new forms that are hard to design for, he believes, and potential that can’t be realized through existing design methods. As he told Dezeen: "Most current tools and thinking are stuck in a very two-dimensional world…[but,] designing arbitrary 3D forms optimized for multiple objectives—material usage, energy efficiency, acoustics—is simply past human cognitive ability."
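Simon has not released the code discussed here, but the core of any such system is a straightforward evolutionary loop. The sketch below is a minimal, generic genetic algorithm with stand-in objectives for traffic flow and material usage; the plan encoding, fitness weights, and operators are invented for illustration and are not Simon's implementation.

```python
# Minimal, generic sketch of the evolutionary loop described above, not Simon's code.
# A "plan" here is just a vector of numbers; real floor plans would need a far
# richer encoding (graphs of rooms, corridors, and doors).
import random

POP_SIZE, GENERATIONS, PLAN_LEN = 50, 200, 20

def traffic_cost(plan):   # stand-in objective: pretend this measures walking distance
    return sum(abs(a - b) for a, b in zip(plan, plan[1:]))

def material_cost(plan):  # stand-in objective: pretend this measures wall area
    return sum(abs(x) for x in plan)

def fitness(plan, w_traffic=0.7, w_material=0.3):
    # Multiple objectives collapsed into a weighted sum for simplicity.
    return -(w_traffic * traffic_cost(plan) + w_material * material_cost(plan))

def crossover(a, b):
    cut = random.randrange(1, PLAN_LEN)
    return a[:cut] + b[cut:]

def mutate(plan, rate=0.1):
    return [x + random.gauss(0, 1) if random.random() < rate else x for x in plan]

population = [[random.uniform(-5, 5) for _ in range(PLAN_LEN)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 2]  # keep the better half (truncation selection)
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

print("best fitness found:", round(fitness(max(population, key=fitness)), 2))
```

The hard part, as Simon's "wildly irrational" results suggest, lies in the plan encoding and the objective functions, not in the loop itself.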

Open Call: R+D for the Built Environment Design Fellowship

R+D for the Built Environment is sponsoring a 6-month, paid, off-site design fellowship program starting this summer. We're looking for four candidates in key R+D topic areas:
  1. Building material science
  2. 3D printing, robotics, AR/VR
  3. AI, machine learning, analytics, building intelligence
  4. Quality housing at a lower cost
  5. Building resiliency and sustainability
  6. Workplace optimization
  7. Adaptable environments
We're excited to support up-and-coming designers, engineers, and researchers (and all the disciplines in between!) as they advance their work, and to provide them with a platform to share their ideas. Follow the link below for more details and instructions on how to apply. Applications are due by May 31, 2019. https://sites.google.com/view/rdbe-design-fellowship-2019/home

Brooklyn-based startup is using robots for rebar assembly

Two Brooklyn-based construction entrepreneurs began their business with a simple observation: steel rebar, used in concrete construction throughout the world, isn't always easy to work with. Ian Cohen and Daniel Blank noticed this while watching wind turbines being erected. “Watching the process of people manually moving these huge, heavy objects looked dangerous and difficult,” Cohen explained. Often made from scrap metal, rebar is a “really sharp, dirty material for humans to interact with.” They pivoted their URBAN-X-accelerated startup, Toggle, which they founded two and a half years ago with a focus on renewable energy, to the even more fundamental work of making the production of reinforced concrete faster and safer through automation.

Rebar steel is “traditionally manually picked up and erected into cages and shaped to hold reinforced concrete structures in place,” explained Cohen. These cages may be as long as 50 feet. That’s hard work for humans but exactly the kind of job robots are suited for: taking very heavy things and moving them precisely. Using customized industrial robots, Toggle made modifications that allow the automated arms to “achieve bespoke movements.” The design-to-build process is also streamlined, with custom software that takes a design file, evaluates the types of cages needed, derives a build sequence, and goes straight into digital fabrication.

Currently, Toggle, which is in the early stages of its technology, is using a “cooperative process”—a human and robot working side by side. The robot does the dangerous work and heavy lifting, picking up and manipulating the bars, while the human does only the final wire tying. Toggle is in the process of automating this step as well, aiming to increase productivity over all-human rebar processing by as much as five times while halving the cost. The two also plan on adding a linear track that would allow the robot to produce larger meshes, though currently they are operating at a fairly substantial maximum of 20 feet. No mere experiment, the robot is already being put to work, fabricating rebar for projects in New York City and the surrounding area.

Part of the plan is to develop a system that works something like vertical farming, Cohen explained, where production happens close to where there is need, minimizing logistical demands and long-distance transportation and “allowing civil infrastructure to be developed and constructed in the societies that need them most.” New York, of course, is a perfect testing ground with its constant construction. Currently, global labor shortages, including in the U.S., make infrastructure construction expensive, according to Cohen. Toggle’s goal is to “reduce cost and accelerate construction projects around the world, all while maximizing safety.” The intent, Cohen says, is not about getting rid of human labor but about “taking work away from humans that is not suited for them and putting them in jobs that are better for humans.”
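Toggle has not published details of its software, but the design-to-build flow Cohen describes—design file in, cage types identified, build sequence out, fabrication started—can be pictured as a small pipeline. The data model and step names in the sketch below are assumptions for illustration, not Toggle's actual system.

```python
# Illustrative sketch of a design-to-fabrication pipeline like the one Cohen
# describes; the data model and step names are assumptions, not Toggle's software.
from dataclasses import dataclass

@dataclass
class Cage:
    name: str
    length_ft: float
    bar_count: int

def cages_from_design(design: dict) -> list:
    # "Evaluate the types of cages needed" from a simplified design file.
    return [Cage(e["name"], e["length_ft"], e["bar_count"]) for e in design["elements"]]

def build_sequence(cage: Cage) -> list:
    # Derive an ordered list of robot instructions for one cage.
    steps = []
    for i in range(cage.bar_count):
        steps += [f"pick bar {i + 1}", f"place bar {i + 1}"]
    steps.append("tie joints")  # the step Toggle is still working to automate
    return steps

design = {"elements": [{"name": "column cage C1", "length_ft": 20, "bar_count": 8}]}
for cage in cages_from_design(design):
    print(f"{cage.name}: {len(build_sequence(cage))} steps")
```

In practice the interesting work sits inside each step—placement and bending geometry—but the overall shape of the pipeline stays this simple.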

The solar-powered FutureHAUS is coming to Times Square

New housing is coming to Times Square, at least temporarily. The Virginia Tech team of students and faculty behind the FutureHAUS, which won the Solar Decathlon Middle East 2018, a competition supported by the Dubai Electricity and Water Authority and the U.S. Department of Energy, will bring a new iteration of its solar-powered home to New York for New York Design Week in collaboration with New York City–based architects DXA Studio.

The first, Dubai iteration was a 900-square-foot prefab home that, in addition to being entirely solar powered, featured 67 “futuristic devices” centered on a few core areas including, according to the team’s website, “entertainment, energy management, aging-in-place, and accessibility.” This included everything from gait recognition for unique user identities and taps that dispense precise amounts of water on voice command to tables with integrated displays and AV-outfitted adjustable rooms.

One of the home’s biggest innovations, however, is its cartridge system, developed over the past 20 years by Virginia Tech professor Joe Wheeler. The home comprises a number of prefabricated blocks or "cartridges": a series of program cartridges includes the kitchen and the living room, while a series of service cartridges contains wet mechanical space and a solar power system. The spine cartridge integrates all these various parts and provides the “central nervous system” of the high-tech house. These all form walls or central mechanical elements that then serve as the central structure the home is built around, sort of like high-tech LEGO blocks. The inspiration behind the cartridges came from the high-efficiency industrial manufacturing and assembly-line techniques of the automotive and aerospace industries, and the system leverages the latest in digital fabrication, CNC routing, robotics, and 3D printing, all managed and operated through BIM software.

Once the cartridges have been fabricated, assembly is fast. In New York it will take just three days for the home to be packed, shipped, and constructed, “a testament to how successful this system of fabrication and construction is,” said Jordan Rogove, a partner at DXA Studio, who is helping realize the New York version of the home. The FutureHAUS team claims that this fast construction leads to a higher-quality final product and ends up reducing cost overall. The cartridge system also came in handy when building in New York, with its notoriously complicated permitting process and limited space. “In Dubai an eight-ton crane was used to assemble the cartridges,” explained Rogove. “But to use a crane in Times Square requires a lengthy permit process and approval from the MTA directly below. Thankfully the cartridge system is so versatile that the team has devised a way to assemble without the crane and the production it would've entailed.”

There have obviously been some alterations to the FutureHAUS in New York. For example, while in Dubai there were screen walls and a courtyard with olive trees and yucca, the Times Square house will be totally open and easy to see, decorated with plants native to the area. The FutureHAUS will be up in Times Square for a week and a half during New York’s design week, May 10 through May 22.
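As a loose illustration of the cartridge idea—a home assembled from typed, prefabricated modules around a central spine—the sketch below models it as data. The cartridge names and categories come from the description above, but the structure and assembly order are purely hypothetical, not the FutureHAUS team's actual system.

```python
# Loose illustration of the cartridge concept, not the FutureHAUS team's actual system:
# a home as a set of prefabricated, typed modules assembled around a central spine.
from dataclasses import dataclass

@dataclass
class Cartridge:
    name: str
    kind: str  # "spine", "service", or "program"

house = [
    Cartridge("spine", "spine"),            # structure, utilities, and controls
    Cartridge("wet mechanical", "service"),
    Cartridge("solar power", "service"),
    Cartridge("kitchen", "program"),
    Cartridge("living room", "program"),
]

# Hypothetical assembly order: spine first, then services, then program spaces.
priority = {"spine": 0, "service": 1, "program": 2}
for c in sorted(house, key=lambda c: priority[c.kind]):
    print(f"install {c.name} ({c.kind} cartridge)")
```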

Robots and poetry come alive at Black Imagination conference

On a humid, gray morning at Princeton, in a cubic glass pavilion in a robot arm–equipped garage, architect Mario Gooden sat on a stool silently while discordant sounds emanated from two televisions flanking him, their images barely visible under the sun streaming in through the translucent walls. We viewers sat on the benches wrapping the room. Gooden moved his stool and sat again. Finally, he began speaking. Reading from a black folder, he talked about space-time, general relativity, and black holes, and about the Black Panthers, being age 13, being American, cinema’s star-crossed lovers, the “image-city,” being in the wake, or being the wake.

So began the second day of Black Imagination Matters (BIM, so named to “scramble” the usual meaning of the acronym in architecture), a two-day conference organized by V. Mitch McEwen. It was the culmination of a month of workshops this past March and April that included prototyping fictive technologies from W.E.B. Du Bois’s recently discovered short story “The Princess Steel” as well as choreography workshops with drones. This past weekend's events showcased numerous architects, theorists, writers, and artists thinking about “architechnipoetics,” or the intersections between the ways we make our world in bricks and circuits and words and movement.

“You’re composing images with your body,” Gooden incanted over syncopated, at times dissonant sounds. Eventually he fell back into silence, though the soundtrack continued. At the end of Gooden’s silence, McEwen asked for our own. The sky cleared. Then came a slightly more usual panel with Jenn Nkiru, beth coleman, and Jerome Haferd. coleman began by asking one of the most difficult questions of all: “What would it be to be free?” Many others joined in during the conversation. In response to discourse on the spiritual and celestial, author Sharifa Rhodes-Pitts, who presented later and then spoke in a panel alongside artist Mario Moore, offered that the many ways of telling and documenting time in various African traditions are something that, to some extent, can be known, and that the archive of such traditions, no matter how troubled, perhaps offers some grounding.

The BIM Incubator was incredibly capacious for an event organized by an architecture school, bringing together poets, dancers, filmmakers, scholars, technologists, architects, and others who presented and celebrated inter- and antidisciplinary approaches to thinking about space, building, the future of the city, and the power of Blackness within it. Collaborative and open, BIM was modeled on Donna Haraway’s use of the notion of "sympoiesis," a process of collective making and knowledge production. Science fictions and science presents were blended throughout the event's discussions, most especially by Haferd, whose presentation on his projects in Harlem’s Marcus Garvey Park began with Ursula K. Le Guin; Samuel R. Delany, Octavia Butler, and Sun Ra all got namedropped throughout the day. Saturday’s events took place between Princeton’s Architecture Laboratory and the robot-outfitted Embodied Computation Lab.
Rhythm; the water; terrestrial freedom; celestial freedom; the archive; the body; time; telling time; telling times; (im)permanence; the traps and powers, the uses and uselessness of representation; visibility and its transgression (what is secrecy and sacredness in an era of mass surveillance and documentation? probed Nkiru); the meaning of “practice”; the problem of authenticity—these were all themes that were returned to throughout the day. The convening of so many people was itself a sort of architectural act, making a space through a day of ongoing interactions of speech, sound, images, and movement.

And there was so much movement, especially for a university workshop—not only in Gooden’s multimedia performance but also in poet Douglas Kearney’s listening workshop, during which he invited participants to stay as still, or move as much, as they felt to the music and sound he had created. Amina Blacksher next presented her double Dutch robots: two coordinated robot arms that, with the help of a human being, became semi-automated jump-rope turners. Blacksher went first, then people took turns trying to show off their skills. Despite their supposed “precision,” the robots have difficulty being as accurate, synchronized, and quick as young Black girls who jump rope, showing the incredible complexity of embodied and kinetic intelligence that is so often devalued and overlooked. Also working with robots, Lauren Vasey demonstrated the early stages of robots that use facial recognition to behave differently based on people's features, raising questions about the built-in algorithmic biases of new AI technologies. Then came an especially energetic three-person dance piece choreographed by Olivier Tarpaga titled WHEN BIRDS REFUSED TO FLY. All this motion suggested that perhaps architecture doesn’t stand, but rather, and more accurately, buildings balance.

The day ended with sunset as Kyp Malone played his guitar and sang, accompanied by projections he’d designed.

From research to practice: Catching up with Jenny Sabin

Being able to translate research findings into practical applications on a construction site is never a sure thing, but having a lab-to-studio pipeline definitely helps. For Jenny Sabin, that means a close integration between her lab at the Cornell College of Architecture, Art, and Planning (AAP) and her eponymous studio in Ithaca, New York. Sabin wears three hats: a teacher with a focus on emerging technologies at Cornell, principal investigator of Cornell’s Sabin Design Lab, and principal of Jenny Sabin Studio. The overlap between the lab and the studio means that Sabin has an incubator for fundamental research that can be refined and integrated into real-world projects.

When AN last toured the Sabin Design Lab, researchers were hard at work using robot arms for novel 3D printing solutions and looking to sunflowers for inspiration in designing the next generation of photovoltaics. That fundamental research has been realized in projects ranging from the ethereal canopy over MoMA PS1’s courtyard in 2017 to a refinement of the studio’s woven forms for a traveling Peroni pop-up. Rather than directly referencing nature in the biomimetic sense, Sabin’s projects instead draw inspiration from, and converge with, natural processes and forms. Here are a few examples of what Sabin, her team, and collaborators are working on.

PolyBrick

Brick and tile have been standardized construction materials for hundreds of years, but Sabin Design Lab’s PolyBrick pushes nonstandard ceramics into the future. The first iteration of PolyBrick imagined an interlocking, component-based “brick” that could twist, turn, and eliminate the need for mortar. PolyBrick 1.0 used additive 3D printing to create hollow, fired, and glazed ceramic blocks that could one day be low-cost brick alternatives enabling the creation of complex forms. PolyBrick 2.0 took the concept even further by emulating human bone growth, creating porous, curvilinear components that Sabin and her team of researchers and students hope to scale up to wall and pavilion size. PolyBrick 3.0 is even more advanced: the 3D-printed blocks contain microscopic divots and are glazed with DNA hydrogel, a polymer coating that can react to a variety of situations. Imagine a bioengineered facade glaze that can change color based on air pollution levels or temperature changes, or a component “stamped” with a unique DNA profile for easy supply chain tracking.

Responsive textiles

As Sabin notes, knitting is an ancient craft, but one that laid the foundation for the digital age; the punch cards used in early computers were originally designed for looms. As material requirements evolve, so too must the material itself, and Jenny Sabin Studio has been experimenting with lightweight, cellular structures woven into self-supporting forms. Sabin’s most famous such installations are gossamer canopies of digitally knit, tubular structures that absorb and store sunlight, then re-emit it at night to illuminate repurposed spool chairs. MoMA PS1’s Lumen for YAP 2017, House of Peroni’s Luster, and the 2016 Beauty—Cooper Hewitt Design Triennial installation PolyThread have all pushed textile science forward. As opposed to rigidly defined stonework or stalwart glass, woven architecture takes on ambiguous forms. As GSAPP’s Christoph Kumpusch pointed out in conversation with Sabin at the House of Peroni opening in NYC last October, these tensile canopies proudly display their boundary conditions instead of hiding them like more traditional forms.
The dangling, sometimes-expanded, sometimes-flaccid fabric cones extrude from the cells of the woven canopy and naturally delineate the programming of the area below. These stalactites create the feeling of wandering through a natural formation and encourage a playful, tactile exploration of the space.

Kirigami

Origami and kirigami (a form of paper folding that also involves cutting) are traditional practices that, like the other techniques mentioned above, have seen a modern resurgence in everything from solar sails to airbags. The Sabin Lab has taken an interest in kirigami, particularly its ability to expand two-dimensional representations into three-dimensional forms. The lab’s transdisciplinary research has blended material science, architecture, and electrical engineering to create rapidly deployable, responsive, and scalable architecture that can unpack at a moment’s notice. Two projects, ColorFolds and UniFolds, were made possible by funding from the National Science Foundation.

ColorFolds was realized as a canopy of tessellated “blossoms,” each made from polycarbonate panels covered in dichroic film. The modules open or close in response to the density of the crowd below, creating a shimmering exploration of structural color—3M’s dichroic film produces color by scattering and diffusing light through nanoscale structures rather than using pigments. Visitors below the ColorFolds installation were treated to chromatic, shifting displays of light as the flock-like piece rearranged itself.

UniFolds reimagined the Unisphere in Queens’s Flushing Meadows Corona Park as part of the Storefront for Art and Architecture show Souvenirs: New New York Icon, which asked architects and artists to produce objects inspired by New York City icons. The 140-foot-tall, 120-foot-diameter landmarked Unisphere was the centerpiece of the 1964 World’s Fair, and Sabin Design Lab’s UniFolds piece references the utopian aspirations of the sphere and of domed architecture more broadly. By using holes, folds, and strategic cuts, the Sabin Lab has envisioned a modular dome system that’s quick to unfold and can be replicated at any scale, part of the “Interact Locally, Fold Globally” methodology used to guide both kirigami projects.
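The control logic behind ColorFolds has not been published, but as a rough picture of what “open or close in response to the density of the crowd below” might mean computationally, a module controller could map an occupancy count to an aperture fraction. Everything in the toy sketch below—the capacity, the linear mapping, even the direction of the response—is an assumption for illustration.

```python
# Toy model of a crowd-responsive module like those in ColorFolds; the capacity,
# the linear mapping, and the direction of the response are invented for illustration.
def aperture(people_below: int, capacity: int = 30) -> float:
    """Return an opening fraction: an empty floor closes the module, a crowd opens it."""
    return max(0.0, min(1.0, people_below / capacity))

for crowd in (0, 5, 15, 45):
    print(f"{crowd:>2} people below -> module {aperture(crowd):.0%} open")
```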

Swiss researchers enlist the help of robots to build high-tech showhome

ETH Zürich’s high-tech showhome opened its doors this past week. The three-story DFAB HOUSE has been built on the NEST modular building platform, an Empa– and Eawag–led site of cutting-edge research and experimentation in architecture, engineering, and construction located in Dübendorf, Switzerland. The 2,150-square-foot house, a collaboration between university researchers and industry leaders, is designed to showcase robotics, 3-D printing, computational modeling, and other technologies and to grapple with the interconnected issues of ecology, economy, and architecture.

One of the central innovations is using robots that build on-site, rather than creating prefabricated pieces in a factory. This In Situ Fabricator (IF) technology, an autonomous “context-aware mobile construction robot,” helps minimize waste and maximize safety during the construction process. To generate concrete geometries not possible with conventional construction techniques, such as curvilinear shapes that minimize material use, researchers devised a Mesh Mould technology, built with the aid of vision system–equipped robots. The robots fabricated a curved steel rebar mesh that acts as both formwork and structural support; the mesh is then filled with concrete, which it contains and reinforces. In the DFAB HOUSE, the Mesh Mould is realized as a 39-foot wall, a main load-bearing component of the house, able to carry around 100 tons. Despite its complexity—335 layers with over 20,000 welding points—the robot took just 125 hours to construct the mesh.

https://youtu.be/ZeLEeY8yK2Y

Cantilevered over the Mesh Mould is the so-called Smart Slab, a concrete slab cast in 3-D printed formwork that supports the timber structure above. Many of the concrete forms in the home are built with what the researchers are calling Smart Dynamic Casting, an automated prefabrication technology. Robotic prefabrication is also used to make the Spatial Timber Assemblies that comprise the upper two levels of the home. The timber structure was devised in a collaboration between the university, Gramazio Kohler Research, and ERNE AG Holzbau, which used computational design to generate timber arrangements that fit into the larger structure. The timber assemblies also permit the creation of stiff structures that don’t require additional reinforcement. Applied onto the structure, the hyper-efficient facade is made of membranes of cables, translucent insulating Aerogel, and aluminum.

In addition to all the new technology that went into building the DFAB HOUSE, it will also be a “smart home,” using what the researchers are calling the “digitalSTROM platform,” which includes “intelligent, multi-stage burglar protection, automated glare and shading options, and the latest generation of networked, intelligent household appliances.” It also includes voice control for many of the home’s operations, from turning on a kettle to operating blinds. Energy management is also a centerpiece of the home, with rooftop photovoltaic panels featuring a smart control system. Additionally, heat exchangers in the shower trays recover the warmth of shower water, and hot water from the faucets is fed back into the boiler when it’s not in use. Not only does this conserve energy and water, it also prevents bacterial growth in the pipes.
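Returning to the Mesh Mould figures cited above—335 layers, more than 20,000 welding points, 125 hours of robot time—a quick back-of-the-envelope division gives a sense of the robot's pace; the arithmetic below uses only the published numbers.

```python
# Back-of-the-envelope check on the published Mesh Mould figures.
layers, welds, hours = 335, 20_000, 125
print(f"~{welds / hours:.0f} welds per hour")          # about 160 welds per hour
print(f"~{welds / layers:.0f} welds per layer")        # about 60 welds per layer
print(f"~{hours * 60 / layers:.0f} minutes per layer") # about 22 minutes per layer
```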
The radical use of technology in the DFAB HOUSE is also about optimization and efficiency: the home, with all its undulating formwork and translucent geometries, has been designed to demonstrate how new technology can develop an aesthetic language of its own and make truly pleasing, compelling spaces. It will also be put to the test: soon, academic guests will move in and give life in the DFAB HOUSE a try. For those who can’t make it to Switzerland, the project will also be presented at Swissnex in San Francisco.

Seoul's Robot Science Museum will be its own first exhibition

The soon-to-be-built Robot Science Museum in Seoul, South Korea, will be a robotics exhibition in itself. The museum, designed by Turkish firm Melike Altınışık Architects (MAA), will be built by robots when construction begins next year. In this way, the construction of the building will be the museum’s “first exhibition,” according to principal Melike Altınışık, whose firm is already known for distinctively sci-fi buildings like the 882-foot Küçük Çamlıca TV and Radio Tower, currently under construction in Istanbul.

The ovoid form of the museum, which will display a range of technologies including artificial intelligence, virtual reality, augmented reality, and holograms, is designed to create a set of “non-directional” relationships, both within the interiors and with the public space and street that surround it. The intent is to shift relationships for foot and vehicle traffic and create a more ambiguous flow between inside and out. The entire architectural and visual language of the museum is intended to showcase its mission of educating the public on science and technology by using cutting-edge materials and high-tech fabrication techniques, including robotic construction.

While the specifics of the robot technology to be used will be announced later this spring, the current plan is to use one “team” of robots to construct the curved metal facade, completing all steps from shaping to assembly to welding and polishing. An additional team of robots will 3-D print concrete, primarily for the spaces surrounding the museum. Both will be directed by BIM systems and will help the building itself “manifest robots, science, technology, and innovation,” according to MAA. The museum was commissioned by the Seoul Metropolitan Government and will operate as part of the Seoul Metropolitan Museums, with plans to open in full in 2022. It will form part of the Changbai New Economic Center, a new cultural center in Seoul’s Chang-dong neighborhood.

Australian company aims to build 10 homes this year with autonomous robotic arm

Buzz around robotics in architecture has been building steadily for some time now, though it’s only in the last few years that the technology has seen much real-world action. Now, robotic construction technology is seemingly one step closer to the commercial market, as Australian company FBR has unveiled plans to bring its robotic bricklaying arm, Hadrian X, out of the testing facility and into the real world.

Earlier versions of the Hadrian, which shares its name with the Roman emperor perhaps most famous for the wall that stretched across what is now northern England, had successfully created buildings in closed environments as early as 2015. This past November, the latest version built a nearly 2,000-square-foot, three-bed, two-bath home in a lab in under three days. After this success, FBR is taking the robot out into the wild, with plans to build ten homes this year.

Billed as “the world’s only fully automated, end-to-end bricklaying solution,” the 100-foot, truck-mounted arm is able to lay as many as 1,000 bricks in a single hour without changing position. It interprets CAD files, calculating the required materials on its own, before setting out to make the digital plans a reality. According to a blog post from January 11, FBR has begun preparatory work—including adding sensing equipment for weather conditions and the like—for the Hadrian X’s first outdoor build, a three-bedroom home that will be even more complex than the structure built in the last indoor test. Rather than aiming for speed, FBR sees this first build as a chance to “gather as much intel as possible” in order to make any necessary engineering adjustments and to prepare to launch its autonomous bricklayer for commercial construction use—a business in which the company says there is an “immense” demand for automation. Hoping to make building safer, faster, and less wasteful, CEO Mike Pivac believes there is “massive potential for [autonomous building] technology...to shape the way the construction industry operates in the future.”
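FBR has not described how the Hadrian X parses CAD files, but the material-takeoff step—estimating unit counts and build time from wall geometry—can be sketched in a few lines. In the example below, only the 1,000-units-per-hour laying rate comes from the article; the face area per unit and the sample walls are assumptions for illustration.

```python
# Rough material-takeoff sketch in the spirit of the Hadrian X's CAD interpretation.
# Only the 1,000-units-per-hour rate comes from the article; the face area per unit
# and the sample walls are assumptions for illustration.
UNIT_FACE_SQFT = 0.5       # assumed face area per brick/block laid
LAY_RATE_PER_HOUR = 1_000  # FBR's stated maximum laying rate

def units_needed(walls):
    """walls: list of (length_ft, height_ft) pairs; returns total units to lay."""
    return sum(round(length * height / UNIT_FACE_SQFT) for length, height in walls)

walls = [(40, 9), (40, 9), (30, 9), (30, 9)]  # a simple rectangular plan
total = units_needed(walls)
print(f"{total} units, ~{total / LAY_RATE_PER_HOUR:.1f} hours at the peak laying rate")
```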

A Tesla struck and "killed" a robot at CES—or did it?

It’s either a documented case of robot-on-robot violence or an elaborate self-perpetuating hoax. At the January 7 opening of the 2019 Consumer Electronics Show (CES) in Las Vegas, a Tesla in “self-driving mode” struck a Russian Promobot, and the event was captured on video. Or did it? The story seemed too good to be true, and it touched a nerve over fears that autonomous vehicles could be dangerous (see the case of Uber’s Arizona test car that got into a fatal crash last March).

In the video, a Tesla Model S can be seen cruising by a robot standing curbside, at which point the Promobot falls over and its arm falls off. The robot’s manufacturer, also called Promobot, posted footage of the incident to Twitter, tagged Elon Musk, and “Promobot was killed by a self-driving Tesla car” racked up over a million views. Promobot claims that its robot was damaged beyond repair and that it would be filing a police report.

How did the robot manage to “run off” to the far side of the road without anyone noticing? How did Promobot seem to know that the Tesla was in self-driving mode? Why was the scene being filmed in the first place? The company has thus far been unable to provide answers, and tech writers and Twitter users were quick to point out the inconsistencies in Promobot’s story. Tesla’s cars, while equipped with an “Autopilot” mode that assists drivers on highways, lack a fully autonomous self-driving mode. When the driver, George Caldera, was asked for comment by the Daily Mail, he allegedly told the British tabloid that he had shifted to the passenger seat and handed over control to the vehicle: “I switched this Tesla into a self-driving mode and it started to move. And wow! A robot on the track! I thought the flivver would come round, but it bumped straightly into it! I am so sorry, the robot looks cute. And my sincere apologies to the engineers.” Beyond the strange quote, a rope can be seen on the far side of the road near the robot, and the Promobot appears to start falling over slightly before the car passes it.

Robots and self-driving cars have captured the public’s imagination, but confusion over what each can actually do has at times amplified public unease. The robots deployed to ward off homeless people in San Francisco and Waymo’s self-driving cars in Arizona, for instance, have both elicited visceral responses from the public. The integration of artificial intelligence into the urban fabric has a long and bumpy road ahead.

Bionic construction workers may enter job sites by 2020

Within the next two to three years, wearable technology products may become ubiquitous features of the construction job site, increasing worker safety and productivity. Sarcos Robotics, an American robotics company that specializes in creating mechanical devices for military and public-safety purposes, has unveiled a robotic exoskeleton design that allows its operator to carry up to 200 pounds for prolonged periods of time. According to BIM+, the full-body robosuit, formally known as the Guardian XO Max, took 17 years and $175 million to research and develop. It is expected to be commercially available by 2020 and could make a construction site look like something out of a sci-fi film.

The exoskeleton was made to reduce strain on the muscles and joints of construction workers engaged in heavy lifting while giving them a dash of superhuman strength. With a strength amplification of 20 to 1, the suit makes a 100-pound steel beam feel more like a 5-pound weight, noted BIM+. The suit’s entire weight, as well as the weight of the objects being lifted, is transferred through its bionic arms and legs to the ground below. To further ease the movement of the suit so that the operator can flexibly bend, twist, and lift, Sarcos integrated it with a complex network of sensors, enabling the wearer to instinctively control the robot in accordance with their natural reflexes, reducing the need for instruction and training.
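The 20-to-1 amplification figure works out as simple division, consistent with the article's 100-pound example and the suit's stated 200-pound capacity:

```python
# Perceived load under the Guardian XO Max's stated 20:1 strength amplification.
AMPLIFICATION = 20
for actual_lb in (100, 200):  # 200 lb is the suit's stated carrying capacity
    print(f"a {actual_lb} lb load feels like ~{actual_lb / AMPLIFICATION:.0f} lb to the wearer")
```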

Suspended structure will house research into space-exploring Japanese robots

New York–based firm Clouds Architecture Office has designed a suspended research facility for AVATAR X, a partnership between ANA Holdings Inc. and the Japan Aerospace Exploration Agency (JAXA), the Japanese space agency, to develop space-exploring robots. The levitating building will be at the center of the AVATAR X Lab Oita campus, which will host office and laboratory space for the various tech companies invited to participate in the partnership's research, along with a lunar-like landscape for testing remotely operated vehicles. AVATAR X is focused on developing avatars—specialized robots that humans can direct and manipulate from a remote location—thereby obviating the need for humans to go to space themselves. The floating lab structure will be suspended nearly 60 feet above the bottom of an artificial crater at the center of the campus. A series of other buildings will complete the campus in Oita Prefecture, Kyushu, Japan.