Posts tagged with "Virtual Reality":

Unity creates new open source tool just for architects with Reflect

Video game software suites like Unreal Engine and Unity have made their way into the architectural arsenal, with AEC firms like Skanska, Foster + Partners, and Zaha Hadid Architects using them to visualize and test new buildings. However, these tools weren’t necessarily built with AEC professionals in mind, and while they often result in nice-looking environments, they don’t generally offer much in the way of architecture-specific functionality of the kind architectural designers have come to rely upon in BIM and CAD software. To help bridge this gap, the company behind Unity is testing a new piece of software called Reflect. “Unity Pro is a super powerful tool that people use for creating design walkthroughs and custom application development,” said Tim McDonough, vice president at Unity, “but these firms have a whole bunch of people that would like to be able to view their Revit data easily in a 3D engine like Unity without having to be a software developer, which is what our current tools are built for.” Reflect, which will launch publicly this fall, connects with existing software suites like Revit and Trimble products to leverage the vast amounts of data that designers and contractors rely upon, and uses it to create new visualizations, simulations, and AR and VR experiences. Users can view and collaborate across BIM software and Reflect, which are synchronized in real time across multiple devices, both desktop and mobile. “Users were saying it took them weeks to get data out of Revit into Unity, and by the time they got it out, the project had moved on and what was done was irrelevant,” said McDonough. “We’ve taken out the drudgery so that now what used to take weeks takes just minutes.”

https://youtu.be/YnwcGfr0Uk0

A number of firms have already been putting Reflect to the test. Reflect is open source and allows users to develop their own applications, whether for use in their firm or for a broader architectural public.
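Reflect's internals aren't described in this account, but the "weeks to minutes" claim comes down to incremental synchronization: pushing only the BIM elements that changed rather than re-exporting the whole model. Below is a minimal Python sketch of that idea; all names (`BimElement`, `diff_sync`) are hypothetical and are not part of any Unity or Revit API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BimElement:
    """A hypothetical BIM element as it might arrive from Revit."""
    element_id: str
    geometry_hash: str   # digest of the element's geometry
    metadata: tuple      # e.g. (("material", "glass"), ("level", "L42"))

def diff_sync(previous: dict, current: dict) -> dict:
    """Compute the minimal change set between two snapshots of a model.

    Instead of re-exporting the whole model (the 'weeks' workflow),
    only elements that were added, removed, or changed are pushed to
    the real-time viewer (the 'minutes' workflow).
    """
    added   = [eid for eid in current if eid not in previous]
    removed = [eid for eid in previous if eid not in current]
    changed = [eid for eid in current
               if eid in previous and current[eid] != previous[eid]]
    return {"added": added, "removed": removed, "changed": changed}

# Two snapshots of a tiny model: one panel edited, one panel added.
v1 = {"wall-1": BimElement("wall-1", "aaa", (("material", "concrete"),)),
      "panel-7": BimElement("panel-7", "bbb", (("material", "glass"),))}
v2 = {"wall-1": BimElement("wall-1", "aaa", (("material", "concrete"),)),
      "panel-7": BimElement("panel-7", "ccc", (("material", "glass"),)),
      "panel-8": BimElement("panel-8", "ddd", (("material", "glass"),))}

delta = diff_sync(v1, v2)
```

Hashing geometry lets the sync skip untouched elements entirely, which is what makes continuous, multi-device updates feasible for large models.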
SHoP Architects has been trying out Reflect since the software entered its alpha phase this summer, creating various solutions to test on its supertall project at 9 DeKalb Avenue in Brooklyn. Adam Chernick, an associate at SHoP focusing on AR and VR research, noted that while showing off buildings in software like Unity has become part of standard practice, getting those visualizations attached to critical information has been a challenge up until now. “It hasn’t been super difficult to get the geometry into the game engines,” he said, “but what has been even more difficult is getting that data into the game engines.” One of the first uses for Reflect that the SHoP team devised was an AR application that allowed them to monitor the progress of 9 DeKalb and easily oversee construction sequencing using color-coded panels that map onto the building’s model in their office. Chernick explained that there was a huge number of exterior window panels to keep track of and that the app really helped. “We wanted to be able to visualize where we are in the construction process from anywhere—whether in VR or AR, and be able to get a live update of its status,” he said. “Now we can watch the building being constructed in real time.” The SHoP team has also leveraged the power of Reflect—and its integration with Unity—to create new visualization tools for acoustic modeling. “We created an immersive acoustic simulator where you get to see how a sound wave expands through space, reflects off of walls, and interacts with geometry,” said Christopher Morse, an associate for interactive visualization at SHoP. “You can slow it down, you can pause it, and you can stop it.” The idea, he explained, is to help architects make acoustic decisions earlier in the design process. “Currently a lot of those acoustic decisions come later and most of the geometry is already decided,” Morse said, noting that at a certain point, all designers can really do is add carpeting or acoustic tiling.
“But we want to use these tools earlier, and in order for that to actually work, we needed to enable an iterative feedback loop so that you can create a design, analyze and evaluate it, and then make changes based on your analysis.” With Reflect, there is also no more grueling import and export process, which Morse said had kept designers from incorporating such tools into their workflows at all. “Once we had Reflect, we integrated it into our existing acoustic visualization software in order to make that round trip quicker so that people can put on the headset, make a change in Revit, and instantly reevaluate based on those changes.” There is also metadata attached to the geometry, such as material information. While 9 DeKalb is too far along in its construction to incorporate the new software heavily into the design, SHoP has begun testing out its acoustic modeling app in the lobby of the project.

https://youtu.be/f0IA55N_99o

Reflect could also put BIM data in a more user-friendly package for more of the people working on building projects. “We think that BIM is so valuable, but not enough people get to use it,” said McDonough. “We were trying to figure out how to get BIM in the hands of people on a construction site, so everyone can see all that information at a human scale.” At SHoP, this means creating apps that contractors can use on the job. Currently, their AR apps work on mobile devices, but SHoP hopes that, as AR headsets become more mainstream, they’ll also be able to use the apps on products such as the HoloLens. “This could be a paradigm shift,” said Chernick. “We realize that this massive, thousand-sheet set of construction documents that we need to create in order to get a building built is not going anywhere soon. But what we can do is help make this process more efficient and help our construction teams understand and potentially build these projects in more efficient ways.”
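Morse's description of the acoustic simulator — a sound wave expanding through space and reflecting off walls — maps onto a standard room-acoustics technique, the image-source method: mirroring the source across each wall yields a virtual source whose straight-line distance to the listener equals the reflected path length. A rough Python sketch of the first-order case in a rectangular room, under that assumption (this is not SHoP's actual implementation):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def first_order_arrivals(room, source, listener):
    """First-order image-source model for a rectangular room (2-D plan).

    Mirroring the source across each of the four walls gives a virtual
    source; its straight-line distance to the listener is the reflected
    path length, and dividing by the speed of sound gives the echo's
    arrival time. Returns all five arrival times, earliest first.
    """
    w, h = room
    sx, sy = source
    images = [(sx, sy),                      # direct path
              (-sx, sy), (2 * w - sx, sy),   # reflections off x = 0, x = w
              (sx, -sy), (sx, 2 * h - sy)]   # reflections off y = 0, y = h
    lx, ly = listener
    return sorted(math.hypot(lx - ix, ly - iy) / SPEED_OF_SOUND
                  for ix, iy in images)

# A 10 m x 8 m room with source and listener 4 m apart.
times = first_order_arrivals((10.0, 8.0), (3.0, 4.0), (7.0, 4.0))
```

Higher-order reflections work the same way (mirror the images again), which is why such simulators can animate a wavefront bouncing repeatedly through the geometry.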

Aesthetics of Prosthetics compares computer-enhanced design practices

How have contemporary architecture and culture been shaped by our access to digital tools, technologies, and computational devices? This was the central question of Aesthetics of Prosthetics, the Pratt Institute Department of Architecture’s first alumni-run exhibition, curated by recent alumni and current students Ceren Arslan, Alican Taylan, Can Imamoglu, and Irmak Ciftci. The exhibition, which closed last week, took place at Siegel Gallery in Brooklyn. The curatorial team staged an open call for submissions that addressed the ubiquity of “prosthetic intelligence” in how we interact with and design the built environment. “We define prosthetic intelligence as any device or tool that enhances our mental environment as opposed to our physical environment,” read the curatorial statement. “Here is the simplest everyday example: When at a restaurant with friends, you reach out to your smartphone to do an online search for a reference to further the conversation, you use prosthetic intelligence.” As none of the works shown have actually been built, the pieces experimented with the possibilities for representation and fabrication that “prosthetic intelligence” allows. The selected submissions used a range of technologies and methods, including photography, digital collage, AI, digital modeling, and virtual reality. The abundant access to data and its role in shaping architecture and aesthetics was a pervasive theme among the show’s participants. Ceren Arslan’s Los Angeles, for instance, used photo collage and editing to compile internet-sourced images into an imaginary yet believable streetscape. Others speculated about data visualization at a time when drawings are increasingly expected to be read not only by humans but by machines and AI, as in Brandon Wetzel’s deep data drawing.

"The work shown at the exhibition, rather than serving as a speculative criticism pointing out towards a techno-fetishist paradigm, tries to act as recording device to capture a moment in architectural discourse. Both the excitement and skepticism around the presented methodologies are due to the fact that they are yet to come to fruition as built projects," said the curators in a statement. 

Morpholio brings Board software to the desktop with expanded pro features and VR

Morpholio, the architect-turned-developer-run company known for its Trace app, which blends augmented reality, digital hand drafting, and other architectural tools on portable devices, has brought its interior design program, Board, to the desktop for the first time. Coming on the heels of the new macOS Catalina update, the desktop version of Board leverages Apple’s new Mac Catalyst developer tool, which simplifies translating iOS apps to the desktop. Board, which applies a mood-board logic to technical interior design problems, has been designed not only for professionals but also to make home design easier for average consumers. That said, with Board for Mac, Morpholio hopes to “take advantage of the unique properties of the desktop environment,” said Morpholio co-founder Mark Collins in a press release from the company, “which is essential for professional work.” The desktop app will include mood board “super tools,” such as layer control and magic wand selection and deletion, as well as a feature called “Ava,” which creates spec sheets for clients and contractors. Ava gives automatic suggestions to match colors and forms, and offers libraries of products from larger companies like Herman Miller and Knoll and smaller designers like Eskayel. The app will also include new export features and provide further compatibility with Adobe and Autodesk products (as well as Pinterest). In addition, while Board for mobile already has AR features that allow furniture to be placed in space at scale, the desktop version will allow for VR integration. “A typical furniture catalog would rely on still images,” said Morpholio co-founder Anna Kenoff, “but Board allows you to experience expertly rendered models, created by the storytellers at Theia Interactive. You can view and spin around your favorite furniture pieces and experience them in every dimension. You can zoom in to stitching and materiality and feel the shade and shadows on their forms.” Additional viewing and presentation features will be built in as well, and Board will take full advantage of Catalina’s updated Dark Mode for those who prefer to use it. “When Apple released Mac Catalyst, they definitely had creative professionals in mind,” said Kenoff of the recent Apple release. “They wanted to amplify the power of mobile apps by combining them with the precision capable on a Mac. Few architects and designers work exclusively on a laptop, desktop, or tablet. We hope to make our apps available wherever designers are working.”

You can paint unbuilt Michael Graves projects in VR

The late Michael Graves has seen his previously unbuilt work finally realized in a virtual reality environment thanks to Imagined Landscapes. The interactive sightseeing experience was created by Kilograph, a Los Angeles creative studio that has worked with firms like Gensler and Zaha Hadid Architects. Based on Graves’s painted plans for Barranco de Veneguera, an unbuilt Canary Islands resort originally planned in 1999, Imagined Landscapes allows users to go on an impressionistic, watercolor-esque romp through the resort, and through the act of drawing itself. Each area of the resort begins as just an outline, which users can fill in with a virtual paintbrush. “The more you let people use their hands, the more connected they’ll feel to the world around them,” said Runze Zhang, a Kilograph VR designer, in a press release. Barranco de Veneguera was meant to be a sprawling resort for 12,000 people running down a three-and-a-half-mile valley all the way to the ocean. Graves imagined two greywater-irrigated golf courses as a green ribbon across the valley, a dense “town center” on the coast, and terraced hotels, all made to reflect and use the region’s topography; however, the resort never came to fruition. For the most part, Imagined Landscapes was developed in Unreal Engine 4, but the watercolor effect is a proprietary development by Kilograph. To get a natural look, the team layered elements like displacement maps, world position information, and post-processing effects, creating a visual that mirrored Graves’s colors and style. Gesture controls were then created using Leap Motion, a hand-tracking hardware sensor, to produce an experience tailored to our natural instincts around movement and painting and to make the interaction feel more authentic. You can download Imagined Landscapes directly from Kilograph’s site or try it out October 2nd at the WUHO Gallery at Woodbury University in Hollywood, California.

Apple and New Museum team up for choreographed urban AR art tours

New York's New Museum, which has already launched its fair share of tech-forward initiatives, like the net-art preservation and theorization platform Rhizome and the incubator NEW INC, has teamed up with Apple over the past year and a half to create a new augmented reality (AR) program called [AR]T. New Museum director Lisa Phillips and artistic director Massimiliano Gioni selected the artists Nick Cave, Nathalie Djurberg and Hans Berg, Cao Fei, John Giorno, Carsten Höller, and Pipilotti Rist to create new installations that display the artistic potential of AR and help advance the museum’s own mixed-reality strategy. Each artist will create interactive AR artworks that can be viewed via iPhone with the [AR]T app on “choreographed” street tours beginning at a limited number of Apple stores across six cities. Users will be able to capture the mixed-reality installations in photos and video through their phones. Additionally, Nick Cave has created an AR installation titled Amass that can be viewed in any Apple store, and the company has worked with artist and educator Sarah Rothberg to develop programs that initiate beginners into creating their own AR experiences. The announcement comes on the heels of much industry speculation about Apple’s AR and VR plans, encouraged in part by recent hires from the gaming industry, like that of Xbox co-creator Nat Brown, previously a VR engineer at Valve. While some artists, institutions, and architects have embraced AR and VR, many remain skeptical of the technology, and not just on artistic grounds. Writing in the Observer, journalist Helen Holmes wonders whether “Apple wants the public to engage with their augmented reality lab because they want to learn as much about their consumers as possible, including and especially how we express ourselves creatively when given new tools.” The [AR]T app will drop on August 10th in the following cities: New York, San Francisco, London, Paris, Hong Kong, and Tokyo.

How can new technologies make construction safer?

Construction remains one of the most dangerous lines of work in the United States. To stop accidents before they happen, construction companies are turning to emerging technologies to improve workplace safety, including virtual reality, drone photography, IoT-connected tools, and machine learning. That said, some solutions come with the looming specter of workplace surveillance in the name of safety, with all of the Black Mirror-esque possibilities that implies. The Boston-based construction company Suffolk has turned to artificial intelligence to try to make construction safer. Suffolk has been collaborating with the computer vision company Smartvid.io to create a digital watchdog of sorts that uses a deep-learning algorithm and workplace images to flag dangerous situations and workers engaging in hazardous behavior, like failing to wear safety equipment or working too close to machinery. Suffolk has even managed to get some of its smaller competitors to join it in data sharing, a mutually beneficial arrangement since machine learning systems require vast amounts of example data, which is harder for smaller operations to gather. Suffolk hopes to use this decade’s worth of aggregated information, along with scheduling data, reports, and readings from IoT sensors, to create predictive algorithms that will help prevent injuries and accidents before they happen and increase productivity. Newer startups are also entering the AEC AI fray, including three supported by URBAN-X. The bi-coastal Versatile Natures bills itself as the "world's first onsite data-provider," aiming to transform construction sites with sensors that let managers make decisions proactively. Buildstream is embedding sensors in equipment and construction machinery to make them communicative, and Contextere, focusing on people instead, claims that its use of the IoT will connect different members of the workforce.
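The article doesn't detail Smartvid.io's algorithm, but the flagging step it describes — spotting workers without safety equipment in job-site images — can be illustrated with a toy rule on detector output: flag any detected person whose head region overlaps no detected hardhat. A speculative Python sketch with made-up bounding boxes (not the actual Smartvid.io pipeline):

```python
def boxes_overlap(a, b):
    """Axis-aligned overlap test; boxes are (x1, y1, x2, y2)."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def flag_missing_hardhats(people, hardhats):
    """Flag each detected person whose head region overlaps no hardhat box.

    The head region is taken as the top quarter of the person box
    (y grows downward, as in image coordinates). A real system would
    get these boxes from an object detector and apply confidence
    thresholds; this sketch only shows the association step.
    """
    flagged = []
    for pid, (x1, y1, x2, y2) in people.items():
        head = (x1, y1, x2, y1 + (y2 - y1) / 4)
        if not any(boxes_overlap(head, hh) for hh in hardhats):
            flagged.append(pid)
    return flagged

# Hypothetical detections from a single job-site frame.
people = {"worker-1": (10, 20, 50, 120),   # hardhat detected overhead
          "worker-2": (60, 25, 100, 130)}  # no hardhat detected
hardhats = [(15, 10, 45, 40)]
alerts = flag_missing_hardhats(people, hardhats)
```

The hard part in practice is the detection model itself, which is why the shared training data Suffolk is aggregating matters so much.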
The Florida-based firm Haskell, rather than just using surveillance on the job site, is addressing the problem before construction workers even get into the field. While videos and quizzes are one way to train employees, Haskell saw the potential for interactive technologies to really boost employee training in a safe context, using virtual reality. In its search for VR systems that might suit its needs, Haskell discovered that no existing solutions were well suited to the particulars of construction. Along with its venture capital spinoff, Dysruptek, the firm partnered with software engineering and game design students at Kennesaw State University in Georgia to develop the Hazard Elimination/Risk Oversight program, or HERO, relying on software like Revit and Unity. The video game-like program places users in a job site, derived from images taken by drone and 360-degree cameras at a Florida wastewater treatment plant that Haskell built, and evaluates a trainee’s performance and ability to follow safety protocols in an ever-changing environment. At Skanska USA, where 360-degree photography, laser scanning, drones, and even virtual reality are becoming increasingly commonplace, employees are realizing the potential of these new technologies not just for improved efficiency and accuracy in design and construction, but for overall job site safety. Albert Zulps, Skanska’s regional director of virtual design and construction, said that the tech goes beyond BIM and design uses and actively helps avoid accidents. “Having models and being able to plan virtually and communicate is really important,” Zulps explained, noting that while BIM models are now pretty much universally trusted in the AEC industries, capture technologies are making those models even more accurate—tying them not just to predictions, but to the realities of the site. “For safety, you can use those models to really clearly plan your daily tasks. You build virtually before you actually build, and then foresee some of the things you might not have if you didn't have that luxury.” Like Suffolk, Skanska has partnered with Smartvid.io to help it process data. As technology continues to evolve, the ever-growing construction industry will hopefully become not just more cost-efficient, but safer overall.
Architect creates app to change how exhibitions are designed

For all the advances in technology over the past decade, the experience of curating and viewing museum shows has remained relatively unchanged. Even though digital archive systems exist and have certainly helped bring old institutions into the present, they have relatively little influence over the ways museum shows are designed and shared. The normal practice is more or less “old school” and even borderline “dysfunctional,” said Bika Rebek, principal of the New York– and Vienna-based firm Some Place Studio. In fact, a survey she conducted early on found that many of the software suites museum professionals were using were major time sinks: fifty percent of respondents said they felt they were “wasting time” trying to fill in data or prepare presentations for design teams. To Rebek, this is very much an architectural problem, or at least a problem architects can solve. Over the past two years, supported by NEW INC and the Knight Foundation, she has been developing Tools for Show, an interactive web-based application for designing and exploring exhibitions at various scales—from the level of a vitrine to a multi-floor museum. Leveraging her experience as an architect, 3D graphics expert, and exhibition designer (she has worked on major shows for the Met and the Met Breuer, including the OMA-led design for the 2016 Costume Institute exhibition Manus x Machina), Rebek set out to enable exhibition designers and curators to collaborate, and to empower new ways of engaging with cultural material for users anywhere. Currently, institutions use many different gallery tools, she explained, which don’t necessarily interact and don’t usually let curators think spatially in a straightforward way.
Tools for Show allows users to import all sorts of information and metadata from existing collection management software (or enter it anew); the data is attached to artworks stored in a library, from which they can be dragged and dropped into a 3D environment at scale. Paintings and simple 3D shapes are automatically generated, though, for more complex forms where an image projected onto a form of a similar footprint isn’t enough, users can create their own models. For example, to reproduce the New Museum’s 2017 show Trigger: Gender as a Tool and a Weapon, Rebek rendered the space and included many of the basic furnishings unique to the museum. For other projects, like a test case with the Louvre’s sculptures, she found free-to-use models and 3D scans online. Users can drag these objects across the 3D environment and access in-depth information about them with a click. With quick visual results and Google Docs–style automatic updates for collaboration, Tools for Show could help replace not just cumbersome content management systems, but endless email threads too. Rebek sees Tools for Show as having many potential uses. It can be used to produce shows, allowing curators to collaboratively and easily design and redesign their exhibitions, and, after a show comes down, it can serve as an archive. It can also be its own presentation system—not only allowing “visitors” from across the globe to see shows they might otherwise be unable to see, but also creating new interactive exhibitions or even just vitrines, something she has been testing out with Miami’s Vizcaya Museum and Gardens. More than just making work easier for curators and designers, Tools for Show could give a degree of curatorial power and play over to a broader audience. “[Tools for Show] could give all people the ability to curate their own show without any technical knowledge,” she explained. And, after all, you can’t move around archival materials IRL, so why not on an iPad?
While some of the curator-focused features of Tools for Show are still in testing, institutions can already request display tools like those shown at Vizcaya. Rebek, a faculty member at Columbia University’s Graduate School of Architecture, Planning and Preservation, has also worked with students to pair Tools for Show with photogrammetry techniques in an effort to develop new display methods for otherwise inaccessible parts of the Intrepid Sea, Air & Space Museum, a naval and aerospace history museum housed in a decommissioned aircraft carrier floating in the Hudson River. At a recent critique, museum curators were invited to see the students’ proposals and explore spatial visualizations of the museum through interactive 3D models, AR, VR, and in-browser and mobile tools that included all sorts of additional media and information.
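The workflow Rebek describes — metadata imported from collection management software, attached to artworks in a library, placed in a 3D scene, and surfaced on click — suggests a simple underlying data model. A speculative Python sketch of that model follows; the names (`Artwork`, `Scene`, `inspect`) are illustrative only, not Tools for Show's actual code.

```python
from dataclasses import dataclass

@dataclass
class Artwork:
    """A library entry as it might be imported from collection software."""
    accession: str
    title: str
    width_cm: float
    height_cm: float
    metadata: dict

class Scene:
    """A gallery model into which artworks are dropped at real scale."""
    def __init__(self):
        self.placements = {}  # accession number -> (artwork, position)

    def place(self, artwork: Artwork, position):
        """Drop an artwork into the 3-D environment (position in meters)."""
        self.placements[artwork.accession] = (artwork, position)

    def inspect(self, accession):
        """What a click on a placed object would surface for the viewer."""
        artwork, position = self.placements[accession]
        return {"title": artwork.title, "position": position,
                **artwork.metadata}

lobby = Scene()
lobby.place(Artwork("2017.001", "Untitled", 120.0, 90.0,
                    {"artist": "Anonymous", "year": 2017}),
            position=(2.5, 1.6, 0.0))
info = lobby.inspect("2017.001")
```

Keeping placement separate from the artwork record is what lets the same library serve a live show, a later archive, and a remote presentation.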

Open Call: R+D for the Built Environment Design Fellowship

R+D for the Built Environment is sponsoring a six-month, paid, off-site design fellowship program starting this summer. We're looking for four candidates in key R+D topic areas:
  1. Building material science
  2. 3D printing, robotics, AR/VR
  3. AI, machine learning, analytics, building intelligence
  4. Quality housing at a lower cost
  5. Building resiliency and sustainability
  6. Workplace optimization
  7. Adaptable environments
We're excited to support up-and-coming designers, engineers, and researchers (and all the disciplines in between!) as they advance their work, and to provide them with a platform to share their ideas. Follow the link below for more details and instructions on how to apply. Applications are due by May 31, 2019. https://sites.google.com/view/rdbe-design-fellowship-2019/home
Skanska puts 360-degree photography to work on New York job sites

Three-sixty-degree photography on construction sites is sort of like Google Street View at a smaller scale—a worker walks through a job site with a monopod, or sometimes a helmet-mounted set of cameras, and captures the sights and sounds from all angles. The technology has become a boon for Skanska, especially on projects like the Moynihan Train Hall and LaGuardia Terminal B in New York. “The resolution is just phenomenal,” said Tony Colonna, senior vice president of innovative construction solutions at Skanska, of the new photography techniques, which increasingly can be done with off-the-shelf consumer products. “You can basically take anyone on a walkthrough without being at the site.” The 360-degree video is almost like being there, he reported. “You're in complete control. You can stop, look around, look up, look down. So you're not limited, let's say, with traditional photographs or traditional video to just see maybe where the camera was pointing. With the 360 you have complete flexibility.” It has helped teams collaborate more fluidly and accurately across cities. “We might run into some sort of challenge on a site, and hey, you know what, the expert's at the other side of the country,” Colonna explained. “You can bring them onto the site. We give them this kind of experience and have that engagement to help solve a problem.” “These photographs are game-changing,” said Albert Zulps, regional director of virtual design and construction at Skanska. “You capture that space and then later you can actually look at versions of those photographs, go back in time, peel back the sheetrock and go into the wall.” Three-sixty-degree photography can also offer tremendous time savings and improve worksite safety, he said. The photos integrate well with other tech, including software like StructionSite and HoloBuilder, as well as mobile apps that allow people to locate themselves within a floor plan while taking a 360-degree photograph.
In addition, it plays well with other emerging technologies Skanska is using, including models generated from 3D laser scans, VR headsets, and tech for making mixed reality environments. “What we've started to do is take that footage, and take those pictures, and you overlay them with the model,” said Colonna. “If you really want to think about how everything ties together, it is all about collaboration,” Colonna said. “When you look at the construction industry, you're trying to effectively manage a lot of different entities, from the design team, to the owner, to the builder, to all the contractors. What Skanska is doing as a construction manager is finding new ways to collaborate with all those teams. It's really about, how do we use more visual technology to help us work better together?”
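What makes a single 360-degree capture browsable in any direction is its projection: an equirectangular image maps pixel position linearly to viewing angle, so the viewer can recover a yaw and pitch for any pixel. A small Python illustration of that mapping (the specific resolution is just an example):

```python
def pixel_to_direction(px, py, width, height):
    """Map an equirectangular-photo pixel to a viewing direction.

    A 360-degree photo stores the full sphere: the x axis spans
    -180..180 degrees of yaw (left-right), and the y axis spans
    90..-90 degrees of pitch (up-down), both varying linearly.
    """
    yaw = (px / width) * 360.0 - 180.0
    pitch = 90.0 - (py / height) * 180.0
    return yaw, pitch

# The center pixel of an 8192 x 4096 capture looks straight ahead.
yaw, pitch = pixel_to_direction(4096, 2048, 8192, 4096)
```

Viewer software inverts this mapping per screen pixel to render the "look anywhere" experience Colonna describes from one flat image.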
Zaha Hadid Architects launches a collaborative VR experiment

Virtual reality is often an individual experience, with one user shaping and traversing a preprogrammed digital realm. But what if complete strangers could gather together within the virtual realm to construct an architectural edifice? Project Correl is an experiment by the Zaha Hadid Virtual Reality Group (ZHVR) in what it describes as “multi-presence virtual reality,” where users can collaborate on an ever-changing sculptural form. Project Correl is a feature of the larger Design as Second Nature exhibition at Mexico City’s Museo Universitario Arte Contemporáneo (MUAC), which features photographs, paintings, models, and other mediums highlighting the firm’s in-house technological research. As part of the exhibition, Zaha Hadid Architects’ Computation and Design group collaborated with ETH Zurich to create a 13-foot-tall curved concrete shell formed with a 3-D-knitted framework, dubbed KnitCandela. The VR experiment allows up to four visitors at a time to roam a digital space and manipulate objects found within. The overarching objective for the participants is the collaborative construction of a virtual structure. Over the course of the three-month-long exhibition, successive waves of visitors will immerse themselves in the continually adapting construct, imprinting it with their own personal elements along the way. Collaboration between participants is encouraged by the program's software; components directly attached to the primary structure within the simulation will remain throughout the course of the experiment. Stand-alone objects must be connected to growing clusters of user-assembled edifices to last throughout the exhibition. Objects that remain solitary will be periodically eliminated from the virtual space. Throughout the running of Project Correl, ZHVR will capture the evolving iterations of the experiment and 3-D-print models for display within the exhibition.
To create and operate the virtual reality engine, ZHVR worked with Unreal Engine, HP Virtual Reality Solutions, and HTC VIVE, a virtual reality platform that allows for room-scale, real-time interactions. Project Correl and Design as Second Nature will be on display until March 3, 2019.
New virtual reality program could transform historic preservation

In October 2018, the Switzerland-based 3-D-graphics software company Imverse released a public beta version of its LiveMaker modeling tool. This powerful virtual reality interface allows for the transformation of 2-D inputs into immersive 3-D environments. While the use of VR in architecture and design is by no means novel, it has primarily remained a tool for the final visualization of a project. LiveMaker not only allows the user to navigate and interact with spaces and objects within a rendered 3-D environment, but also facilitates the real-time manipulation of details such as geometry, color, and placement. Within the digitally rendered environment, specific details imported from 2-D images are easily replicated and moved about the space. The foundation of Imverse’s malleable VR interface is its proprietary voxel-based engine. According to Benoît Perrin, head of marketing and communications, “most 3-D graphics today are based on polygons that complicate what should be the seamless creation of content; LiveMaker is the first application of a voxel engine as a 3-D modeling tool.” One of the more impressive capabilities stemming from the voxel engine is dynamic shading and lighting: the shadow cast by a column at any time of day is immediately available. How is the application most useful for architects and designers? The platform presents a number of positive implications for firms involved in historic restorations or reconfigurations of protected sites. For example, with 360-degree imaging of Austria’s Hellbrunn Palace, a user can interact with walls, columns, and other elements. If the user comes across a specific detail or object of interest, it can be copied and exported as a 3-D model to different rendering platforms. Going forward, Imverse will upgrade and expand LiveMaker’s features based on feedback from users of the public beta release.
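The column-shadow example hints at why a voxel engine makes such queries cheap: with space discretized into a grid, shadow testing is just a march through cells toward the sun. A toy 2-D Python sketch of the idea (not Imverse's proprietary engine; the grid size and sun direction are arbitrary):

```python
def in_shadow(cell, solid, sun_dir, extent=32):
    """March from a ground cell toward the sun through a voxel grid.

    `solid` is the set of filled voxels, and `sun_dir` is a step like
    (-1, 1), meaning one voxel toward the sun per iteration. The cell
    is shadowed if the march hits a filled voxel before leaving the
    extent x extent grid.
    """
    x, y = cell
    dx, dy = sun_dir
    for _ in range(extent):
        x, y = x + dx, y + dy
        if (x, y) in solid:
            return True   # something stands between this cell and the sun
        if not (0 <= x < extent and 0 <= y < extent):
            return False  # reached open sky
    return False

# A one-voxel-wide column of height 4 at x = 5, with the sun up and to
# the left, casts a shadow on the ground cells just to its right.
column = {(5, y) for y in range(4)}
shadowed = [x for x in range(12) if in_shadow((x, 0), column, (-1, 1))]
```

Changing the time of day is just changing `sun_dir` and re-running the march, which is why the shadow "at any time of day is immediately available."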
Artist Mel Chin addresses climate change with virtual reality underwater world in Times Square

This summer, visitors to Times Square can take in an underwater virtual reality experience, courtesy of the North Carolina-based conceptual artist Mel Chin and technology giant Microsoft. The site-specific mixed-reality public art installation, titled Unmoored, will come to life from July 11 through September 5. Chin tackles the topic of climate change, imagining a future in which melting ice caps leave the city underwater. Looking through VR goggles, visitors see a “nautical traffic jam” of 3-D-modeled ships, each with its own unique identification number and name. Ships move slowly through the city, bumping into each other and into buildings, creating waves rendered with realistic animation and sound effects. Visitors can also view Wake, another public artwork by Chin, which “evokes the hull of a shipwreck crossed with the skeletal remains of marine mammals,” beside a sculpture of the 19th-century opera singer Jenny Lind. Chin chose New York City as the site for the work because it represents the center of trade, entertainment, and capitalism for the country. Its history is loaded with topics that resonate today, like guns and slavery. The USS Nightingale, a ship historically involved in the slave trade, is digitally recreated in the Unmoored experience. The installations are part of an exhibition titled Mel Chin: All Over the Place, presented with the Queens Museum and the NYC-based nonprofit arts organization No Longer Empty at various sites around New York. Times Square visitors can view the show through mobile devices via Unmoored’s app, which is now available for download. Check this link for more details.