How AR Will Help Make the Invisible Visible

Augmented Reality, or AR for short, will change how we experience, measure, interpret, and understand our world, and how we interact with it.
AR will certainly challenge us – sometimes less, sometimes more significantly – but it will also bring great potential value and an essential set of tools for our societies to master progress in many areas: an increasingly digitally driven industry; infrastructure, mobility, and logistics; urban planning, design, and governance; environmental and regenerative sustainability; closed loops for a circular economy; knowledge transfer, education, and learning; communication in general; tourism, socialising, and societal participation; and probably a lot more. Not to forget the healthcare sector, which is being put to a serious global test right now.

LIDAR Sensors Used for Tablet Devices as a Test Case for a Next Generation of Augmented Reality Solutions

On 20 March 2020, WIRED Magazine speculated that the latest iPad Pro’s LIDAR sensor is mainly a test case for a wholly new category of products Apple will offer within the next few years – Augmented Reality (AR) glasses.
WIRED author Brian Barrett, like the authors of other publications, grounded his view in recent releases of iOS and Xcode and in one of Apple’s more recent patents. And while not every technological development, patented or not, necessarily leads to an actual consumer product soon or at all, AR devices – as well as Mixed Reality (MR) or Extended Reality (XR) devices in general – make a great deal of sense and will most probably see their breakthrough sooner rather than later. They will offer unprecedented ways of perceiving and exchanging with one’s surroundings, and they will massively benefit an increasingly digitally interconnected, interactive world – at least for those willing to make use of them and to understand how.

Sceptics might point out that Google Glass never really came to life, that other approaches like Microsoft’s Mixed Reality device HoloLens are moving forward only relatively slowly, and that the once hyped company Magic Leap has not been able to meet, let alone exceed, far too high expectations.
Nevertheless, AR is already in use; in manufacturing, for example, companies are experimenting with first applications, eagerly awaiting the next technological upgrades and the new possibilities they will enable.

Developments in Virtual Reality as an Indicator?

On the other hand, after decades as the material of a whole science-fiction subgenre, namely Cyberpunk, Virtual Reality (VR) has witnessed one revolutionary announcement after another, only to be followed by fascinating and yet still rather modest achievements.
However, those developments are definitely not to be underestimated; most importantly, while indeed slow, they do in fact move forward unstoppably and will have built sufficient momentum eventually. Digital artist Goro Fujita’s work, made with Oculus Rift and a software called Quill, gives a very good idea of the kind of visual and spatial experiences to expect in cyberspace. Have a look at some of his animated 3D drawings, made by drawing in virtual air, which give an idea of what VR (and with it AR and XR) might have to offer us in the not-so-far future.

Some Examples of Possible Augmented Reality Use Cases

The following scenarios are only some of the more obvious ones that AR will allow users to create, and some, if not all, of them are already in the making somewhere. One might read about them in tech articles, hear people (the author of this article included) daydream about them, or experience prototypes at fairs, conferences, tech summits, and so on.

An Urban Infrastructure Information Layer

Picture yourself walking the streets of a megacity like Shanghai, possibly in Pudong district, not sure where and when to catch the next metro train but under immense time pressure because of an important appointment in thirty minutes. You activate your AR glasses and suddenly see all the information you need right in front of your eyes, as an overlay that extends the real, physical world by what would otherwise be invisible to human sight.

You see in which direction the next metro station lies and where its entrance is. But you also see the trains actually approaching and leaving the station, as if you were looking right through the ground under your feet. In addition, you see the next scheduled arrivals and departures counting down, how long it will take you to reach the right platform, and which car will be closest to the exit at the arrival station so you can get to your appointment fastest. Possibly you will even be able to book the proper ticket directly: once you have selected the correct connection – thanks to eye tracking – you simply tap your spectacle frame twice.

Of course, such information layers would also work for all other parts of the urban infrastructure on your way around the city: the nearest local government offices and their service hours, stores and the products they have in stock, restaurants and their culinary offerings, hotels and vacant rooms, taxi or bus stations and their fares or ticket options, and much more.

An Augmented Operations Guide and Manual in Manufacturing

You are new to your job in an automotive company’s factory, and new to its process setups. Maybe you have not yet worked with the specific robot models in use at the production line. Of course, you will still get the necessary theoretical and practical training. But your wearable AR device gives you much more than that: you are guided by visual overlays, manuals that blend in next to the machine you want to operate. Step by step they show you what to do and what to be careful about, also marking the areas around machines that you have to avoid while they are in service.
The Augmented Operations Guide also demonstrates how to maintain the robots, change their tools, and program them via the smart pad right in front of you. And once your shift ends, you dictate your report straight into the documentation system.

A Museum Guide via Augmented Reality

When you go to a museum (or an art gallery), the AR device tells you all about what you see in the exhibition and provides you with everything there is to know about a Chinese statue, a Roman tombstone, a French painting, or any other cultural artefact in front of you. You will hear exciting background stories, vividly told, about scientific methods and the people who, over the course of history, brought light into the dark around the exhibits.
LIDAR technology – in combination with Artificial Intelligence and pattern recognition, as well as state-of-the-art 3D animation techniques – might make for some pretty spectacular visual effects: based on it, a distinction can be made between background, foreground, and what lies in between, in order to get dimensions and perspective right and insert all kinds of visual input correctly. Possibly, scattered earthenware pieces on the ground will reconnect, and suddenly a very alive-looking Terracotta Army soldier marches around the corner right towards you, stops closely in front of you, salutes, and reports on his and his comrades’ creation and purpose.
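The depth layering described above can be sketched in a few lines. This is purely an illustrative toy, not a real AR API: the function name, the point format, and the near/far thresholds are all assumptions made for the example.

```python
# Hypothetical sketch: splitting LIDAR depth samples into foreground,
# mid-ground, and background layers, so virtual content (such as an
# animated figure) can be drawn at the right depth and occluded correctly.

def layer_by_depth(points, near=2.0, far=10.0):
    """Group (x, y, depth_in_metres) samples into three depth layers."""
    layers = {"foreground": [], "midground": [], "background": []}
    for x, y, depth in points:
        if depth < near:
            layers["foreground"].append((x, y, depth))
        elif depth < far:
            layers["midground"].append((x, y, depth))
        else:
            layers["background"].append((x, y, depth))
    return layers

# A virtual object placed in the mid-ground would be drawn over the
# background layer but occluded by anything in the foreground layer.
sample = [(0.1, 0.2, 1.5), (0.4, 0.1, 6.0), (0.8, 0.9, 25.0)]
layers = layer_by_depth(sample)
```

Real systems work on dense depth maps and meshes rather than a handful of points, but the principle – sorting the scene by distance before compositing virtual content – is the same.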

Visits to museums, art galleries, and similar places will be more amazing and interesting than ever before.

An AR Guide for Tourists, Discoverers, and Adventurers

Last time you travelled, you might have passed some interesting-looking buildings and asked yourself when they were built, by whom, what influenced the architects in style and construction methods and whom they influenced in return, what exactly the challenges were in making the concepts become reality, and so forth. Unfortunately, you had no tour guide by your side, and those buildings were out of sight too quickly anyway, as the bus you were sitting in did not stop. Or you were on a hiking tour in the mountains and wanted to know the names of all those beautiful peaks nearby, their heights, which geological events formed them, which species populate them, and so on.
Augmented Reality already helps with that to some degree today, but it will do a much better job in the future – even more so when the device delivering the information sits right on your nose and automatically unveils endless knowledge to answer every question you might have.

A Guide to Nature

Thanks to Augmented Reality and Artificial Intelligence, you will never again ask yourself what that beautiful and slightly rude bird in the park is called – this little Noisy Miner, or Manorina melanocephala for ornithologists.
But this particular specimen refuses to reveal its typical call while you are nearby, no matter how much you would like to hear it? Your AR device, coupled with your headphones, can play it for you. Lastly, you might want to quickly take a picture, so just focus on the bird and blink twice. Again, thanks to eye tracking, the job is done right away – a matter of a second, before the bird flies off into the next bushes.

More Than Sensors and Smart Devices Will Be Needed to Make AR That Good

There will be many more use cases than those described above. But no matter how good an AR device might be and how sophisticated its built-in sensors, it will need a complex system underneath – a kind of framework – wiring together various complementary pieces of technology on the one hand and unifying many different information sources, databases, applications, and digital services on the other. This will be key to generating a seamless, compelling, and clear digital user experience and customer journey. Depending on the scenario, it might mean including and welding together data points from ERP (Enterprise Resource Planning), CRM (Customer Relationship Management), and BIM (Building Information Modeling) systems, educational and learning solutions, content, document, and media asset management systems, graphics and rendering engines, of course some Artificial Intelligence, and much more.
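The kind of framework meant here can be pictured as a thin layer that exposes one query interface over many heterogeneous back-ends. The sketch below is a minimal illustration under invented assumptions – the class name, source names, and record fields are all made up for the example, not taken from any real product.

```python
# Illustrative sketch: a minimal "information layer" that unifies
# heterogeneous back-end sources (ERP, CRM, BIM, ...) behind a single
# query interface – the kind of plumbing an AR overlay would sit on top of.

class InfoLayer:
    def __init__(self):
        self.sources = {}

    def register(self, name, fetch):
        """Register a data source under a name with a fetch(query) callable."""
        self.sources[name] = fetch

    def query(self, query):
        """Collect results for one query from every registered source."""
        return {name: fetch(query) for name, fetch in self.sources.items()}

layer = InfoLayer()
# In practice these callables would wrap real ERP/BIM connectors.
layer.register("erp", lambda q: {"stock": 42})                 # parts in stock
layer.register("bim", lambda q: {"floor": 3, "room": "3.14"})  # building data
overlay_data = layer.query("spare part #123")
```

The point of the design is that the AR device never talks to ERP or BIM systems directly; it only consumes the merged result, so new sources can be registered without touching the device-side code.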

And of course, still-closed silos need to be overcome, so that people of different but complementary backgrounds, disciplines, functions, departments, and institutions can collaborate. This, too, will be impossible without enabling processes and tools – like employee, customer, or partner portals, which at some point might incorporate the very benefits of Augmented or Extended Reality described above.

Please feel free to contact me to learn more about digital experiences and platform-based collaboration, and to discuss any project you might be working on at the moment.

How to Prevent Autonomous Vehicles Deciding About Life and Death?

Or: Smart Urban Infrastructure for Smart Mobility & Logistics

Vehicles to Decide on Whom to Hit and Whom to Spare

In expert interviews, conference talks, and articles like this one by MIT’s Technology Review magazine, the necessity for autonomous cars to decide about life or death in a matter of seconds has been described repeatedly.

Picture the following situation: an autonomous vehicle drives along a street and is suddenly confronted with people crossing right in front of it – an old grandmother on one side and a group of children on the other. It could also be two seniors and a small family of three, or a nurse in work clothes on the way to the hospital and a person who might be a banker, in business attire, carrying a briefcase and talking into a smartphone.
Because the people are crossing the street right after a curve, with buildings as well as other vehicles blocking the view, the car’s or truck’s sensors were not able to trigger the braking process beforehand. It is too fast to come to a stop in time; maybe rainy weather has also made the street more slippery that day. In the end, the vehicle has only two options: hitting the grandmother, the seniors, or the nurse; or hitting the children, the family of three, or the banker. How shall the Artificial Intelligence inside the vehicle decide whose life is probably more valuable and therefore to be spared?

This classic scenario is used as an explanation of, and a warning about, Autonomous Driving being first of all in need of ethical guidelines and clear procedures, allowing the vehicle to come to a “right” decision – whatever that might be, and should it even exist in the first place, which the author of this article highly doubts.

Of course, this story speaks to us and appears to make a lot of sense. It reflects what one can experience in everyday traffic on a regular basis. Indeed, people steering vehicles need to react to people or animals suddenly crossing the street – even if, most of the time, they do not have to choose between hitting either kids or seniors; or it might be about avoiding objects falling from trucks driving ahead. Our human senses limit our ability to really think through whom or what to hit under such circumstances, though. But today’s technology would be fast enough and able to take over, wouldn’t it?
And yet, is this really the right story to tell, and the right – or at least the most pressing – question to ask? Why is Autonomous Driving so often described in terms of vehicles stuffed with sensors, fully independently observing the traffic situation around them and finding proper responses single-handedly?

No Need for Vehicles to Decide or to Harm at All

The fact is that other technologies and approaches exist and are constantly being improved, which makes a scenario as described above rather unlikely. With sensors of all kinds becoming cheaper by the day and ever more reasonably priced, utilising them on a very large scale would not only be economically feasible but also advantageous compared to other solutions.
With sensors ubiquitous, vehicles deciding autonomously on life and death will become obsolete before they have even really conquered the market. Additionally, situations in which people are hit by vehicles would most probably become a thing of the past as well.

An Alternative and Preferable Traffic Scenario

Beginning in the urban centres of today, and increasingly in those of tomorrow, infrastructure will make cities, towns, even villages, and everything between and connecting them a lot smarter and highly responsive. Streets will measure the pressure and weight of vehicles on them; they will sense temperature, humidity, and motion; they can generate needed energy and act as information displays. Cameras, radar and LIDAR systems, and advanced traffic lights at crossings or other critical spots will help predict vehicles’ and other traffic participants’ behaviour and cover blind spots.
Growing satellite constellations of various sizes in Earth’s orbits track the weather, establish communication streams, provide navigation data, and, via remote sensing, guarantee an extensive yet impressively detailed overview, with improving resolution and increasingly in real time. And last but not least, vehicles themselves add their own sensor data; they communicate with each other (“V2V”) and with the infrastructure around them (“V2X”).

With everything interconnected in an overall traffic network, each and every road user will be seen and taken into account at any given time, in every light and weather condition, and – when done right – without raising any privacy issues. A system like this would have learned about the possibility of people or animals crossing the street long before any vehicle even got close. Every single vehicle would be informed, well ahead of any actual decision having to be made, about the different response options that avoid accidents, harm, and damage of any kind.
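The advantage of an early infrastructure warning can be put in back-of-the-envelope numbers. A vehicle can only stop in time if its total stopping distance – reaction distance plus braking distance, v²/(2a) – is shorter than the distance at which it learns about the hazard. All figures below (speed, reaction time, deceleration on a wet street, sight and warning distances) are illustrative assumptions for the blind-curve scenario from earlier, not measured data.

```python
# Back-of-the-envelope sketch: why an early V2X warning beats onboard
# line-of-sight detection behind a blind curve on a slippery street.

def stopping_distance(speed_mps, reaction_s=1.0, decel_mps2=4.0):
    """Reaction distance plus braking distance v^2 / (2a), in metres."""
    return speed_mps * reaction_s + speed_mps**2 / (2 * decel_mps2)

speed = 50 / 3.6              # 50 km/h converted to metres per second
needed = stopping_distance(speed)   # roughly 38 m on a wet street

onboard_sight = 25.0          # hazard first visible after the curve (m)
v2x_warning = 150.0           # infrastructure warns far ahead (m)

can_stop_onboard = needed <= onboard_sight   # too late to brake in time
can_stop_with_v2x = needed <= v2x_warning    # ample distance to slow down
```

Under these assumptions the onboard sensors alone leave the vehicle about 13 metres short, while the networked warning removes the dilemma entirely: the vehicle simply slows down long before the curve.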

Interdisciplinary and Cross-Sectoral Collaboration Needed

What, then, would be needed to design, plan, and realise such a vastly complex traffic environment? It is the very same approach that many industries and fields have been asking for, more often and more loudly, in recent years:

“Silos” have to be opened up, inside organisations and beyond, and all the people and competencies involved need to collaborate. No matter whether they are natural partners, competitors, or entities that did not even know about each other before, the probability is very high that each of them has something to learn from the others and something to teach in return. They might have common goals, or at least ones related to some degree.
City administrations and urban planners, architects, automotive companies, various suppliers from technical industries, socio-political, socio-economic, and environmental stakeholders, and many more need to be brought together. In the era of the ‘Industrial Internet of Things’ or ‘Industry 4.0’, IT-based networks, platforms, and tools are the means to empower this kind of interdisciplinary and cross-sectoral collaboration.

Get in touch for more information