
Robot with a sense of touch can fold laundry

<p>Why can you easily buy a robot vacuum cleaner, but not one that folds laundry or irons clothes? Because fabric is a very difficult material for robots to manipulate. But scientists have made a breakthrough with a robot designed to have a sense of touch.</p> <p>Fabric is soft and deformable, and picking it up requires several senses working together. This is why the fashion industry is so <a href="https://cosmosmagazine.com/people/garment-supply-chain-slavery/" target="_blank" rel="noreferrer noopener">labour-intensive</a>: it’s too hard to automate.</p> <p>“Humans look at something, we reach for it, then we use touch to make sure that we’re in the right position to grab it,” says David Held, an assistant professor in the School of Computer Science, and head of the Robots Perceiving and Doing Lab, at Carnegie Mellon University, US.</p> <p>“A lot of the tactile sensing humans do is natural to us. We don’t think that much about it, so we don’t realise how valuable it is.”</p> <p>When we pick up a shirt, for instance, we feel the top layer, sense the layers of cloth beneath it, and grasp exactly the layers we want.</p> <p>Even with cameras and simple sensors, robots can usually only feel the top layer.</p> <p>Held and colleagues have figured out how to get a robot to do more. “Maybe what we need is tactile sensing,” says Held.</p> <p>The Carnegie Mellon researchers, along with Meta AI, have developed a robotic ‘skin’ called <a href="https://ai.facebook.com/blog/reskin-a-versatile-replaceable-low-cost-skin-for-ai-research-on-tactile-perception/" target="_blank" rel="noreferrer noopener">ReSkin</a>.</p> <p>It’s an elastic <a href="https://cosmosmagazine.com/science/explainer-what-is-a-polymer/" target="_blank" rel="noreferrer noopener">polymer</a>, filled with tiny magnetic sensors.</p> <p>“By reading the changes in the magnetic fields from depressions or movement of the skin, we can achieve tactile sensing,” says Thomas Weng, a PhD student in Held’s lab and a collaborator on the project.</p> <p>“We can use this tactile sensing to determine how many layers of cloth we’ve picked up by pinching with the sensor.”</p> <p>The ReSkin-coated robot finger could successfully pick up both one and two layers of cloth from a pile, working with a range of different textures and colours.</p> <p>“The profile of this sensor is so small, we were able to do this very fine task, inserting it between cloth layers, which we can’t do with other sensors, particularly optical-based sensors,” says Weng.</p> <p>“We were able to put it to use to do tasks that were not achievable before.”</p> <p>The robot is not yet capable of doing your laundry: next on the researchers’ list is teaching it to smooth crumpled fabric, choose the correct number of layers to fold, and then fold in the right direction.</p> <p>“It really is an exploration of what we can do with this new sensor,” says Weng.</p> <p>“We’re exploring how to get robots to feel with this magnetic skin for things that are soft, and exploring simple strategies to manipulate cloth that we’ll need for robots to eventually be able to do our laundry.”</p> <p>The researchers are presenting a <a href="https://sites.google.com/view/reskin-cloth" target="_blank" rel="noreferrer noopener">paper</a> on their laundry-folding robot at the 2022 International Conference on Intelligent Robots and Systems in Kyoto, Japan.</p> <div id="contributors"> <p><em><a href="https://cosmosmagazine.com/technology/laundry-folding-robot/" target="_blank" rel="noopener">This article</a> was originally published on Cosmos Magazine and was written by Ellen Phiddian.</em></p> <p><em>Image: Carnegie Mellon University</em></p> </div>
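To make the sensing principle concrete, here is a minimal Python sketch of how layer counting with a magnetic skin might work: the change in the magnetometer reading during a pinch is compared against labelled example grasps. The calibration numbers and the nearest-centroid scheme are illustrative assumptions, not the team’s actual pipeline.

```python
# Hypothetical sketch of layer-counting with a magnetic tactile skin.
# The calibration data and classification scheme are invented for
# illustration; they are not ReSkin's real processing pipeline.
import numpy as np

def layer_count(baseline, reading, centroids):
    """Classify how many cloth layers are pinched.

    baseline:  magnetometer vector with nothing grasped
    reading:   magnetometer vector during the pinch
    centroids: per-class mean field changes from labelled grasps
    """
    delta = reading - baseline  # field change caused by skin deformation
    # nearest-centroid classification over the labelled layer counts
    return min(centroids, key=lambda k: np.linalg.norm(delta - centroids[k]))

# Toy calibration: mean field change (arbitrary units) per layer count.
centroids = {
    0: np.array([0.0, 0.0, 0.0]),
    1: np.array([4.0, 1.0, 0.5]),
    2: np.array([7.5, 2.2, 1.1]),
}
print(layer_count(np.zeros(3), np.array([4.2, 0.9, 0.6]), centroids))  # -> 1
```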

Technology


Realistic androids coming closer, as scientists teach a robot to share your laughter

<p>Do you ever laugh at an inappropriate moment?</p> <p>A team of Japanese researchers has taught a robot when to laugh in social situations, which is a major step towards creating an android that will be “like a friend.”</p> <p>“We think that one of the important functions of conversational AI is empathy,” says Dr Koji Inoue, an assistant professor at Kyoto University’s Graduate School of Informatics, and lead author on a paper describing the research, <a href="https://doi.org/10.3389/frobt.2022.933261" target="_blank" rel="noreferrer noopener">published</a> in <em>Frontiers in Robotics and AI</em>.</p> <p>“Conversation is, of course, multimodal, not just responding correctly. So we decided that one way a robot can empathize with users is to share their laughter, which you cannot do with a text-based chatbot.”</p> <p>The researchers trained an AI with data from 80 speed-dating dialogues, from a matchmaking marathon with Kyoto University students. (Imagine meeting a future partner at an exercise designed to teach a robot to laugh…)</p> <p>“Our biggest challenge in this work was identifying the actual cases of shared laughter, which isn’t easy, because as you know, most laughter is actually not shared at all,” says Inoue.</p> <p>“We had to carefully categorise exactly which laughs we could use for our analysis and not just assume that any laugh can be responded to.”</p> <p>They then added this system to a hyper-realistic android named <a href="https://robots.ieee.org/robots/erica/" target="_blank" rel="noreferrer noopener">Erica</a>, and tested the robot on 132 volunteers.</p> <p>Participants listened to one of three different types of dialogue with Erica: one where she used the shared-laughter system, one where she didn’t laugh at all, and one where she laughed every time she heard someone else do it.</p> <p>They then scored the interactions for empathy, naturalness, similarity to humans, and understanding.</p> <p>The researchers found that the shared-laughter system scored higher than either baseline.</p> <p>While they’re pleased with this result, the researchers say their system is still quite rudimentary: they need to categorise and examine many other types of laughter before Erica’s chuckling will sound natural.</p> <p>“There are many other laughing functions and types which need to be considered, and this is not an easy task. We haven’t even attempted to model unshared laughs even though they are the most common,” says Inoue.</p> <p>Plus, it doesn’t matter how realistic a robot’s laugh is if the rest of its conversation is unnatural.</p> <p>“Robots should actually have a distinct character, and we think that they can show this through their conversational behaviours, such as laughing, eye gaze, gestures and speaking style,” says Inoue.</p> <p>“We do not think this is an easy problem at all, and it may well take more than 10 to 20 years before we can finally have a casual chat with a robot like we would with a friend.”</p> <div id="contributors"> <p><em><a href="https://cosmosmagazine.com/technology/robot-laugh/" target="_blank" rel="noopener">This article</a> was originally published on <a href="https://cosmosmagazine.com" target="_blank" rel="noopener">Cosmos Magazine</a> and was written by <a href="https://cosmosmagazine.com/contributor/ellen-phiddian" target="_blank" rel="noopener">Ellen Phiddian</a>. Ellen Phiddian is a science journalist at Cosmos. She has a BSc (Honours) in chemistry and science communication, and an MSc in science communication, both from the Australian National University.</em></p> <p><em>Image: Getty Images</em></p> </div>
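As a rough illustration of the decision the system makes, here is a hedged Python sketch of a two-stage shared-laughter policy: first decide whether a detected user laugh should be shared at all, then pick a laugh type. The threshold, laugh types and weights are invented for the sketch; the paper’s actual components are trained models.

```python
# Hedged sketch of a two-stage shared-laughter policy. The threshold and
# the laugh-type weights are illustrative assumptions, not the trained
# models described in the paper.
import random

def shared_laugh_response(p_shared: float, threshold: float = 0.5):
    """Given a model's probability that a detected user laugh should be
    shared, return a laugh type, or None to stay silent."""
    if p_shared < threshold:
        return None  # most laughs are not shared, so silence is the default
    # Second stage: choose between a polite "social" laugh and a louder
    # "mirthful" laugh (the 70/30 split is made up for the sketch).
    return random.choices(["social", "mirthful"], weights=[0.7, 0.3])[0]

print(shared_laugh_response(0.8))  # e.g. 'social'
print(shared_laugh_response(0.2))  # None: don't laugh along
```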

Technology


Supermarket delivery by robot better for the climate

<p>Along with their <a href="https://twitter.com/historymatt/status/1525776275939418113" target="_blank" rel="noreferrer noopener">cult following on social media</a>, autonomous delivery robots travelling on footpaths could be the most climate-friendly way to do your grocery shopping.</p> <p>Around the world, <a href="https://cosmosmagazine.com/people/will-covid-19-change-our-cities/" target="_blank" rel="noreferrer noopener">COVID-19 has seen a change</a> in the way people shop for groceries. Instead of driving to the supermarket, more people are ordering online for pick-up or home delivery – and, in some places, even delivery <a href="https://cosmosmagazine.com/technology/robotics/drone-delivery-groceries-canberra/" target="_blank" rel="noreferrer noopener">by drone</a> or robot.</p> <p>In the United States, supermarket home delivery services grew 54% between 2019 and 2020. In Australia, Woolworths and Coles experienced <a href="https://theconversation.com/coles-and-woolworths-are-moving-to-robot-warehouses-and-on-demand-labour-as-home-deliveries-soar-166556" target="_blank" rel="noreferrer noopener">unprecedented demand</a>.</p> <p>The rapid growth in e-commerce has brought an increased focus on the greenhouse gas emissions associated with <a href="https://cosmosmagazine.com/earth/sustainability/to-help-the-environment-should-you-shop-in-store-or-online/" target="_blank" rel="noreferrer noopener">‘last-mile’ delivery</a>.</p> <p>A study by University of Michigan researchers and the Ford Motor Co modelled the emissions associated with the journey of a 36-item grocery basket from shop to home via a number of alternative transport options. The study is <a href="https://pubs.acs.org/doi/pdf/10.1021/acs.est.2c02050" target="_blank" rel="noreferrer noopener">published</a> in the journal <em>Environmental Science &amp; Technology</em>.</p> <p>“This research lays the groundwork for understanding the impact of e-commerce on greenhouse gas emissions produced by the grocery supply chain,” says the study’s senior author <a href="https://seas.umich.edu/research/faculty/greg-keoleian" target="_blank" rel="noopener">Greg Keoleian</a>, director of the Center for Sustainable Systems at the University of Michigan School for Environment and Sustainability.</p> <p>The researchers modelled 72 different ways the groceries could travel from the warehouse to the customer. Across all options, the results showed ‘last-mile’ transport emissions to be the major source of <a href="https://cosmosmagazine.com/earth/food-transport-emissions-cost/">supply chain emissions</a>.</p> <p>They found the conventional option of driving to the supermarket in a petrol or diesel car to be the most polluting, creating six kilograms of carbon dioxide (CO<sub>2</sub>) per basket. All other choices had lower emissions, with footpath delivery robots the cleanest for the climate, at one kilogram of CO<sub>2</sub>.</p> <p>A customer who switched to an electric vehicle could halve their emissions. But they could achieve a similar impact by reducing their shopping frequency: without buying a new car, households that halved the frequency of supermarket trips cut emissions by 44%.</p> <p>Keoleian says the study emphasises the “important role consumers can serve in reducing emissions through the use of trip chaining and by making carefully planned grocery orders.” Trip chaining refers to combining grocery shopping with other errands.</p> <p>All home delivery options had lower emissions than in-store shopping – in part due to the efficiencies gained in store operation and transport – with the potential to cut emissions by 22–65%.</p> <p>Footpath robots are being trialled in cities across the United States, Europe and China. These four- or six-wheeled robots carry items like supermarket shopping or retail items over short distances, with most having a delivery range of around three kilometres.</p> <p>Starship robots are one example. Since launching in 2014, the company’s robots have completed three million autonomous home deliveries in cities across Estonia, the United Kingdom, Finland and the United States.</p> <div id="contributors"> <p><em><a href="https://cosmosmagazine.com/earth/climate/robot-delivery-better-for-the-climate/" target="_blank" rel="noopener">This article</a> was originally published on <a href="https://cosmosmagazine.com" target="_blank" rel="noopener">Cosmos Magazine</a> and was written by <a href="https://cosmosmagazine.com/contributor/petra-stock" target="_blank" rel="noopener">Petra Stock</a>. Petra Stock has a degree in environmental engineering and a Masters in Journalism from University of Melbourne. She has previously worked as a climate and energy analyst.</em></p> <p><em>Image: Getty Images</em></p> </div>
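For a sense of scale, here is a small back-of-envelope Python script using the per-basket figures quoted above (six kilograms of CO2 for a petrol car, roughly half that for an electric car, one kilogram for a footpath robot). The weekly-shop assumption is ours, purely for illustration.

```python
# Back-of-envelope comparison using the per-basket figures quoted in the
# article. The 52-baskets-per-year assumption is invented for illustration.
LAST_MILE_KG_CO2 = {
    "petrol car": 6.0,       # conventional trip to the supermarket
    "electric car": 3.0,     # roughly half the petrol-car figure
    "footpath robot": 1.0,   # cleanest option in the study
}

def annual_kg_co2(mode: str, baskets_per_year: int = 52) -> float:
    return LAST_MILE_KG_CO2[mode] * baskets_per_year

for mode in LAST_MILE_KG_CO2:
    print(f"{mode}: {annual_kg_co2(mode):.0f} kg CO2/year")

# Per the study, halving trip frequency cuts emissions by ~44% –
# comparable to switching to an EV, with no new car required.
```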

Technology


Patch me up, Scotty! Remote surgery robot destined for ISS

<p>Strap yourself in so you don’t float away, select the required procedure, and lie back and relax as your autonomous surgery robot patches you up from whatever space ailment bothers you. Sound far-fetched?</p> <p>Not according to Professor Shane Farritor, from the University of Nebraska-Lincoln, who <a href="https://news.unl.edu/newsrooms/today/article/husker-developed-surgery-robot-to-be-tested-aboard-international-space/" target="_blank" rel="noreferrer noopener">has just received funding from NASA</a> to prepare his miniature surgical robot for a voyage to the International Space Station (ISS) in 2024.</p> <p>MIRA, which stands for “miniaturised in vivo robotic assistant”, is remarkably small for a surgery-performing machine – small enough to fit inside a microwave-sized experimental locker within the ISS. The brainchild of Farritor and colleagues at the start-up company Virtual Incision, MIRA has been under development for almost 20 years.</p> <p>The ultimate aim for MIRA is to be able to perform surgery autonomously and remotely, which has far-reaching ramifications for urgent surgery in the field – whether that’s in the depths of space, a remote location or even <a href="http://bionics.seas.ucla.edu/publications/JP_11.pdf" target="_blank" rel="noreferrer noopener">in a war-torn region</a>.</p> <p>Initially, MIRA won’t go near anyone’s body. Once on the ISS, it will autonomously perform tasks designed to mimic the movements required for surgery, such as cutting stretched rubber bands and pushing metal rings along a wire.</p> <p>Autonomy is important because the robot won’t need to use precious bandwidth communicating back to Earth.</p> <p>MIRA has already successfully completed surgery-like tasks via remote operation, including a colon resection.</p> <p>Space is the next frontier.</p> <p>Farritor says that as people go further and deeper into space, they might need surgery. “We’re working toward that goal.”</p> <p>The stint on the ISS will not only mark MIRA’s most autonomous operation so far, but will also provide insight into how such devices might function in zero gravity.</p> <p>The dream goal is for MIRA to function entirely on its own, says Farritor. Just imagine: “the astronaut flips a switch, the process starts, and the robot does its work by itself. Two hours later, the astronaut switches it off and it’s done”.</p> <p>As anyone who has seen <a href="https://www.youtube.com/watch?v=Ue4PCI0NamI" target="_blank" rel="noreferrer noopener">the scene in the movie The Martian</a> can attest, it would certainly make pulling a wayward antenna spike out of yourself from within a deserted Martian habitat station far more comfortable.</p> <div id="contributors"> <p><em><a href="https://cosmosmagazine.com/health/remote-surgery-robot-destined-for-iss/" target="_blank" rel="noopener">This article</a> was originally published on <a href="https://cosmosmagazine.com" target="_blank" rel="noopener">Cosmos Magazine</a> and was written by <a href="https://cosmosmagazine.com/contributor/clare-kenyon" target="_blank" rel="noopener">Clare Kenyon</a>. Clare Kenyon is a science writer for Cosmos. She is currently wrangling the death throes of her PhD in astrophysics, has a Masters in astronomy and another in education, and has classroom experience teaching high school science, maths and physics. Clare also has diplomas in music and criminology and a graduate certificate of leadership and learning.</em></p> <p><em>Image: Getty Images</em></p> </div>

Technology


A robot dog with a virtual spinal cord can learn to walk in just one hour

<p>We’ve all seen those adorable clips of newborn giraffes or foals first learning to walk on their shaky legs, stumbling around until they finally master the movements.</p> <p>Researchers wanted to know how animals learn to walk, and learn from their stumbles, so they built a four-legged, dog-sized robot to simulate it, according to a new study <a href="https://www.nature.com/articles/s42256-022-00505-4" target="_blank" rel="noreferrer noopener">reported</a> in <em>Nature Machine Intelligence</em>.</p> <p>They found that it took their robot and its virtual spinal cord just an hour to get its walking under control.</p> <p>Getting up and going quickly is essential in the animal kingdom to avoid predators, but learning how to co-ordinate leg muscles and tendons takes time.</p> <p>Initially, baby animals rely heavily on hard-wired spinal cord reflexes to co-ordinate muscle and tendon control, while motor control reflexes help them to avoid falling and hurting themselves during their first attempts.</p> <p>More precise muscle control must be practised until the nervous system adapts to the muscles and tendons, and the young are then able to keep up with the adults.</p> <p>“As engineers and roboticists, we sought the answer by building a robot that features reflexes just like an animal and learns from mistakes,” says first author Dr Felix Ruppert, a former doctoral student in the Dynamic Locomotion research group at the Max Planck Institute for Intelligent Systems (MPI-IS), Germany.</p> <p>“If an animal stumbles, is that a mistake? Not if it happens once. But if it stumbles frequently, it gives us a measure of how well the robot walks.”</p> <figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"> <div class="wp-block-embed__wrapper"> <div class="entry-content-asset"> <div class="embed-wrapper"> <div class="inner"><iframe title="Learning Plastic Matching of Robot Dynamics in Closed-loop Central Pattern Generators" src="https://www.youtube.com/embed/LPL6nvs_GEc?feature=oembed" width="500" height="281" frameborder="0" allowfullscreen="allowfullscreen"></iframe></div> </div> </div> </div> </figure> <p><strong>Building a virtual spinal cord to learn how to walk</strong></p> <p>The researchers designed a <a href="https://cosmosmagazine.com/health/machine-learning-tool-brain-injury/" target="_blank" rel="noreferrer noopener">learning algorithm</a> to function as the robot’s spinal cord and work as what’s known as a central pattern generator (CPG). In humans and animals, CPGs are networks of neurons in the spinal cord that, without any input from the brain, produce periodic muscle contractions.</p> <p>These are important for rhythmic tasks like breathing, blinking, digestion and walking.</p> <p>The CPG was simulated on a small, lightweight computer that controlled the motion of the robot’s legs, positioned on the robot where the head would be on a dog.</p> <p>The robot – which the researchers named Morti – was designed with sensors on its feet to measure information about its movement.</p> <p>Morti learnt to walk with no prior explicit “knowledge” of its leg design, motors or springs, by continuously comparing the expected data (modelled from the virtual spinal cord) against the sensor data as it attempted to walk.</p> <p>“Our robot is practically ‘born’ knowing nothing about its leg anatomy or how they work,” Ruppert explains. “The CPG resembles a built-in automatic walking intelligence that nature provides and that we have transferred to the robot. The computer produces signals that control the legs’ motors and the robot initially walks and stumbles.</p> <p>“Data flows back from the sensors to the virtual spinal cord where sensor and CPG data are compared. If the sensor data does not match the expected data, the learning algorithm changes the walking behaviour until the robot walks well and without stumbling.”</p> <p>Sensor data from the robot’s feet are continuously compared with the expected touch-down data predicted by the robot’s CPG. If the robot stumbles, the learning algorithm changes how far the legs swing back and forth, how fast the legs swing, and how long a leg stays on the ground.</p> <p>“Changing the CPG output while keeping reflexes active and monitoring the robot stumbling is a core part of the learning process,” Ruppert says.</p> <p>Within one hour, Morti can go from stumbling around like a newborn animal to walking, optimising its movement patterns faster than an animal and increasing its energy efficiency by 40%.</p> <p>“We can’t easily research the spinal cord of a living animal. But we can model one in the robot,” says co-author Dr Alexander Badri-Spröwitz, head of the Dynamic Locomotion research group.</p> <p>“We know that these CPGs exist in many animals. We know that reflexes are embedded; but how can we combine both so that animals learn movements with reflexes and CPGs?</p> <p>“This is fundamental research at the intersection between robotics and biology. The robotic model gives us answers to questions that biology alone can’t answer.”</p> <div id="contributors"> <p><em><a href="https://cosmosmagazine.com/technology/robot-machine-learning-to-walk/" target="_blank" rel="noopener">This article</a> was originally published on <a href="https://cosmosmagazine.com" target="_blank" rel="noopener">Cosmos Magazine</a> and was written by <a href="https://cosmosmagazine.com/contributor/imma-perfetto" target="_blank" rel="noopener">Imma Perfetto</a>. Imma Perfetto is a science writer at Cosmos. She has a Bachelor of Science with Honours in Science Communication from the University of Adelaide.</em></p> <p><em>Video: Dynamic Locomotion Group (YouTube)</em></p> </div>
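As a rough illustration of the idea, here is a minimal Python sketch of a central pattern generator driving one leg, with a crude update rule that slows and shortens the swing when the foot keeps missing its predicted touch-downs. The oscillator form, parameters and learning rule are illustrative assumptions, not the paper’s actual algorithm.

```python
# Minimal, illustrative CPG sketch: a sine oscillator produces a rhythmic
# joint target, and a crude rule adapts the gait when measured foot
# touch-downs disagree with the CPG's predictions. All parameters and the
# update rule are invented; they are not the paper's algorithm.
import math

class SimpleCPG:
    def __init__(self, freq_hz: float = 1.5, amplitude: float = 0.4):
        self.freq = freq_hz    # swing rate (Hz)
        self.amp = amplitude   # swing extent (radians)
        self.phase = 0.0

    def step(self, dt: float) -> float:
        """Advance the oscillator; return a leg joint target in radians."""
        self.phase = (self.phase + 2 * math.pi * self.freq * dt) % (2 * math.pi)
        return self.amp * math.sin(self.phase)

    def adapt(self, stumble_rate: float, lr: float = 0.05) -> None:
        """Swing slower and less far when touch-downs keep missing."""
        self.freq *= 1.0 - lr * stumble_rate
        self.amp *= 1.0 - lr * stumble_rate

cpg = SimpleCPG()
for _ in range(100):                 # one second of control at 100 Hz
    joint_target = cpg.step(0.01)    # send to the leg motor here
cpg.adapt(stumble_rate=0.2)          # foot missed 20% of expected touch-downs
```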

Technology


New “sweaty” living skin for robots might make your skin crawl

<p dir="ltr">A team of Japanese scientists have crafted the first living skin for robots that not only resembles our skin in texture, but it also repels water and has self-healing functions just like ours.</p> <p dir="ltr">To craft the skin, the team submerged a robotic finger into a cylinder filled with collagen and human dermal fibroblasts - the two main components that make up our skin’s connective tissues. The way that this mixture shrank and conformed to the finger that gave it such a realistic appearance - making for a large leap forward in terms of creating human-like appearances for robots.</p> <p><span id="docs-internal-guid-699f2960-7fff-1b2e-d849-c1bc95a796a9">“The finger looks slightly ‘sweaty’ straight out of the culture medium,” <a href="https://www.scimex.org/newsfeed/this-robots-sweaty-living-skin-that-can-heal-might-make-your-skin-crawl" target="_blank" rel="noopener">says</a> Shoji Takeuchi, a professor at the University of Tokyo and the study’s first author. “Since the finger is driven by an electric motor, it is also interesting to hear the clicking sounds of the motor in harmony with a finger that looks just like a real one.”</span></p> <p><img src="https://oversixtydev.blob.core.windows.net/media/2022/06/robot-finger1.jpg" alt="" width="1280" height="720" /></p> <p dir="ltr"><em>The team submerged the robotic finger into a mixture of collagen and human dermal fibroblasts to create the new skin. Image: Shoji Takeuchi</em></p> <p dir="ltr">Realism is a top priority for humanoid robots tasked with interacting with people in healthcare and the service industry, since looking human can improve communication efficiency and even make us like the robot more.</p> <p dir="ltr">Current methods of creating skin for robots use silicone, which effectively mimic human appearance but fall short in creating delicate textures, such as wrinkles, and in having skin-specific functions.</p> <p dir="ltr">Meanwhile, trying to tailor sheets of living skin - commonly used in skin grafting - is difficult when it comes to conforming to fingers, which have uneven surfaces and need to be able to move.</p> <p dir="ltr">“With that method, you have to have the hands of a skilled artisan who can cut and tailor the skin sheets,” Takeuchi says. “To efficiently cover surfaces with skin cells, we established a tissue moulding method to directly mould skin tissue around the robot, which resulted in a seamless skin coverage on a robotic finger.”</p> <p dir="ltr">Other experts have also noted that this level of realism could have the opposite effect, in a phenomenon known as the “uncanny valley” effect.</p> <p dir="ltr">“It is possible that the human-like appearance [of some robots] induces certain expectations but when they do not meet those expectations, they are found eerie or creepy,” Dr Burcu Ürgen, an assistant professor in psychology at Bilkent University, Turkey, who wasn’t involved in the study, told <em><a href="https://www.theguardian.com/science/2022/jun/09/scientists-make-slightly-sweaty-robotic-finger-with-living-skin" target="_blank" rel="noopener">The Guardian</a></em>. 
</p> <p dir="ltr">Professor Fabian Grabenhorst, a neuroscientist at the University of Oxford who studies the uncanny-valley effect, also told the publication that people might have an initial negative reaction to these kinds of robots, but that it could shift depending on their interactions with the robot.</p> <p dir="ltr">“Initially people might find it weird, but through positive experiences that might help people overcome those feelings,” he told The Guardian.</p> <p dir="ltr">“It seems like a fantastic technological innovation.”</p> <p dir="ltr">As exciting as this discovery is, Takeuchi adds that it’s “just the first step” in covering robots in living skin, with their future work looking to allow the skin to survive without constant nutrient supply and waste removal, as well as including hair follicles, nails, sweat glands and sensory neurons.</p> <p dir="ltr">“I think living skin is the ultimate solution to give robots the look and touch of living creatures since it is exactly the same material that covers animal bodies,” he says.</p> <p dir="ltr">Their study was published in the journal <em><a href="https://doi.org/10.1016/j.matt.2022.05.019" target="_blank" rel="noopener">Matter</a></em>.</p> <p><span id="docs-internal-guid-062b1015-7fff-6c39-2718-c1df1e65a8cd"></span></p> <p dir="ltr"><em>Image: Shoji Takeuchi</em></p>

Technology


Pompeii’s ancient ruins guarded by a robot “dog”

<p dir="ltr">The Archaeological Park of Pompeii has found a unique way to patrol the historical archaeological areas and structures of Pompeii in Italy. </p> <p dir="ltr">Created by Boston Dynamics, a robot “dog” named Spot is being used to identify structural and safety issues at Pompeii: the ancient Roman city that was encased in volcanic ash following the 79 C.E. eruption of Mount Vesuvius.</p> <p dir="ltr">The robot is the latest addition to a broader initiative to transform Pompeii into a “Smart Archaeological Park” with “intelligent, sustainable and inclusive management.”</p> <p dir="ltr">The movement for this “integrated technological solution” began in 2013, when UNESCO threatened to remove the site from the World Heritage List unless drastic measures were taken to improve its preservation, after structural deficiencies started to emerge. </p> <p dir="ltr">The goal, as noted in the release, is to “improve both the quality of monitoring of the existing areas, and to further our knowledge of the state of progress of the works in areas undergoing recovery or restoration, and thereby to manage the safety of the site, as well as that of workers.”</p> <p dir="ltr">“We wish to test the use of these robots in the underground tunnels that were made by illegal excavators and which we are uncovering in the area around Pompeii, as part of a memorandum of understanding with the Public Prosecutor’s Office of Torre Annunziata,” said Pompeii’s director general Gabriel Zuchtriegel in a statement.</p> <p dir="ltr">In addition to having Spot the “dog” patrol the area, a laser scanner will also fly over the 163-acre site and record data, which will be used to study and plan further interventions to preserve the ancient ruins of Pompeii. </p> <p dir="ltr"><em>Image credits: Getty Images</em></p>

Art


Archaeologists turn to robots to save Pompeii

<p dir="ltr">The city of Pompeii has experienced not one, but two deathly experiences - first from a volcanic eruption, then from neglect - and technology is now being used to keep it safe going into the future.</p> <p dir="ltr">Decades of neglect, mismanagement and scant maintenance of the popular ruins resulted in the 2010 collapse of a hall where gladiators once trained, nearly costing Pompeii its UNESCO World Heritage status.</p> <p dir="ltr">Despite this, Pompeii is facing a brighter future.</p> <p dir="ltr">The ruins were saved from further degradation due to the Great Pompeii Project, which saw about 105 million euros in European Union funds directed to the site, as long as it was spent promptly and effectively by 2016.</p> <p dir="ltr">Now, the Archaeological Park of Pompeii’s new director is looking to innovative technology to help restore areas of the ruins and reduce the impacts of a new threat: climate change.</p> <p dir="ltr">Archaeologist Gabriel Zuchtriegel, who was appointed director-general of the site in mid-2021, told the Associated Press that technology is essential “in this kind of battle against time”.</p> <p><span id="docs-internal-guid-95bf233a-7fff-da0a-2b03-4e06169e156c">“Some conditions are already changing and we can already measure this,” Zuchtriegel <a href="https://www.nzherald.co.nz/travel/pompeii-rebirth-of-italys-dead-city-that-nearly-died-again/XOOKT34VC3A6ZFG5BJLDC62FJI/" target="_blank" rel="noopener">said</a>.</span></p> <p><img src="https://oversixtydev.blob.core.windows.net/media/2022/02/pompeii1.jpg" alt="" width="1280" height="720" /></p> <p dir="ltr"><em>Archaeologists and scientists are joining forces to preserve and reconstruct artefacts found in Pompeii. Image: Pompeii Archeological Park (Instagram)</em></p> <p dir="ltr">So instead of relying on human eyes to detect signs of climate-caused deterioration on mosaic floors and frescoed walls across the site’s 10,000 excavated rooms, experts will rely on artificial intelligence (AI) and drones. </p> <p dir="ltr">The technology will provide experts with data and images in real-time, and will alert them to “take a closer look and eventually intervene before things happen”, Zuchtriegel said.</p> <p dir="ltr">Not only that, but AI and robots have been used to reassemble frescoes and artefacts that have crumbled into miniscule fragments that are difficult to reconstruct using human hands.</p> <p dir="ltr">“The amphorae, the frescoes, the mosaics are often brought to light fragmented, only partially intact or with many missing parts,” Zuchtriegel <a href="http://pompeiisites.org/comunicati/al-via-il-progetto-repair-la-robotica-e-la-digitalizzazione-al-servizio-dellarcheologia/" target="_blank" rel="noopener">said</a>.</p> <p dir="ltr">“When the number of fragments is very large, with thousands of pieces, manual reconstruction and recognition of the connections between the fragments is almost always impossible or in any case very laborious and slow.</p> <p><span id="docs-internal-guid-32168df9-7fff-f97f-2b16-a0c3c34e40be"></span></p> <p dir="ltr">“This means that various finds lie for a long time in archaeological deposits, without being able to be reconstructed and restored, let alone returned to the attention of the public.”</p> <p dir="ltr"><img src="https://oversixtydev.blob.core.windows.net/media/2022/02/pompeii2.jpg" alt="" width="1280" height="720" /></p> <p dir="ltr"><em>The robot uses mechanical arms and hands to position pieces in the right place. 
Image: Pompeii Archeological Park (Instagram)</em></p> <p dir="ltr">The “RePAIR” project, an acronym for Reconstructing the past: Artificial Intelligence and Robotics meet Cultural Heritage, has seen scientists from the Italian Institute of Technology create a robot to fix this problem.</p> <p dir="ltr"><span id="docs-internal-guid-2652855f-7fff-1a96-b469-dc8e29ac5886"></span></p> <p dir="ltr">It involves robots scanning the fragments and recognising them through a 3D digitisation system before placing them in the right position using mechanical arms and hands equipped with sensors.</p> <p dir="ltr"><img src="https://oversixtydev.blob.core.windows.net/media/2022/02/pompeii3.jpg" alt="" width="1280" height="720" /></p> <p dir="ltr"><em>The project will focus on frescoes in the House of the Painters at Work, which were shattered during WWII. Image: Pompeii Archeological Park (Instagram)</em></p> <p dir="ltr">One goal is to reconstruct the frescoed ceiling of the House of the Painters at Work, with was shattered by Allied bombing during World War II.</p> <p dir="ltr">The fresco in the Schola Armaturarum - the gladiators’ barracks - will also be the target of robotic repairs, after the weight of excavated sections of the city, rainfall accumulation and poor drainage resulted in the structure collapsing.</p> <p dir="ltr"><span id="docs-internal-guid-6dbfdf37-7fff-432f-0405-800c7e8da418"></span></p> <p dir="ltr"><em>Image: Pompeii Archeological Park (Instagram)</em></p>

Technology


Artist robot Ai-Da detained in Egypt on suspicion of espionage

<p><span style="font-weight: 400;">A robot with a flair for the arts was detained at the Egyptian border for 10 days ahead of a major exhibition. </span></p> <p><span style="font-weight: 400;">Ai-Da was set to present her artworks at the foot of the pyramids of Giza: the first ever art exhibition held in the historic area. </span></p> <p><span style="font-weight: 400;">The show, titled </span><em><span style="font-weight: 400;">Forever is Now</span></em><span style="font-weight: 400;">, is an annual event organised by </span><span style="font-weight: 400;">Art D’Égypte to support the art and culture scene in Egypt. </span></p> <p><span style="font-weight: 400;">Ai-Da’s digitally created artworks, and her presence at the event, was set to be the highlight of the show. </span></p> <p><span style="font-weight: 400;">However, Egyptian officials grew concerned when she arrived as her eyes feature cameras and an internet modem. </span></p> <p><span style="font-weight: 400;">Because of Ai-Da’s technology, officials at the Egyptian border grew concerned that she had been sent to the country as part of an espionage conspiracy. </span></p> <p><span style="font-weight: 400;">According to </span><a href="https://www.theguardian.com/world/2021/oct/20/egypt-detains-artist-robot-ai-da-before-historic-pyramid-show"><span style="font-weight: 400;">The Guardian</span></a><span style="font-weight: 400;">, British officials had to work intensively to get Ai-Da out of detainment before the beginning of the art show, </span></p> <p><span style="font-weight: 400;">Egyptian officials offered to let Ai-Da free if she had some of her gadgetry removed, to which Aiden Meller, Ai-Da’s creator, refused. </span></p> <p><span style="font-weight: 400;">They offered to remove her eyes as a security measure, but Aiden insisted that she uses her eyes to create her artwork. </span></p> <p><span style="font-weight: 400;">She was eventually released, with her eyes intact, and the show went ahead as scheduled. </span></p> <p><span style="font-weight: 400;">Ai-Da is able to make unique art thanks to specially designed technology developed by researchers at Oxford and Leeds University. </span></p> <p><span style="font-weight: 400;">Ai-Da’s key algorithm converts images she captures with her camera-eyes and converts them to drawings. </span></p> <p><span style="font-weight: 400;">The robot can also paint portraits, as her creators allowed her technology to analyse colours and techniques used by successful human artists. </span></p> <p><em><span style="font-weight: 400;">Image credits: Getty Images</span></em></p>

Art


Beware the robot bearing gifts

<div> <div class="copy"> <p>In a future filled with robots, those that pretend to be your friend could be more manipulative than those that exert authority, suggests a new study published in <em>Science Robotics.</em></p> <p>As robots become more common in the likes of education, healthcare and security, it is essential to predict what the relationship between humans and robots will be.</p> <div style="position: relative; display: block; max-width: 100%;"> <div style="padding-top: 56.25%;"><iframe src="https://players.brightcove.net/5483960636001/HJH3i8Guf_default/index.html?videoId=6273649735001" allowfullscreen="" allow="encrypted-media" style="position: absolute; top: 0px; right: 0px; bottom: 0px; left: 0px; width: 100%; height: 100%;"></iframe></div> </div> <p class="caption">Overview of authority HRI study conditions, setup, and robot behaviors. Credit: Autonomous Systems and Biomechatronics Lab, University of Toronto.</p> <p>In the <a rel="noreferrer noopener" href="https://www.science.org/doi/10.1126/scirobotics.abd5186?_ga=2.192393706.1796540797.1632092915-1153018146.1604894082" target="_blank">study</a>, led by Shane Saunderson and Goldie Nejat of the University of Toronto, Canada, researchers programmed a robot called Pepper to influence humans completing attention and memory tasks, by acting either as a friend or an authority figure.</p> <p>They found that people were more comfortable with, and more persuaded by, friendly Pepper.</p> <p>Authoritative Pepper was described by participants as “inhuman,” “creepy,” and giving off an “uncanny valley vibe”.</p> <p>“As it stands, the public has little available education or general awareness of the persuasive potential of social robots, and yet institutions such as banks or restaurants can use them in financially charged situations, without any oversight and only minimal direction from the field,” writes James Young, a computer scientist  from the University of Manitoba, Canada, in a related <a rel="noreferrer noopener" href="http://10.1126/scirobotics.abk3479" target="_blank">Focus</a>.</p> <p>“Although the clumsy and error-prone social robots of today seem a far cry from this dystopian portrayal, Saunderson and Nejat demonstrate how easily a social robot can leverage rudimentary knowledge of human psychology to shape their persuasiveness.”</p> <p class="has-text-align-center"><strong><em>Read more: <a rel="noreferrer noopener" href="https://cosmosmagazine.com/technology/robotics/meet-the-robots-representing-australia-at-the-robot-olympics/" target="_blank">Meet the robots representing Australia at the ‘robot Olympics’</a></em></strong></p> <p>To test a robot’s powers of persuasion, Pepper assumed two personas: one was as a friend who gave rewards, and the other was as an authoritative figure who dealt out punishment.</p> <p>A group of participants were each given $10 and told that the amount of money could increase or decrease, depending on their performance in set memory tasks.</p> <p>Friendly Pepper gave money for correct responses, and authoritative Pepper docked $10 for incorrect responses.</p> <p>The participants then completed tasks in the <a rel="noreferrer noopener" href="https://www.pearsonclinical.co.uk/Psychology/AdultCognitionNeuropsychologyandLanguage/AdultAttentionExecutiveFunction/TestofEverydayAttention(TEA)/TestofEverydayAttention(TEA).aspx" target="_blank">Test of Everyday Attention</a> toolkit, a cognition test based on real-life scenarios.</p> <p>After the participant made an initial guess, Pepper offered them an 
alternative suggestion – this was always the right answer. The participant could then choose to listen to Pepper or go with his or her original answer.</p> <p>The results showed that people were more willing to switch to friendly Pepper’s suggestions than those of authoritative Pepper.</p> <p><em>Image credit: Shutterstock</em></p> <p><em>This article was originally published on <a rel="noopener" href="https://cosmosmagazine.com/technology/robotics/beware-the-robot-bearing-gifts/" target="_blank">cosmosmagazine.com</a> and was written by Deborah Devis.</em></p> </div> </div>
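To make the protocol concrete, here is an illustrative Python simulation of a single persuasion trial: the participant answers, Pepper (whose suggestion is always correct) offers an alternative, and the participant switches with a persona-dependent probability. The switch rates and the friendly reward amount are invented; only the $10 stake and the $10 penalty come from the article.

```python
# Illustrative simulation of the persuasion protocol described above.
# Switch rates and the $1 friendly reward are invented assumptions;
# the $10 stake and $10 penalty follow the article.
import random

SWITCH_RATE = {"friendly": 0.6, "authoritative": 0.4}  # hypothetical

def run_trial(persona: str, balance: float) -> float:
    participant_correct = random.random() < 0.5       # initial guess
    if not participant_correct and random.random() < SWITCH_RATE[persona]:
        participant_correct = True                    # took Pepper's (correct) advice
    if persona == "friendly":
        return balance + (1.0 if participant_correct else 0.0)
    return balance - (0.0 if participant_correct else 10.0)

balance = 10.0                                        # starting stake, per the article
for _ in range(5):
    balance = run_trial("friendly", balance)
print(f"balance after 5 friendly trials: ${balance:.2f}")
```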

Technology


Caves in northern Greece are being showcased by a robot tour guide

<p><span style="font-weight: 400;">A new tour guide in Greece is attracting tourists from all over the world, but for a very unusual reason. </span></p> <p><span style="font-weight: 400;">Persephone has been welcoming tourists to the Alistrati Cave in northern Greece since mid-July, but not all of the visitors are coming to see the caves. </span></p> <p><span style="font-weight: 400;">Persephone is the world’s first robot tour guide inside a cave, which covers the first 150 metres of the tour that is open to the public, before a human guide takes over. </span></p> <p><span style="font-weight: 400;">The robot can give its part of the tour in 33 languages and interact with visitors at a basic level in three languages. </span></p> <p><span style="font-weight: 400;">It can also answer most questions, but only in the Greek language. </span></p> <p><span style="font-weight: 400;">The robot’s name comes from an ancient Greek myth, where it was said that in a nearby plain that Pluto — the god of the underworld who was also known as Hades — abducted Persephone, with the consent of her father Zeus, to take her as his wife.</span></p> <p><span style="font-weight: 400;">Nikos Kartalis, the scientific director for the Alistrati site, said the idea of creating a robot guide came to him when he saw one on TV guiding visitors at an art gallery.</span></p> <p><span style="font-weight: 400;">Nikos said the robot finally became a reality after getting funding, with the build of the machine costing AUD$139,000.</span></p> <p><span style="font-weight: 400;">"We already have a 70 per cent increase in visitors compared to last year since we started using" the robot, says Kartalis.</span></p> <p><span style="font-weight: 400;">"People are enthusiastic, especially the children, and people who had visited in the past are coming back to see the robot guide."</span></p> <p><span style="font-weight: 400;">"It is something unprecedented for them, to have the ability to interact with their robot by asking it questions and the robot answering them," he said.</span></p> <p><span style="font-weight: 400;">The caves have been a regular tourist spot since they opened to visitors in 1998, with people coming from all over the world to explore the three million year old site.</span></p> <p><em><span style="font-weight: 400;">Image credit: YouTube</span></em></p>

Travel Trouble


Tesla unveils new humanoid robot at an awkward event

<p><span style="font-weight: 400;">Tesla CEO and billionaire Elon Musk has confused people with his latest tech product launch. </span></p> <p><span style="font-weight: 400;">At Tesla’s AI Day event, Musk announced his new humanoid “Tesla bot”, which prompted one analyst to call the project a “head-scratcher that will further agitate investors.”</span></p> <p><span style="font-weight: 400;">The entrepreneur said a 172cm, 56kg prototype robot could be ready as soon as next year. </span></p> <p><span style="font-weight: 400;">Instead of waiting until a prototype was ready for the launch, Musk brought out a man in a  latex bodysuit that was created to look like the robot’s design. </span></p> <p><span style="font-weight: 400;">In a bizarre twist, when the “robot” came on stage, they broke out in a dance routine lasting one minute before Musk took to the stage. </span></p> <p><span style="font-weight: 400;">Musk didn’t give many details on the Tesla bot, but insisted it will have a “profound” impact on the economy by driving down labour costs. </span></p> <p><span style="font-weight: 400;">“But not right now because this robot doesn’t work,” Musk noted, nonetheless insisting that, “In the future, physical work will be a choice.”</span></p> <p><span style="font-weight: 400;">“Talk to it and say, ‘please pick up that bolt and attach it to a car with that wrench,’ and it should be able to do that,” Musk said. </span></p> <p><span style="font-weight: 400;">“‘Please go to the store and get me the following groceries.’ That kind of thing. I think we can do that.”</span></p> <p><span style="font-weight: 400;">Musk says that the robot’s primary purpose will be to complete tasks that are “boring, repetitive and dangerous”, giving more free time to individuals who can afford the robot.</span></p> <p><span style="font-weight: 400;">After onlookers raised concerns, Musk said the robot will be designed so that humans can easily run away from or overpower it if needed. </span></p> <p><span style="font-weight: 400;">The Tesla CEO said the robot, which has been named Optimus, will run off the same chips and sensors as Tesla’s so-called Autopilot software, which has faced intense backlash from federal regulators and politicians. </span></p> <p><span style="font-weight: 400;">Twitter users reacted to the news of the Tesla bot with an abundance of memes, saying the idea seemed to be straight out of a movie that does not end well for humankind. </span></p> <p><span style="font-weight: 400;">Check out the unusual “prototype” unveiling below:</span></p> <p><iframe width="560" height="315" src="https://www.youtube.com/embed/TsNc4nEX3c4" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe></p> <p><em><span style="font-weight: 400;">Image credits: Getty Images/Youtube</span></em></p>

Technology


Robots and drones: The new age of toys

<p>I’m a geek. And as a geek, I love my tech toys. But over time I’ve noticed toys are becoming harder to understand.</p> <p>Some modern toys resemble advanced devices. There are flying toys, walking toys, and roving toys. A number of these require “configuring” or “connecting”.</p> <p>The line between toy, gadget and professional device is blurrier than ever, as manufacturers churn out products including <a href="https://www.t3.com/features/best-kids-drones">drones for kids</a> and <a href="https://www.amazon.com/Hidden-Spy-Nanny-Camera-Wi-fi/dp/B07P7BCYZT">plush toys with hidden nanny cams</a>.</p> <p>With such a variety of sophisticated, and sometimes over-engineered products, it’s clear manufacturers have upped their game.</p> <p>But why is this happening?</p> <p><strong>The price of tech</strong></p> <p>Toys these days seem to be designed with two major components in mind: smarts and rapid manufacture.</p> <p>In modern toys, we see a considerable level of programmed intelligence. This can be used to control the toy’s actions, or have it respond to input to provide real-time feedback and interaction – making it appear “smarter”.</p> <p>This is all made possible by the falling price of technology.</p> <p>Once upon a time, placing a microcontroller (a single-chip computer) inside a toy was simply uneconomical.</p> <p>These days, they’ll <a href="https://au.rs-online.com/web/c/semiconductors/processors-microcontrollers/microcontrollers/">only set you back a few dollars</a> and provide significant computing power.</p> <p>Microcontrollers are often WiFi and Bluetooth enabled, too. This allows “connected” toys to access a wide range of internet services, or be controlled by a smartphone.</p> <p>Another boon for toy manufacturers has been the rise of prototyping technologies, including 3D modelling, 3D printing, and low-cost CNC (computer numerical control) milling.</p> <p>These technologies allow the advanced modelling of toys, which can help design them to be “tougher”.</p> <p>They also allow manufacturers to move beyond simple (outer) case designs and towards advanced multi-material devices, where the case of the toy forms an active part of the toy’s function.</p> <p>Examples of this include hand grips (found on console controllers and toys including Nerf Blasters), advanced surface textures, and internal structures which support shock absorption to protect internal components, such as wheel suspensions in toy cars.</p> <p><strong>Bot helpers and robot dogs</strong></p> <p>Many recent advancements in toys are there to appease our admiration of automatons, or self-operating machines.</p> <p>The idea that an inanimate object is transcending its static world, or is “thinking”, is one of the magical elements that prompts us to attach emotions to toys.</p> <p>And manufacturers know this, with some toys designed specifically to drive emotional attachment. My favourite example of this is roaming robots, such as the artificially intelligent <a href="https://www.anki.com/en-us/vector.html">Anki Vector</a>.</p> <p>With sensors and internet connectivity, the Vector drives around and interacts with its environment, as well as you. It’s even <a href="https://www.amazon.com/Vector-Robot-Anki-Hangs-Helps/dp/B07G3ZNK4Y">integrated with Amazon Alexa</a>.</p> <p>Another sophisticated toy is Sony’s Aibo. This robot pet shows how advanced robotics, microelectronics, actuators (which allow movement), sensors, and programming can be used to create a unique toy experience with emotional investment.</p> <p><strong>Screens not included</strong></p> <p>Toy manufacturers are also leveraging the rise of smartphones and portable computing.</p> <p>Quadcopters (or drones) and other similar devices often don’t need to include their own display in the remote control, as video can be beamed to an attached device.</p> <p>Some toys even use smartphones as the only control interface, usually via an app, saving manufacturers from having to provide what is arguably the most expensive part of the toy.</p> <p>This means a smartphone becomes an inherent requirement, without which the toy can’t be used.</p> <p>It would be incredibly disappointing to buy a cool new toy, only to realise you don’t own the very expensive device required to use it.</p> <p><strong>My toys aren’t spying on me, surely?</strong></p> <p>While spying may be the last thing you consider when buying a toy, there have been several reports of talking dolls <a href="https://www.npr.org/sections/alltechconsidered/2016/12/20/506208146/this-doll-may-be-recording-what-children-say-privacy-groups-charge">recording in-home conversations</a>.</p> <p>There are similar concerns with smart-home assistants such as Amazon Alexa, Google Assistant and Apple’s Siri, which store <a href="https://www.politifact.com/truth-o-meter/statements/2018/may/31/ro-khanna/your-amazon-alexa-spying-you/">your voice recordings in the cloud</a>.</p> <p>These concerns might also be warranted with toys such as the Vector and Aibo.</p> <p>In fact, anything that has a microphone, camera or wireless connectivity can be considered a privacy concern.</p> <p><strong>Toys of the future</strong></p> <p>We’ve established toys are becoming more sophisticated, but does that mean they’re getting better?</p> <p><a href="https://www.gartner.com/smarterwithgartner/gartner-top-10-strategic-technology-trends-for-2020/">Various</a> <a href="https://www.accenture.com/us-en/insights/technology/technology-trends-2019">reports</a> indicate that in 2020, artificial intelligence (AI) and machine learning will continue to be pervasive in our lives.</p> <p>This means buying toys could become an even trickier task than it currently is. There are some factors shoppers can consider.</p> <p>At the top of my list of concerns is the type and number of batteries a toy requires, and how to charge them.</p> <p>If a device has <a href="https://theconversation.com/nearly-all-your-devices-run-on-lithium-batteries-heres-a-nobel-prizewinner-on-his-part-in-their-invention-and-their-future-126197">in-built lithium batteries</a>, can they be easily replaced? And if the toy is designed for outdoors, <a href="https://theconversation.com/why-batteries-have-started-catching-fire-so-often-68602">can it cope with the heat?</a> Most lithium-ion batteries degrade quickly in hot environments.</p> <p>And does the device require an additional screen or smartphone?</p> <p>It’s also worth being wary of what personal details are required to sign up for a service associated with a toy – and whether the toy can still function if its manufacturer ceases to exist or goes bust.</p> <p>And, as always, if you’re considering an advanced, “connected” toy, make sure to prioritise your security and privacy.</p> <p><em><a href="https://theconversation.com/profiles/andrew-maxwell-561482">Andrew Maxwell</a>, Senior Lecturer, <a href="http://theconversation.com/institutions/university-of-southern-queensland-1069">University of Southern Queensland</a></em></p> <p><em>This article is republished from <a href="http://theconversation.com">The Conversation</a> under a Creative Commons license. Read the <a href="https://theconversation.com/robots-ai-and-drones-when-did-toys-turn-into-rocket-science-127503">original article</a>.</em></p>

Technology


A robot is currently writing a book with a human

<p><em><strong>Leah Henrickson is a PhD candidate at Loughborough University’s School of the Arts, English and Drama.</strong></em></p> <p>In <em>I, Robot</em>, a collection of nine short stories about robotics, Isaac Asimov explores the possibilities of human-computer interaction. How can humans and computers co-exist? How can they work together to make a better world?</p> <p>A research group from the <span style="text-decoration: underline;"><strong><a href="https://www.meertens.knaw.nl/cms/en/" target="_blank">Meertens Instituut</a></strong></span> in Amsterdam and the <span style="text-decoration: underline;"><strong><a href="https://www.uantwerpen.be/en/rg/digitalhumanities" target="_blank">Antwerp Centre for Digital Humanities and Literary Criticism</a></strong></span> recently introduced a new digital creative writing system. Using a graphical interface, an author drafts a text sentence by sentence. Then, the system proposes its own sentences to continue the story. The human and the computer work together to create what the system’s developers call “synthetic literature”.</p> <p>The <span style="text-decoration: underline;"><strong><a href="http://aclweb.org/anthology/W17-3904" target="_blank">paper</a></strong></span> detailing this project describes the text generation system as an attempt to:</p> <p><em>Create a stimulating environment that fosters co-creation: ideally, the machine should output valuable suggestions, to which the author retains a significant stake within the creative process.</em></p> <p><strong>How to train your robot</strong></p> <p>To learn language and sentence structure, the system has been trained on the texts of 10,000 Dutch-language e-books. Additionally, the system was trained to mimic the literary styles of renowned authors such as Asimov and Dutch science fiction author <span style="text-decoration: underline;"><strong><a href="http://www.imdb.com/name/nm0320554/" target="_blank">Ronald Giphart</a></strong></span> by generating sentences that use words, phrases and sentence structures similar to those authors’ own.</p> <p>As part of this year’s <span style="text-decoration: underline;"><strong><a href="https://www.nederlandleest.nl/" target="_blank">Nederland Leest</a></strong></span> (The Netherlands Reads) festival, Giphart has been trialling the co-creative writing system to write a tenth <em>I, Robot</em> story. Once Giphart’s story is completed, it will be published at the end of a new Dutch edition of Asimov’s classic text. Throughout November, participating libraries across the Netherlands will be offering free copies of this edition to visitors, to get people thinking about this year’s festival theme: Nederland Leest de Toekomst (The Netherlands Reads the Future).</p> <p>As Giphart types new sentences into the system’s graphical interface, the system responds by generating a selection of sentences that could be used to continue the story. Giphart can select any of these sentences, or ignore the system’s recommendations altogether.</p>
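<p>In outline, the interaction is a simple loop: the author submits the story so far, a language model proposes several candidate continuations, and the author accepts one or rejects them all. The sketch below is not the Meertens Instituut system, whose Dutch models and training data are its developers’ own; it is a minimal illustration of the same suggest-and-select loop, assuming the freely available GPT-2 model and the Hugging Face transformers library. The story prompt is invented.</p>
<pre><code>from transformers import pipeline

# Stand-in model: the real system was trained on 10,000 Dutch-language
# e-books and on the styles of Asimov and Giphart.
generator = pipeline("text-generation", model="gpt2")

story = "The robot stared at the blank page, unsure how to begin."

# Propose three candidate continuations for the story so far.
candidates = generator(story, max_new_tokens=30,
                       num_return_sequences=3, do_sample=True)

for i, c in enumerate(candidates):
    # Show only the newly generated text, not the prompt itself.
    print(i, c["generated_text"][len(story):].strip())

# The author picks one suggestion, or ignores them all and keeps
# typing; either way the chosen sentence is appended to the story
# and the loop repeats.
</code></pre>
<p>Swapping in a model fine-tuned on a particular author’s books is what would give a system like this its style-mimicking behaviour.</p>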
<p>The point of the system, its developers explain, is to “provoke the human writer in the process of writing”. Giphart says he <span style="text-decoration: underline;"><strong><a href="https://www.ad.nl/wetenschap/ronald-giphart-experimenteert-met-literaire-robot~a5a3cb9f" target="_blank">still considers himself</a></strong></span> “the boss, but [the system] does the work”. One <span style="text-decoration: underline;"><strong><a href="http://www.welingelichtekringen.nl/tech/699192/ronald-giphart-gaat-boek-schrijven-met-robot-ik-ben-de-baas-maar-hij-doet-het-werk.html" target="_blank">article</a></strong></span> even described the system as being ideal “for those who have literary aspirations, but who lack talent”.</p> <p><strong>Can a computer be creative?</strong></p> <p>The “synthetic literature” referred to by the system’s developers implies a combined production effort of human and computer. Of course, the human still guides production. As co-developer Folgert Karsdorp <span style="text-decoration: underline;"><strong><a href="https://www.ad.nl/wetenschap/ronald-giphart-experimenteert-met-literaire-robot~a5a3cb9f" target="_blank">explained</a></strong></span>: “You have numerous buttons to make your own mix. If you want to mix Giphart and Asimov, you can do that too.” The system follows its user’s direction, responding with its own capacity for creativity.</p>
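<p>The “mix” Karsdorp describes can be pictured as nothing more exotic than drawing candidate sentences from two style-specific models in a chosen proportion. The snippet below is a guess at the mechanics, not the developers’ actual implementation: the two pipelines, and the names <code>giphart_model</code> and <code>asimov_model</code>, are hypothetical stand-ins for models fine-tuned on each author’s texts.</p>
<pre><code>from transformers import pipeline

# Hypothetical stand-ins for models fine-tuned on each author's texts;
# plain GPT-2 is used here so the sketch actually runs.
giphart_model = pipeline("text-generation", model="gpt2")
asimov_model = pipeline("text-generation", model="gpt2")

def mixed_candidates(story, mix=0.5, n=4):
    """Return n candidate sentences, a fraction `mix` in Giphart's style."""
    n_giphart = round(n * mix)
    plan = [(giphart_model, n_giphart), (asimov_model, n - n_giphart)]
    out = []
    for model, count in plan:
        if count == 0:
            continue
        results = model(story, max_new_tokens=30,
                        num_return_sequences=count, do_sample=True)
        # Keep only the newly generated continuation text.
        out.extend(r["generated_text"][len(story):].strip() for r in results)
    return out

# mix=1.0 gives pure Giphart; mix=0.0 gives pure Asimov.
print(mixed_candidates("The robot hesitated.", mix=0.75))
</code></pre>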
<p>But can a computer ever be truly creative? This is a question the field of computational creativity has been studying since computers were invented. The field generally accepts that a computer can be called creative if its output would be considered creative had it been produced by a human.</p> <p>Computational creativity debates are all rooted in one underlying question: is the computer merely a tool for human creativity, or could it be considered a creative agent itself? In a discussion about computer-generated art, creativity scholar Margaret Boden <span style="text-decoration: underline;"><strong><a href="https://books.google.co.cr/books?id=MSoUDAAAQBAJ&amp;lpg=PP1&amp;pg=PA185#v=onepage&amp;q&amp;f=false" target="_blank">noted</a></strong></span> that:</p> <p><em>It is the computer artist [the developer] who decides what input a system will respond to, how the system will respond, how unpredictable the system’s output will be, and how transparent the system’s functionality will be to users.</em></p> <p>Even the most unpredictable output, according to Boden, results from choices the computer artist has made. While a developer may not be able to predict a system’s exact output, the output nevertheless reflects the choices the developer made while programming.</p> <p>The co-creative writing system Giphart is using isn’t able to produce an entire book by itself, but it can produce paragraphs that continue Giphart’s story for him. Giphart, though, ultimately has the power to choose which computer output he uses.</p> <p>But does this mean that Giphart alone will be credited as the author of his <em>Ik, robot</em> story, or will his computer be given credit as a co-author? It’s still unclear. Although it could be hotly debated whether the creative writing system is just a tool for Giphart’s vision or an agent in its own right, we won’t be seeing the demise of human authors any time soon.</p> <p>One Nederland Leest blog <span style="text-decoration: underline;"><strong><a href="https://www.nederlandleest.nl/mens-en-machine-schrijven-samen-literair-verhaal" target="_blank">post</a></strong></span> compares this new method of writing to the evolution of the electric guitar. The instrument may have existed for nearly a century, but it wasn’t until Jimi Hendrix showed us how to really play it that its potential was realised. Similarly, we still need to discover how to “play” this writing system to get the best results, whatever they might be.</p> <p>So is synthetic literature the future? Maybe. Only time will tell.</p> <p><em>Written by Leah Henrickson. Republished with permission of <a href="http://theconversation.com/" target="_blank"><strong><span style="text-decoration: underline;">The Conversation</span></strong></a>.<img width="1" height="1" src="https://counter.theconversation.com/content/84932/count.gif?distributor=republish-lightbox-advanced" alt="The Conversation"/> </em></p>

Books


Would you trust a robot with your life?

<p>First, let's put the Terminator to bed, sort of.</p> <p>Dr Simon Kos is the chief medical officer of Microsoft's $10 billion-plus worldwide health unit. He was in Rotorua recently as the keynote speaker at the Health Informatics NZ Conference, and in the refined environment of the Millennium Hotel Club Room, we're discussing the staggering advances in artificial intelligence (AI).</p> <p>Stephen Hawking recently claimed AI could be the end of mankind. Could the technology go rogue?</p> <p>"The short answer is yes," said Kos.</p> <p>"It makes a great plot for a sci-fi movie and the reason I say yes is because in the early experimentation of some of this AI we had some of our own direct learnings."</p> <p>Kos is talking about Microsoft's AI 'chatbot' Tay, which after being linked to Twitter did indeed go rogue.</p> <p>He described Tay as a "garbage in, garbage out" AI; when the garbage fed in included sexist and racist content, "soon Tay was starting to respond with this sort of material".</p> <p>Now the good news.</p> <p>"Extend that to the Terminator 'Judgement Day' scenario. Would you have an AI agent with the keys to nuclear weapons? (You) probably wouldn't."</p> <p>Kos said the Tay experience proved the need for AI to have human checks and balances, something he said is quite a simple process.</p> <p>"(It is) the junction between a future of computers going rogue and how we can shape the future we want."</p> <p>Kos began his career as a frontline doctor in Australia in the 1990s, working in intensive care and emergency units. His decision to join Microsoft's health unit in 2010 was prompted by the growing digitisation of the medical world.</p> <p>The second prompt can be traced back as far as the third century BC and the Hippocratic Oath's instruction to 'first do no harm'.</p> <p>"If I projected out my clinical career I would have caused patient harm by not having the right information to make the right decisions," he said.</p> <p>"I thought, who's fixing that?"</p> <p>One of Microsoft's current fixes is the AI Project Hanover. Kos said this was essentially a search system on steroids, using natural language processing to let medical researchers home in on all the required research on a subject - he uses the example of a specific cancer gene.</p> <p>"If you want to stay current in your medical literature, you should be reading about 28 hours a day. This is the conundrum modern clinicians face," he said.</p>
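<p>Microsoft has not published Project Hanover's internals, but the underlying idea - ranking a mountain of papers by their relevance to a precise query - can be illustrated with a toy retrieval example. The sketch below uses TF-IDF from scikit-learn; the three "abstracts" are invented for illustration and bear no relation to Hanover's actual corpus or methods.</p>
<pre><code>from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented stand-ins for paper abstracts.
abstracts = [
    "TP53 mutations and tumour suppression pathways in lung cancer.",
    "Thermal degradation of lithium-ion batteries in consumer devices.",
    "BRCA1 variants and hereditary breast cancer risk assessment.",
]

# Vectorise the corpus, then score every abstract against the query.
vectoriser = TfidfVectorizer(stop_words="english")
matrix = vectoriser.fit_transform(abstracts)
query = vectoriser.transform(["cancer gene mutations"])
scores = cosine_similarity(query, matrix).ravel()

# Print abstracts from most to least relevant.
for idx in scores.argsort()[::-1]:
    print(f"{scores[idx]:.2f}  {abstracts[idx]}")
</code></pre>
<p>A production system would use far richer language understanding than word counts, but the shape of the task - query in, ranked literature out - is the same.</p>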
<p>Kos is optimistic not just about the future, but about the present.</p> <p>He said we already have robots conducting prostate operations with a far higher success rate than human surgeons; Project Hanover is speeding up research; and a US project using AI to speed up retinal exams for diabetics has seen compliance rates for the yearly checks increase from 33-50 per cent to more than 80 per cent.</p> <p>Virtual reality and hologram technology are coming on in leaps and bounds too, he said. The surgeon in the US operating on the patient in Japan? "We're almost there."</p> <p>There's a frustration too, though.</p> <p>Kos said the digitisation of medical information was creating an ever-expanding source of untapped knowledge.</p> <p>"We don't have permission to break open that black box, look at the information and use it, but that's the gold mine sitting there. Your information could be the next breakthrough that leads us to the cure for cancer."</p> <p>Kos said Microsoft was already "aggressively" lobbying European governments for access to that gold mine, with reassurances that proper privacy checks would be in place.</p> <p>He's also slightly frustrated by what he refers to as the lag time: the gap between medical best practice being established and its adoption on the front line.</p> <p>"X-rays on pregnant ladies. That persisted long after the evidence was out. My hope is that with AI reducing the latency between when we've got the findings and when it comes into clinical practice, we can start to get better patient outcomes."</p> <p>He said the recent WannaCry cyber attack on the British National Health Service shouldn't dampen enthusiasm for technology use, but rather act as a wake-up call to keep systems updated.</p> <p>He's aware of the darker side of the tech space though, even referring to cases where building management systems have been hijacked and hackers have threatened to turn off operating theatre lights mid-procedure.</p> <p>It's big business, he said, with a black market existing for medical data.</p> <p>"Nation state actors and national espionage, we absolutely see that. We track that through our digital crimes unit. I probably can't say much more than that," he said.</p> <p>Overall, however, Kos sees the benefits of the technology as far outweighing the risks, provided the human element remains.</p> <p>"I very much see it as human augmentation rather than human replacement," he said.</p> <p>"We're far from being redundant."</p> <p>What are your thoughts?</p> <p><em>Written by Benn Bathgate. Republished with permission of <a href="http://www.stuff.co.nz/" target="_blank"><strong><span style="text-decoration: underline;">Stuff.co.nz</span></strong></a>. </em></p>

Technology


This tiny robot is helping people with dementia

<p>Meet “Matilda”. She might look like a futuristic kids’ toy, but in reality, she could be the secret to improving the quality of life for thousands of dementia patients. The tiny, adorable “social robot” has been programmed to recognise human voices and facial expressions, as well as to dance, play music and call out bingo numbers.</p> <p>Developed by NEC Japan in conjunction with La Trobe University’s Research Centre for Computers, Communication and Social Innovation, Matilda was put to the test among 115 aged care patients between 2010 and 2013 and assessed for her ability to engage with those living with dementia. The Australian trial was an overwhelming success, with just two per cent of users concerned by the robot’s presence.</p> <p>Not only is Matilda providing dementia sufferers with some much-needed company, but her ability to issue reminders throughout the day, read the day’s news and weather forecast and even make phone calls over Skype is giving patients back some independence.</p> <p>“The findings of this study indicate there is a statistically significant improvement in emotional, visual, and behavioural engagement of older people with social robots over the years,” lead researcher Professor Rajiv Khosla writes. “The post-trial survey has also verified their acceptance in the interaction with social robots.”</p> <p>Just under 90 per cent of the patients involved in the trial said they enjoyed seeing Matilda dance, 75 per cent felt relaxed talking to her, and a very promising 88 per cent said the robot made them feel better.</p> <p>“The results implicate that by socially engaging older people with meaningful activities provided and [mediated] by Matilda, we are able to break technology barriers and encourage acceptance of Matilda amongst the older residents.”</p> <p>Here’s hoping we see more of this kind of technological innovation in aged care facilities! Tell us in the comments below: are you one of the lucky few who got to witness Matilda interacting with a loved one?</p> <p><em>Image: La Trobe</em></p>

Caring