The search for robots does not always end with finding discrete autonomous actors. The picture is more complex. On 2 March 2013, I travelled seven hours west to visit the Open Day at Rio Tinto’s copper mine at Northparkes, NSW (near Parkes). This visit was an opportunity to experience and understand some of the robots on the site, and the other actors around them. These actors smoothed and accelerated the movement of ore from 600m underground to the surface, onto stockpiles, and finally by train to the ports.
Among the robotic actors on this site was the Load Haul Dump (LHD) vehicle from Swedish manufacturer Sandvik. This vehicle is a hybrid manually operated / remotely operated / autonomous rock mover. It is a robot that hauls its load from draw points deep underground, and brings it nearer the surface for crushing and refining.
As it turns out, the LHD is only one among many kinds of robot actor changing the mine’s technological shape. Mining is slowly changing from a series of discrete tasks performed by different actors. Each smoothing works towards turning the mine into a continuous process with greater ongoing measurement and control, in the name of efficiency (continuous mining is on the long-term agenda for many miners).
The story is not that the mine contains robots — it is the whole mine itself that is becoming robotic. More and more components afford remote sensing, feedback and continuous control. Surveillant components (cameras, sensors, robot-mounted cameras and so on) offer miners various kinds of agency that bring more consistency to managing control flows. Even the conveyor belt that takes crushed copper ore to the stockpile measures the ore as it passes over a weightometer.
The conveyor belt takes crushed ore to the stockpile
Beyond Northparkes itself, Rio Tinto has an eye to the future. In the Pilbara, WA, it has introduced enormous robotic dump trucks, autonomous drills, and soon autonomous trains. The robotic mine of the future is being built one component at a time, motivated by deeper ambitions of efficiency and control. For now, miners’ bodies and minds remain the dominant actors in most mining practices. The ‘Mine of the Future’ operates as a present guiding vision, serving as both internal mantra and PR rhetoric.
The vision of a mine without humans on site is, perhaps, compelling for many. Certainly it extends management’s control. Many workers prefer the conditions of remote operations and control centres. Some external observers see the value of this change. Human bodies are clearly outside their element when digging up elements. Bodies are inherently vulnerable in underground environments, and in the presence of massive machinery and explosives. Safety is the mine’s ubiquitous guiding force. Miners’ fluoro jackets and safety helmets are a uniform for those avoiding risk.
The body’s capacities to complete tasks repeatedly, and precisely, are also limited, in comparison with many emerging devices. However, the introduction of new devices is quite uneven. On the site, hi-tech gear sits alongside traditional tools. The mine uses up-to-date monitoring systems alongside a tag board at the mine entrance. Each miner must post their tag onto the appropriate spot on the Surface Tag Board when they go underground. Until all the tags are accounted for, there will be no blasting.
Safety serves a double role, imposing control over risky situations, and justifying greater control over miners’ actions. At one level, mining control regimes are undoubtedly justified by the high level of risk. An accident in this mine in 2003 killed four workers (Hebblewhite 2003). On the other hand, danger is management’s collaborator, justifying tighter control over the workforce. The logic of the safety/surveillance pair is gradually bringing a regime of control to mine sites (Deleuze 1992). Remote systems, feedback, and constant training of workers are less a mode of surveillance from outside, and more a control over thresholds of movement.
The risk of deviance is tripled when the possibility of surveillance, the actual risky environment, and the technologies placing the worker under control combine. Control displaces and reconfigures the labouring body for as long as it takes to remove the bodies from risks. When explosives are involved there is no option but to remove workers’ bodies from the location.
Dynamite is a 150-year-old technology that introduced the non-human force of explosives to reduce hand-digging. The technique of block cave mining used at Northparkes is an efficient (but not particularly safe) technique that uses explosives to create massive rockfalls underground. These funnel the fractured ore into draw points, leaving the ore exposed, but in the dangerous location under the rockfall.
The showcase of the site is the Sandvik AutoMine LH514E LHD: a bright orange vehicle with a large scoop at the front. The vehicle can be controlled remotely from the surface. It also features laser scanners and on-board intelligence that allow it to take control of itself and follow a trail towards the surface. This remote-controlled and autonomous system was considered a trail-blazing implementation in 2010. These new technologies remove operators from the most dangerous places, and return them to a more controlled environment.
Becoming the load in a Sandvik Load Haul Dump (LHD) vehicle.
The robotic components: laser scanners and dead-reckoning tags deep underground guide the LHD along its trail.
This installation at Northparkes is strategically important for Sandvik, the Swedish company that produces this vehicle, in an Australian mining environment dominated by Caterpillar. It is also part of rapid changes in mining that are withdrawing human bodies from underground and moving workers into control rooms. These changes are altering the profile of workers, and possibly jettisoning those who don’t have the right profile of expertise.
Rio Tinto’s open day is itself a form of smoothing, building relationships, and removing potential obstructions in public opinion or in the expectations of potential employees. Rio Tinto is very active in controlling perceptions of the company. They produce an array of reports, websites, media releases and videos. For example, ‘The Miracle of Copper’ offers an award-winning, company-friendly account of the processes of copper mining. Through the latest vehicles and open days, training and public videos, websites and conference presentations, Rio is communicating the values of its ‘Mine of the Future’. The company has extended its regimes of control away from disciplined secrecy (such as at Ok Tedi) and towards the smoothed operations of PR and automation.
Deleuze, G. (1992). Postscript on the Societies of Control. October, 59, 3–7.
Hebblewhite, B. K. (2003). Northparkes Findings – The implications for geotechnical professionals in the mining industry, 1–8. (see links)
Rio Tinto (2010). ‘Ore processing’. Northparkes website. http://www.northparkes.com.au/ore_processing.aspx
Here is the abstract for my paper at the 2012 Cultural Studies Association of Australia conference, which I presented on December 5, 2012.
Mining automation, displaced labour and materialities of communication
Digital Cultures, University of Sydney
Information systems, remote operation and robotics are currently being introduced into mines around the world. As miners reconfigure communication, control and labour, mining practices that have barely changed in a century are being transformed. This paper analyses innovations such as remote operation of mining, and autonomous systems as media changes, as well as changes in labour processes. The paper follows in reverse the historical arc of Harold Innis, who began in geographical economics (cod, fur, railways in Canada) before pioneering a materialist, longue durée historical media theory.
Mining is among the most basic material human practices. The blasting, loading, hauling, processing and shipping of iron ore is a rudimentary process performed on a huge scale. Digital systems don’t immediately change these material practices, but introduce new information and control flows. The autonomous Komatsu trucks now hauling ore in the Pilbara are little different physically from the human-driven fleet, but afford a precision, continuity, and smoothness of operation that human drivers could not tolerate. Digital media are valued in mining for their greater ‘efficiencies’, and their centralising and visualisation of monitoring and control of mine sites, which can be thousands of kilometres apart. These changes in machine/material communications and autonomy have implications for the kinds of work, the kinds of workers, and the kinds of communities that can cooperate with the mines, and many other workplaces, of the future.
Notes for Chris Chesher on ABC Northwest (Karratha)
September 3, 2012.
At 1030am I talked with Cristy-Lee Macqueen from ABC Northwest.
Mine sites are changing, as robotic technologies are taking on communication and control roles previously held by people. These changes have been coming for some time, but there has recently been a shift from trialling autonomous systems towards using them in production.
In 2008 the first autonomous trucks were introduced experimentally, carrying waste products at Rio Tinto’s West Angelas mine. The trials seem to have been a success: between them, the five Komatsu autonomous trucks covered 570,000 kilometres over 897 days at work up until February this year.
The old model: Komatsu 830 with human drivers.
In late 2011, the autonomous trucks were reassigned, entering the iron ore production process along with five new trucks, hauling ore at the Junction South East pit of Rio’s Yandicoogina mine.
These ten trucks will undoubtedly be joined by more. Rio Tinto reached an understanding with Komatsu in November 2011 to buy 150 Komatsu Autonomous Haulage System trucks over the following four years. It’s not clear what impact the iron ore price slump will have on these acquisitions, or how they will fit into Rio’s overall processes.
Komatsu reports that these imposing trucks are fitted with a range of sensors that allow them to operate very safely and accurately. They use laser, radar, GPS, and communications systems to follow a digital map of the mine site with great precision. The trucks are coordinated from Rio’s control centre 1500 km away in Perth.
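As a rough illustration (not Komatsu’s actual Autonomous Haulage System, whose internals are proprietary), following a digital map can be reduced to steering towards a sequence of surveyed waypoints. The function name and tolerance below are assumptions made for the sketch:

```python
import math

def follow_waypoints(position, waypoints, tolerance=5.0):
    """Steer towards the next waypoint on a digital map.

    position and waypoints are (easting, northing) pairs in metres.
    Returns the heading (radians) towards the first waypoint further
    than `tolerance` away, or None once the route is complete.
    """
    x, y = position
    for wx, wy in waypoints:
        if math.hypot(wx - x, wy - y) > tolerance:
            return math.atan2(wy - y, wx - x)
    return None  # all waypoints reached

# A truck at the origin, with a haul road mapped as two waypoints
heading = follow_waypoints((0.0, 0.0), [(100.0, 0.0), (100.0, 100.0)])
```

In a real system a loop like this would run continuously, fusing GPS with laser and radar sensing; here the position is simply taken as given.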
In addition to these developments, Rio has committed over $400 million to automating trains over the next few years. Other parts of the mining process, such as drills, are being automated, or being tagged with location beacons.
Safety is one of the motivations for introducing autonomous systems. A driverless vehicle can’t injure the driver. Autonomous systems don’t have lapses in attention, or drive erratically.
Another reason is to increase production efficiency. Autonomous trucks don’t take breaks. They don’t need to work in shifts. Together, these autonomous systems can work towards the goal of continuous production, where the mine produces an uninterrupted stream of ore.
I’m an academic at the University of Sydney. I am here in Karratha trying to get a sense of how people in the Pilbara feel about the changes to mining work as automation is introduced. I’d appreciate it if anyone with experience of, or opinions about, mine automation would call in. I’m recording this program, and I’d like to use the transcript in my research. You can find more about my project on my blog http://followingrobots.wordpress.com
Whether these goals of safety and efficiency are achieved, it seems likely there will be changes to the experience of mining. It may affect the social life of mining towns.
To bring up a very different example, when mobile phones became available, they seemed at first to be just a phone you could carry around. In fact, they were quite different from fixed phones. They allowed people to change the way they organised their lives. Rather than make detailed arrangements ahead of time, people with mobiles could easily change plans at the last minute. With smart phones, people could make images and change them, making their own media.
Of course, an automated mine is very different from a community of mobile users. The control centre (opened in 2010) gathers detailed information across several mine sites, centralises control, and provides a place for collective expert decision-making. Remote operation allows operators to take over some stages of production, and allows a small number of people to control many machines. The mine site increasingly becomes a rationalised, controlled and regulated rock factory.
Advocates point to potential benefits of automation for workers. It can take away dangerous, dull and dirty work that nobody wants to do. Mine automation may reduce risks of injury and death. By reducing workers on site, it may reduce fly-in-fly-out work, allowing expert operators to work in urban control rooms. This may take social and economic pressure away from remote mining communities. See also BAEconomics Report.
But there are some potential drawbacks: some people may lose their jobs to autonomous systems, and these changes may raise industrial tensions. The high degree of control over mine sites may be extended into new expectations for those working alongside autonomous systems. And dependence on planned communications systems and GPS-guided technology may bring some fragility to autonomous operations, in comparison with the more resilient and adaptable human-operated systems.
The long term implications of large scale use of autonomous systems are yet to be revealed. As WA will soon host the largest fleet of autonomous mining vehicles in the world, the unanticipated implications, and the qualitative shifts in mining practices, are likely to play out here.
If you have experience or opinions about mining automation, please leave your comments below. I may use these comments in my research.
Recently I presented a paper called ‘Materialising robot platforms’ on the affordances, environments and networks of three Korean service robots. The topic of my paper was something of an outlier at a conference called ‘Platform Politics’ at Anglia Ruskin University, Cambridge, organised by Jussi Parikka and Joss Hands.
Most other papers identified either with political theory and technology, or with platform studies: analysing how the underlying technological infrastructures play out in fostering certain social and political outcomes. My paper was closer to the latter category, examining in particular some of the political implications of technological artefacts: the placement of sensors and motors in robots that respond to touch, allow remote teaching, and bow to indicate subservience.
The conference was video recorded in a rudimentary way using UStream, and it is pretty hard to follow the paper from the video. The abstract is below (although of course it doesn’t really reflect what I talked about).
Chris Chesher

Research and development in robotics is currently developing a range of network-connected material platforms. This practice is producing robots increasingly tuned towards particular lifeworlds: language teaching robots in classrooms; service robots in public spaces; container-handling robots in ports; rescue robots in earthquake zones, and so on. These specific platforms diverge significantly from the general-purpose robot of popular imagination; robots are made increasingly real as they are formed by their multiple attachments across physical, social and institutional spaces. This paper draws on recent interviews with researchers at the Australian Centre for Field Robotics, and company representatives at the Robotworld tradeshow in Korea. The interviews examine the rhetoric and practices by which robot platforms are increasingly blackboxed as technical innovations, in ways that are informed by narratives of the application environments and by strategic connections with institutional networks. A robot platform is constituted by a singular combination of elements: sensors, operating systems, programming and effectors (motors, screens, speakers, etc.). However, these components must work together to create a robot that can perform as an autonomous actor, forming relations within specific environments. In talking about the robots, engineers, developers and salespeople often provide rich narratives featuring the robots in particular physical and social environments. Developers are also aware of the institutional connections that will be crucial in securing the robot’s current and future existence. The Korean company Dasarobot’s English language teaching robot must capture the interest of teachers, even outside their direct affiliations with schools. Development communities are establishing core features of contenders for future robot platforms, abstracted below the level of particular applications. For example, many robots use similar autocharging systems to respond autonomously to the common problem of a low battery. Some robots use custom operating systems, while others use existing robot operating systems, such as Willow Garage’s open source ROS or Microsoft’s Robotics Developer Studio. The range of issues in robotic platforms gives the problem of software platforms a material base, as seen in the collaborations and conflicts between the key mechatronics disciplines of software engineering, mechanical engineering and electrical engineering. Meanwhile, as robotic platforms stabilise, there are increasing enrolments of other disciplines: media art; media practice; performance; design; marketing; cinema and so on.
Any robot that moves, performs. But those robots that are built or programmed explicitly to perform can accentuate a repertoire of multiply articulated gestures with naturalistic movements and interaction.
One of the hit exhibits at the CeBIT technology trade show in Hannover in March 2011 was the performing RoboThespian by Engineered Arts from Cornwall. This gangly robot performer was connected to a Microsoft Kinect games controller so it could read the body movements of visitors. It has a certain cheekiness, and a Shakespearean repertoire. Its movements are somewhat more explosive than those of many robots. The designers also exploit lighting and stage sets to good effect.
RoboThespian was built by a company of ten, and engineered over seven years. At least twenty have been installed, including one at Questacon in Canberra.
Another recent notable robotic performance was at TED, featuring Aldebaran’s NAO playing a stand-up robot comic called Data. He was partnered by Heather Knight from Marilyn Monrobot Labs. Data tells a number of pretty old jokes (though I guess he wasn’t invented yet when they were new), and apparently uses software developed at Carnegie Mellon to respond to audience reactions.
It’s apparent that the audience’s experience of the robot’s performance is distinct from their experience of the uncanny appearance of an ultra-realistic robot such as Hiroshi Ishiguro’s.
At another level, Knight’s use of Nao as Data shows that robotic innovation can legitimately take place in software alone.
This video illustrates another practice in robotics that enhances and distorts the human gait: the exoskeleton. In the tradition of bionics, wearers strap a motorised assemblage to their body; the device senses nerve signals running through the limbs, and amplifies these into movements. It is designed for people with poor mobility (broken legs, the aged, etc.) and for rehabilitation. The ‘Hybrid Assistive Limb’ (HAL) is being developed by Japanese scientists at Cyberdyne Corporation and Professor Sankai of Tsukuba University.
The very deliberate, (robotic) gait that wearers adopt when strapped into this is reminiscent of cinematic clichés about how robots move. Rather than allowing movement in-between each step, this device regulates the gait, while giving enhanced strength.
(thanks to Andrew Murphie for the link)
The robot from Cornell University in this video ‘generates a conception of itself’ and improvises ways of moving around. At startup, the design has been left incomplete, and the robot itself finishes the design. As the robot starts up, it moves all its parts to establish its own morphology. If it has been damaged or reorganised, it can adapt to its new body and still improvise getting around.
Unlike the programmed gaits in the previous Following Robots post, this robot belongs to a tradition of self-generative designs. In the documentation, the developers emphasise that this robot generates internal models — diagrams in the robot’s mind that represent its body. The principle of creating mathematical models of the robotic body (and of the artificially intelligent mind) is the dominant approach to designing self-aware autonomous systems.
Against the internal model approach, an alternative view proposes bottom-up designs, such as in Simon Penny’s work (see his paper ‘Trying to be Calm: Ubiquity, Cognitivism and Embodiment’). This tradition critiques the assumption that robotic movement requires models, and that models explain robotic movement and ‘awareness’.
Watching this mangle of motors, sensors and connections struggle to get to its feet, irrespective of the mathematics of its internal model, the information in play clearly comes from the bottom up. The gait is not calculated in the internal model and then applied to the outside. It is generated in the encounter of robot with the gravity-bound world. The model is a vectoral diagram of the forces at play in the robot body, and the ‘model’ is inseparably part of the world.
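The self-modelling loop can be caricatured in a few lines: try motor commands, and keep whichever candidate body model best predicts what the sensors actually report. This toy sketch (the names and the two-motor body are hypothetical, and far simpler than the Cornell work) shows the principle:

```python
import random

def infer_self_model(actuate, candidate_models, trials=10):
    """Keep the candidate body model whose predictions best match
    what the sensors report after random motor commands."""
    errors = {name: 0.0 for name in candidate_models}
    for _ in range(trials):
        command = [random.uniform(-1.0, 1.0) for _ in range(2)]
        observed = actuate(command)  # what the real body actually did
        for name, model in candidate_models.items():
            predicted = model(command)
            errors[name] += sum((p - o) ** 2 for p, o in zip(predicted, observed))
    return min(errors, key=errors.get)  # lowest prediction error wins

# A hypothetical two-motor body whose left motor runs reversed (as if damaged)
real_body = lambda c: [-c[0], c[1]]
models = {
    "intact": lambda c: [c[0], c[1]],
    "left_reversed": lambda c: [-c[0], c[1]],
}
best = infer_self_model(real_body, models)  # the model matching the damage wins
```

If the body is damaged or reorganised, re-running the same loop simply selects a different model: the ‘self-conception’ is regenerated from the encounter between commands and sensations.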
How a robot walks, runs and jumps is critical to how it moves through its environment. Beyond these instrumental questions, how a robot moves can’t help but establish a sense of its perceived character. We’ve faced these questions of movement, embodiment and identity before — in animation. The problem of designing the gait of robots recalls (and deviates from) the technique of creating walk cycles in cel animation, which dates back to the earliest days of cinema.
Both robotic and animated bodies use rhythms to generate economies in movement. For animators, walk cycles can continue indefinitely to fill any duration in the linear sequence of a final animation. The walk cycle helps establish character by communicating the urgency, competence and mood in the figure’s movements. Shape, style and frame-to-frame changes give the character an implied history by adding deliberate distortions: squashing and stretching the body, and manipulating the apparent forces of acceleration, inertia and gravity on head, torso and limbs.
For the roboticist, a well-designed gait is also economical, because it allows the robot to establish rhythms in movement that maximise its use of energy. A well-tuned gait takes advantage of the dynamics in between the points at which robotic motors activate. It uses the weight and inertia of the robot body to maintain balance and stability at speed. This is inevitably also read by observers as creating a kind of character. The aesthetic inevitably returns.
In designing movement, the animator seems to begin with an aesthetic problem, where the roboticist seems to start with instrumental problems. Of course, the animator must resolve the aesthetic through technical means: whether that is use of cameras and in-betweening, or 3D computer animation (quite similar to robot simulators). The roboticist, on the other hand, cannot escape the aesthetic, as the human eye inevitably reads movement as life and finds a face and character. See also Cholodenko 2007 and Sobchack 2009.
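The shared economy of animated walk cycles and robot gaits can be made concrete with phase-offset oscillators. This minimal sketch (the amplitude, period and two-legged body are illustrative assumptions, not any particular robot’s controller) generates a repeating cycle:

```python
import math

def walk_cycle(t, period=1.0, amplitude=30.0):
    """Hip angles (degrees) for left and right legs at time t (seconds).

    The half-cycle phase offset between the legs is what makes the
    motion read as walking rather than hopping.
    """
    phase = 2.0 * math.pi * t / period
    left = amplitude * math.sin(phase)
    right = amplitude * math.sin(phase + math.pi)  # opposite leg, half a cycle later
    return left, right

left, right = walk_cycle(0.25)  # quarter-cycle: left leg at full swing
```

Like an animator’s walk cycle, the loop can fill any duration; character emerges from tuning amplitude, period and distortion rather than from any single frame.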
An example of an animated walk cycle. Todd Wheeler.
A fast-walking robot built by a researcher in Thailand, Weerayut.
Cholodenko, A., 2007. The Illusion of Life 2: More Essays on Animation, Power Publications.
Sobchack, V., 2009. Animation and automation, or, the incredible effortfulness of being. Screen, 50(4), pp.375 -391.
Van Breemen, A.J.N., 2004. Bringing robots to life: Applying principles of animation to robots. In Proceedings of the Shaping Human-Robot Interaction workshop held at CHI.
‘But you can’t always be there…’
The mobile sensing system Mobileye uses a single camera, mounted on the windscreen, to judge whether the car is drifting out of the lane, or about to hit a vehicle, pedestrian, or kangaroo. It can give up to 2.7 seconds warning if it calculates there is a potential collision. But it may not see a wombat (too short), according to the person presenting this product at the Australian Centre for Field Robotics this morning.
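A warning like Mobileye’s 2.7 seconds implies a time-to-collision estimate: the distance to the obstacle divided by the speed at which the gap is closing. This sketch is a generic version of that arithmetic, not Mobileye’s actual algorithm:

```python
def time_to_collision(distance_m, closing_speed_ms):
    """Seconds until impact at the current closing speed,
    or None if the gap is not closing."""
    if closing_speed_ms <= 0:
        return None
    return distance_m / closing_speed_ms

def should_warn(distance_m, closing_speed_ms, threshold_s=2.7):
    """Trigger a warning when time-to-collision drops below the threshold."""
    ttc = time_to_collision(distance_m, closing_speed_ms)
    return ttc is not None and ttc <= threshold_s

# Closing at 10 m/s on a vehicle 25 m ahead leaves 2.5 s to impact: warn
print(should_warn(25.0, 10.0))  # True
```

The hard part, of course, is not this division but estimating distance and closing speed from a single camera in the first place.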
This $2000 device from Israel seems most often to be designed to correct the driving behaviour of other people. It does so by beeping whenever it senses poor driving. You can avoid the warning beep by using the indicators properly before crossing the lane. In the advertisement above, it’s this woman’s rogue texting that would have made her smash her car. Luckily the Mobileye operates as the surrogate eye of the father, and warns her of the danger. Another common use case is in truck fleets: if the device reports too many times that the truck has drifted out of the lane, the driver can be fired. ‘If you have a serial tailgater you can prove it and dismiss him for it’.
Some car manufacturers are already using Mobileye in production cars: BMW is using it only for keeping drivers in the lane, not for collision monitoring. Volvo and GM are working on a system that will actually read road signs, and give speeding drivers a warning.
The researchers at the ACFR were interested in adapting the Mobileye for use on robots. One of the limitations of the system is that it won’t recognise obstacles unless the vehicle is moving at over 5 km/h (presumably stereo-optical systems would work better). This is unfortunate, because robots have particular problems when starting to move. The researchers also needed to access the Mobileye as a data source, not as a black-boxed commodity system. They need information, not beeping. The presenter assured the roboticists that it should be possible to hack the eye so that it would be usable in that way (but they would need to contact the head office).
Most robots I’ve seen move at a very deliberate pace. The computational challenges of processing multiple signals and deciding what to do next (while not draining the battery too much) mean that most research robots take a long time to do pretty much anything. It is common for engineers to speed up the video of their projects to make watching them tolerable.
The videos in this recent post on BotJunkie show that snail-like speed is not necessarily a feature of all semi-autonomous and autonomous robots. These ‘sumos’ are specialised fighting robots under 3 kg that need to work really fast, for a short time.