Google is one of the main contenders in the race to develop a self-driving car.
The company has previously revealed plans for its cars to be able to predict what drivers are going to do next, as well as drive slowly around children.
Now, according to a patent filed by Google, the firm is working on a way to allow its smart cars to get out of the way of police and other emergency vehicles.
'A system and method is provided for detecting and responding to emergency vehicles,' the patent application says.
'In one aspect, one or more computing devices may identify a set of light sources from an image based at least in part on one or more templates, and may filter the set of light sources in order to identify one or more light sources corresponding to a potential emergency vehicle.'
The technology uses filters to detect the red and blue flashing lights used by emergency services.
Software will also determine whether the lights are flashing or not, and will be able to identify which form of vehicle is approaching by noting the spacing of the lights.
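The pipeline the patent describes can be sketched in a few lines: filter detected light sources by colour, keep only the flashing ones, and use the spacing between the outermost lights as a rough cue to the vehicle's size. This is an illustrative sketch only; the `Light` class, the threshold, and the function names are assumptions, not Google's implementation.

```python
# Illustrative sketch of the patent's detection steps. All names and
# thresholds here are hypothetical, not taken from Google's system.

from dataclasses import dataclass

@dataclass
class Light:
    x: float          # horizontal position in the image (pixels)
    color: str        # dominant colour, e.g. "red" or "blue"
    flashing: bool    # True if intensity alternates across frames

def classify_emergency_vehicle(lights, wide_spacing=120.0):
    """Return a rough vehicle-size guess, or None if no emergency lights."""
    # Keep only the red/blue flashing lights used by emergency services.
    candidates = [l for l in lights
                  if l.color in ("red", "blue") and l.flashing]
    if len(candidates) < 2:
        return None
    # The spacing of the outermost lights hints at the vehicle's width:
    # a fire engine's light bar is wider than a police car's.
    spacing = max(l.x for l in candidates) - min(l.x for l in candidates)
    return "truck-sized vehicle" if spacing >= wide_spacing else "car-sized vehicle"
```

In practice the patent's "templates" would do far more work (matching light shapes and patterns in camera frames), but the filter-then-measure structure is the same.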
However, as with many patents granted to Google, it is unclear at this stage whether the technology will come to fruition.
The car will then move out of the way of the emergency vehicle. In doing so, it will make sure it does not create an accident by driving itself into a tree or off a bridge, for example.
'Autonomous vehicles may...use the cameras, sensors, and global positioning devices to gather and interpret images and sensor data about its surrounding environment, for example, oncoming vehicles, parked cars, trees, buildings,' the patent says.
Google has been working on its smart vehicle since 2009, with testing being carried out on the roads around its Mountain View headquarters in California.
Google says the technology will help reduce delays for the emergency services and cut down on the possible risk of crashes involving its own cars.
A study last year claimed self-driving cars are far more accident-prone than ordinary cars.
Researchers found self-driving vehicles were involved in 9.1 crashes per million miles travelled, compared with just 1.9 for those with a human operator.
Self-driving cars were also rear-ended 50 per cent more often than traditional vehicles, it found.

However, the University of Michigan's Transportation Research Institute, which carried out the research, concluded that 'self-driving vehicles were not at fault in any crashes they were involved in.'
Google's self-driving software will be considered a driver
U.S. vehicle safety regulators have said the artificial intelligence system piloting a self-driving Google car could be considered the driver under federal law, a major step toward ultimately winning approval for autonomous vehicles on the roads.
The National Highway Traffic Safety Administration told Google, a unit of Alphabet Inc, of its decision in a previously unreported Feb. 4 letter to the company posted on the agency's website this week.
Google's self-driving car unit on Nov. 12 submitted a proposed design for a self-driving car that has 'no need for a human driver,' the letter to Google from National Highway Traffic Safety Administration Chief Counsel Paul Hemmersbaugh said.
'NHTSA will interpret 'driver' in the context of Google's described motor vehicle design as referring to the (self-driving system), and not to any of the vehicle occupants,' NHTSA's letter said.
'We agree with Google its (self-driving car) will not have a 'driver' in the traditional sense that vehicles have had drivers during the last more than one hundred years.'
Major automakers and technology companies such as Google are racing to develop and sell vehicles that can drive themselves at least part of the time.
All participants in the autonomous driving race complain that state and federal safety rules are impeding testing and eventual deployment of such vehicles.
California has proposed draft rules requiring steering wheels and a licensed driver in all self-driving cars.
Karl Brauer, senior analyst for the Kelley Blue Book automotive research firm, said there were still significant legal questions surrounding autonomous vehicles.
But if 'NHTSA is prepared to name artificial intelligence as a viable alternative to human-controlled vehicles, it could substantially streamline the process of putting autonomous vehicles on the road,' he said.
If the car's computer is the driver for legal purposes, then it clears the way for Google or automakers to design vehicle systems that communicate directly with the vehicle's artificial pilot.
In its response to Google, the federal agency offered its most comprehensive map yet of the legal obstacles to putting fully autonomous vehicles on the road.
It noted existing regulations requiring some auto safety equipment cannot be waived immediately, including requirements for braking systems activated by foot control.
'The next question is whether and how Google could certify that the (self-driving system) meets a standard developed and designed to apply to a vehicle with a human driver,' NHTSA said.
Google is 'still evaluating' NHTSA's lengthy response, a company spokesperson said on Tuesday.
Google executives have said they would likely partner with established automakers to build self-driving cars.
Google told NHTSA that the real danger is having auto safety features that could tempt humans to try to take control.
Google 'expresses concern that providing human occupants of the vehicle with mechanisms to control things like steering, acceleration, braking... could be detrimental to safety because the human occupants could attempt to override the (self-driving system's) decisions,' the letter stated.
In the past two years, 23 states have introduced legislation affecting self-driving cars, 'all of which include different approaches and concepts,' noted Chris Urmson, head of Google's self-driving car project.

Five states have passed such legislation, all with different rules, Urmson said.
'If every state is left to go its own way without a unified approach, operating self-driving cars across state boundaries would be an unworkable situation and one that will significantly hinder safety, innovation, interstate commerce, national competitiveness and the eventual deployment of autonomous vehicles,' Urmson said in his prepared testimony.
He also cited government statistics showing 38,000 people were killed last year in US road accidents and that '94 percent of those accidents involve human error.'
Joseph Okpaku, vice president of government relations for the ridesharing group Lyft, echoed those comments, saying consistent rules would be important for the planned deployment of self-driving cars by Lyft and GM.
'We are on the doorstep of another evolutionary leap in transportation and technology, where concepts that once could only be imagined in science fiction are on the verge of becoming a reality,' he said.
'The worst possible scenario for the growth of autonomous vehicles is an inconsistent and conflicting patchwork of local, municipal and county laws that will hamper efforts to bring AV (autonomous vehicle) technology to market,' Okpaku added.
'Regulations are necessary, but regulatory restraint and consistency is equally as important if we are going to allow this industry to reach its full potential.'
GM vice president Michael Ableson said the auto giant 'enthusiastically supports policy initiatives to accelerate the development and adoption of safe, high-level vehicle automation.'
Delphi vice president Glen De Vos added that 'uniform rules that allow for the safe operation of driverless vehicles in all 50 states will be critical.'
But the Senate panel was told to exercise caution by Mary Cummings, who heads the Humans and Autonomy Laboratory at Duke University.
'There is no question that someone is going to die in this technology,' she said.
'The question is when and what can we do to minimize that.'
Cummings said it's not yet clear that self-driving cars can safely operate in all situations.
'We know that many of the sensors on self-driving cars are not effective in bad weather, we know people will try to hack into these systems,' she told the panel.
Cummings said it is possible to 'spoof' a car's GPS to send it off course, or to use laser devices to trick a vehicle into sensing objects which are not there.
She cited a Rand Corporation study estimating that self-driving cars would need to drive 275 million miles (442 million kilometers) to show they are as safe as human-operated vehicles.
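A figure of that order can be reached with a back-of-the-envelope calculation (a sketch, assuming the statistical "rule of three"): if a fleet drives n miles with zero fatalities, the 95% upper confidence bound on its fatality rate is roughly 3/n, so matching the US human benchmark of about 1.09 fatalities per 100 million miles requires:

```python
# Rule-of-three sketch: miles needed to show, with 95% confidence and
# zero observed fatalities, a fatality rate no worse than human drivers.
# The benchmark rate is an approximate US figure, used for illustration.

human_fatality_rate = 1.09 / 100_000_000   # fatalities per mile (approx.)

miles_needed = 3 / human_fatality_rate     # 95% upper bound with 0 events
print(round(miles_needed / 1_000_000))     # roughly 275 (million miles)
```

Which is why demonstrations over a few million test miles, however impressive, cannot by themselves settle the safety question.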
Cummings said the federal government needs to ensure that testing is done in a rigorous way to ensure safety.
'I am wholeheartedly in support of the research and development of self-driving cars,' she said.
'But these systems will not be ready for fielding until we move away from superficial demonstrations to principled, evidence-based tests and evaluations.'
The activist group Consumer Watchdog warned meanwhile that the federal government should not take shortcuts on safety by 'rushing new technology to the roads.'
'Federal regulators have a process for writing rules to keep the public safe,' Consumer Watchdog's John Simpson said in a statement.
'Congress shouldn't skirt those rules just because tech industry giants like Google ask them to.'
Google self-driving trucks would use smart lockers to deliver goods
Google is developing both delivery drones and self-driving cars, but a new patent reveals it is also building a smart delivery truck.

The patent reveals plans for driverless trucks with lockers inside the cargo area and a PIN code that grants customers access to their packages. Customers would receive a message when the vehicle is nearby, meaning the end of uncertainty over delivery times.
The Autonomous Delivery Platform patent, first reported by Quartz, describes locker-like containers in the cargo area; recipients would type in a code or scan an NFC chip to claim their packages.
There is also the option of opening the locker with the credit card used to purchase the packages.
‘An autonomous road vehicle is operative to receive destination information, and to drive to a destination based on the destination information,’ reads the patent. ‘A package securing subsystem is attached to the autonomous road vehicle and comprises at least one securable compartment.’
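The access scheme the patent describes could work along these lines: each compartment stores only a hash of its one-time code, and the recipient's PIN (typed in, or read from an NFC chip) unlocks exactly that compartment. This is a minimal sketch under those assumptions; the class and method names are hypothetical, not from the patent.

```python
# Hypothetical sketch of a one-time-PIN locker. Names are illustrative.

import hashlib

def _digest(pin: str) -> str:
    # Store only a hash so compartment codes are never kept in plain text.
    return hashlib.sha256(pin.encode()).hexdigest()

class DeliveryLocker:
    def __init__(self):
        self._compartments = {}   # compartment id -> PIN digest

    def load(self, compartment: str, pin: str):
        """Stock a compartment and register its one-time PIN."""
        self._compartments[compartment] = _digest(pin)

    def open(self, compartment: str, pin: str) -> bool:
        """Open the compartment if the PIN matches; codes are single-use."""
        if self._compartments.get(compartment) == _digest(pin):
            del self._compartments[compartment]   # PIN cannot be reused
            return True
        return False
```

Making the PIN single-use matters here: the truck drives on to its next stop, so a code that leaked from one delivery must not open the compartment later.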
The document describes the use of sensors, video cameras and range-finding lasers as a way for the trucks to navigate roads and avoid obstacles in their path.
‘Automated road vehicles can use various sensors, for example, video cameras, radar sensors and laser range finders, to “see” other traffic, as well as detailed maps to navigate a road, and a communication subsystem, such as a wireless communication subsystem, to communicate with a controller and other entities.’

Customers would receive a notification when the truck is about to arrive, and if it is late, the truck will also let them know via text.
Google has also suggested that customers could pay for their packages on delivery, turning the self-driving trucks into vending machines on wheels.

Some aren't too surprised to hear the news about self-driving delivery trucks, as Google has been working on its own self-driving car since 2009 and says it could be on the market by 2020.