Researchers at the Queensland University of Technology are developing a new, more reliable alternative to the Global Positioning System (GPS) that uses camera technology and mathematical algorithms, and that would make navigation a far cheaper and simpler task.

The world-first visual navigation approach, dubbed SeqSLAM (Sequence Simultaneous Localisation and Mapping), uses local best-match and sequence-recognition components to lock in locations.
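
In rough terms, those two components can be pictured with a short sketch. The code below is purely illustrative (the function names, window sizes and the constant-speed assumption are mine, not QUT's published implementation): the local best-match step rescores each camera frame only against nearby stored frames, and the sequence-recognition step looks for a whole run of frames that lines up with a stored route rather than trusting any single image.

```python
import numpy as np

def difference_matrix(query_frames, reference_frames):
    """Mean absolute pixel difference between every query frame and every
    stored reference frame (frames are assumed to be small greyscale
    arrays of identical shape)."""
    D = np.zeros((len(reference_frames), len(query_frames)))
    for j, q in enumerate(query_frames):
        for i, r in enumerate(reference_frames):
            D[i, j] = np.mean(np.abs(r.astype(float) - q.astype(float)))
    return D

def local_best_match(D, window=10):
    """'Local best match': normalise each score against nearby reference
    frames only, so a frame counts as a match when it stands out from its
    neighbourhood rather than merely resembling the average street."""
    Dn = np.zeros_like(D)
    n_ref = D.shape[0]
    for i in range(n_ref):
        lo, hi = max(0, i - window), min(n_ref, i + window + 1)
        patch = D[lo:hi, :]
        Dn[i, :] = (D[i, :] - patch.mean(axis=0)) / (patch.std(axis=0) + 1e-9)
    return Dn

def best_sequence(Dn, seq_len=10):
    """'Sequence recognition': find the stored position whose following
    frames line up best with the last `seq_len` camera frames, assuming
    roughly constant speed along the route."""
    n_ref, n_query = Dn.shape
    cols = np.arange(n_query - seq_len, n_query)
    scores = np.full(n_ref, np.inf)
    for start in range(n_ref - seq_len + 1):
        rows = np.arange(start, start + seq_len)
        scores[start] = Dn[rows, cols].sum()
    return int(np.argmin(scores)), scores
```

The point of the sequence step is that even if several individual places look alike, it is far less likely that ten consecutive camera frames will happen to match ten consecutive frames of the wrong street.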

Dr Michael Milford from QUT's Science and Engineering Faculty said that at the moment three satellites are needed to get a decent GPS signal, and even then it can take a minute or more to lock in a location.

"There are some places geographically, where you just can't get satellite signals and even in big cities we have issues with signals being scrambled because of tall buildings or losing them altogether in tunnels."

"SeqSLAM uses the assumption that you are already in a specific location and tests that assumption over and over again.

"For example if I am in a kitchen in an office block, the algorithm makes the assumption I'm in the office block, looks around and identifies signs that match a kitchen. Then if I stepped out into the corridor it would test to see if the corridor matches the corridor in the existing data of the office block lay out.

"If you keep moving around and repeat the sequence for long enough you are able to uniquely identify where in the world you are using those images and simple mathematical algorithms."

Dr Milford said the "revolution" in vision-based navigation came about when Google photographed almost every street in the world for its Street View project.

However, the challenge was making those streets recognisable in a variety of different conditions and differentiating between streets that are visually similar.
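
One common way vision-based systems cope with changing conditions, the sunny-day-versus-stormy-night problem named in the paper's title, is to discard most of the image detail and compare only coarse, locally normalised patterns of light and dark. The sketch below shows that general idea; the target resolution, patch size and box-filter downsampling are illustrative choices, not the exact QUT pipeline.

```python
import numpy as np

def normalise_for_matching(image, size=(32, 64), patch=8):
    """Reduce a greyscale street image to a tiny, patch-normalised thumbnail
    so that local contrast, rather than absolute brightness or weather, is
    what gets compared. Assumes `image` is a 2-D array larger than `size`."""
    img = image.astype(float)
    h, w = img.shape
    # Crude box-filter downsample to the target low resolution.
    ys = np.linspace(0, h, size[0] + 1).astype(int)
    xs = np.linspace(0, w, size[1] + 1).astype(int)
    small = np.array([[img[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].mean()
                       for j in range(size[1])] for i in range(size[0])])
    # Patch normalisation: zero mean, unit variance within each block, which
    # suppresses global lighting differences between the images being compared.
    out = np.zeros_like(small)
    for i in range(0, size[0], patch):
        for j in range(0, size[1], patch):
            block = small[i:i + patch, j:j + patch]
            out[i:i + patch, j:j + patch] = (block - block.mean()) / (block.std() + 1e-9)
    return out
```

Two thumbnails processed this way can then be compared with simple frame differencing, even if one was captured in bright sunshine and the other on a wet night.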

The research, which uses low-resolution cameras, was inspired by Dr Milford's background studying the navigational patterns of small mammals such as rats.

"My core background is based on how small mammals manage incredible feats of navigation despite their eyesight being quite poor," he said.

"As we develop more and more sophisticated navigation systems they depend on more and more maths and more powerful computers.

"But no one's actually stepped back and thought 'do we actually need all this stuff or can we use a very simple set of algorithms which don't require expensive cameras or satellites or big computers to achieve the same outcome?'"

Dr Milford will present his paper, "SeqSLAM: Visual Route-Based Navigation for Sunny Summer Days and Stormy Winter Nights", at the International Conference on Robotics and Automation in America later this year.

The research has been funded for three years by a $375,000 Australian Research Council Discovery Early Career Researcher Award (DECRA) fellowship.