The idiot’s guide to the future of tech

Author
Steph Rowe,
Publish Date
Mon, 21 Nov 2016, 8:38AM

Let’s get something straight: I’m no expert on this. I have a general interest in technology but was certainly a little skeptical when I headed along to the Huawei Asia-Pacific Innovation Day in Sydney. Reading that title you may, like me, have had an image of space-age stuff that is irrelevant to our day-to-day lives. I’m relieved to say I was wrong.

The strangest and most interesting thing I took away from the day was how something like inseminating a cow has been completely revolutionised by the use of a pedometer. The team at Fujitsu attached pedometers to cows and discovered that a cow’s step count rises dramatically when it is in heat. That signals the start of the 16-18 hour peak fertility period, so a farmer can inseminate the cow at exactly the right moment. Adding to that, if the cow is inseminated early in its fertile window, there is a 30 per cent higher chance of the calf being female.
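
I’m no programmer, but as I understand it the underlying logic is simple enough to sketch in a few lines of Python. To be clear, the numbers and thresholds below are my own inventions for illustration, not Fujitsu’s:

```python
# A minimal sketch (my own illustration, not Fujitsu's system) of flagging a
# cow in heat from a spike in its step count. All numbers are invented.

HOURLY_BASELINE = 120            # hypothetical average steps per hour
SPIKE_FACTOR = 3                 # flag if activity triples over the baseline
FERTILE_WINDOW_HOURS = (16, 18)  # the peak window mentioned above

def in_heat(hourly_steps: list[int]) -> bool:
    """Return True if recent activity suggests the cow is in heat."""
    recent_average = sum(hourly_steps[-3:]) / 3  # last three hours
    return recent_average > SPIKE_FACTOR * HOURLY_BASELINE

# Example: a quiet day followed by a sudden burst of walking.
readings = [110, 130, 95, 120, 380, 410, 450]
if in_heat(readings):
    print(f"Heat detected: the {FERTILE_WINDOW_HOURS[0]}-"
          f"{FERTILE_WINDOW_HOURS[1]} hour peak fertility window has begun.")
```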

Isn’t it fascinating that, with all the new technology surrounding us, something as old as a pedometer can achieve such breakthrough results? These pedometers have managed to bring a 400-day calving cycle down to 350 days. No doubt Fitbit have their next Friesian-coloured bands in the pipeline.

How about robots that can see and think? Take, for example, growing grapes in a vineyard. Imagine a robot that could roam the rows, comparing images of each vine against its on-board database to work out whether a plant needs more water, pesticides or sunlight. The new focus is on creating more product from the same resources. If you’d like to know more, look up the University of Sydney’s research on the Shrimp and Ladybird agricultural robots.
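
I can only guess at what is under the hood, but the basic idea of checking what the camera sees against a database can be sketched too. Everything here, the features and the healthy ranges, is invented for illustration and has nothing to do with the actual Shrimp or Ladybird software:

```python
# A toy version of "compare the field to an on-board database". The camera
# output is reduced to two made-up numbers per plant.

# Hypothetical healthy ranges a robot might carry on board.
HEALTHY = {
    "leaf_greenness": (0.6, 1.0),  # 0 = yellow/brown, 1 = deep green
    "soil_moisture": (0.3, 0.7),   # fraction of saturation
}

def assess(plant: dict[str, float]) -> list[str]:
    """Compare a plant's readings to the healthy ranges and suggest actions."""
    advice = []
    if plant["soil_moisture"] < HEALTHY["soil_moisture"][0]:
        advice.append("needs more water")
    if plant["leaf_greenness"] < HEALTHY["leaf_greenness"][0]:
        advice.append("check nutrients or pests")
    return advice or ["looks healthy"]

print(assess({"leaf_greenness": 0.4, "soil_moisture": 0.2}))
# -> ['needs more water', 'check nutrients or pests']
```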

This same sort of technology could be applied to something like IVF. A robot’s constant observation of embryos during an IVF cycle could help technicians pick up early warning signs they might otherwise have overlooked.

These robots need to see an image and ‘caption’ it to know what is happening and what course of action to take. That’s where the scientists with their algorithms come in and where my explanation of it all starts to fragment.

With something like a self-driving car, this process is the difference between life and death. The car needs to be able to see an image of a traffic light and come up with a caption that decides what action to take: a red light means stop, a green light means go.
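
Stripped of all the hard computer-vision work, that last caption-to-action step might look something like the sketch below. The captions and actions are mine, purely for illustration; real systems are vastly more sophisticated:

```python
# A toy mapping from an image caption to a driving action. In a real car the
# caption would come from a trained vision model, not a hand-written table.

ACTIONS = {
    "red light": "stop",
    "green light": "go",
    "amber light": "slow down",  # my own addition for completeness
}

def decide(caption: str) -> str:
    """Turn a caption into an action, defaulting to caution when unsure."""
    return ACTIONS.get(caption, "slow down and alert the driver")

print(decide("red light"))     # -> stop
print(decide("blurry shape"))  # -> slow down and alert the driver
```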

We’re not quite there yet, fortunately or unfortunately. My belief is that when these sorts of projects become commonplace, and we begin to hand the reins over to robots in particular areas, we’ll pick up the reins in others. It’s what we do. It’s what we’ve always done. And so we evolve a little further.

This trip was made possible by Huawei. 
