Call me old-fashioned, but autonomous driving is a concept that has never won me over. I picture myself driving down a street in a big city like Boston, glancing at the car in the next lane, only to realize there is nobody in the driver's seat. Driving in Boston is already stressful enough; I suspect I couldn't help yelling at somebody.
Yet even though I remain deeply skeptical of autonomous driving, I find myself strangely fascinated by the idea of autonomous parking. Imagine how liberating it would be to step out of the driver's seat, tell your car "go find somewhere to park," and stroll straight off to the restaurant. There would be no more excuses for arriving late to a date, or for staying home because parking is impossible.
Google, along with automakers Audi and Volvo, has demonstrated R&D results related to autonomous driving. Audi showed a concept car with a self-parking feature at the International Consumer Electronics Show (CES) earlier this year, while Volvo released a video last month showcasing its latest autonomous parking technology.
The Swedish automaker says its first self-parking prototype can "interact safely and smoothly with other cars and pedestrians," navigating into a car park on its own, finding an open space, and parking itself.
Of course, whenever a new technology arrives, nobody takes the vendors' claims at face value. For example, Volvo's latest system is built on its Vehicle 2 Infrastructure (V2I) technology, which means it only works in a parking structure equipped with embedded transmitters; you can't really send the car off to hunt for a spot entirely on its own.
That is a real disappointment, because my initial fascination with self-parking rested on the mistaken assumption that I would no longer have to circle block after block looking for a spot. What was I thinking?
Imagine how long it would take to equip every parking lot and street space with smart infrastructure (like V2I) capable of wirelessly exchanging all the necessary vehicle data. Probably longer than you think. Although V2I technology is designed primarily to avoid or mitigate collisions, Volvo says its prototype does not rely on it to avoid pedestrians or runaway shopping carts; the system uses the car's built-in sensors to handle such situations.
I can imagine what a huge business building cars with "eyes" will be for the electronics suppliers serving automakers. According to a report published in April by market research firm IHS, applications such as lane-departure warning and self-parking will be among the main growth drivers for the embedded vision market this year; embedded vision technology helps machines see and interpret data from computer vision software.
Compiled from EE Times with permission. All rights reserved; reproduction prohibited.
Related reading:
• Robots can now replace humans in dangerous test drives
• US opens driverless car testing; licensed road use will still take time
• The ultimate evolution of the "connected car" is autonomous driving
IHS estimates that revenue from "special-purpose computer vision processors for automotive applications" will grow from $126 million in 2011 and $137 million in 2012 to $151 million in 2013, and will go on to reach $187 million by 2016, confirming the market's solid prospects.
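The growth IHS is describing can be made concrete with a little arithmetic on the figures quoted above. A minimal sketch (the revenue numbers are from the report as cited here; the percentages are simply derived from them):

```python
# Implied growth from the IHS revenue figures quoted above
# (automotive special-purpose computer vision processors, millions of USD).
revenue = {2011: 126, 2012: 137, 2013: 151, 2016: 187}

# Year-over-year growth for 2012 and 2013.
yoy_2012 = (revenue[2012] / revenue[2011] - 1) * 100
yoy_2013 = (revenue[2013] / revenue[2012] - 1) * 100

# Compound annual growth rate implied for 2013 -> 2016 (three years).
cagr = ((revenue[2016] / revenue[2013]) ** (1 / 3) - 1) * 100

print(f"2012 growth: {yoy_2012:.1f}%")    # about 8.7%
print(f"2013 growth: {yoy_2013:.1f}%")    # about 10.2%
print(f"2013-2016 CAGR: {cagr:.1f}%")     # about 7.4%
```

In other words, the forecast amounts to high-single-digit annual growth sustained over five years.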
But exactly which basic technologies let a car "see"? And what challenges does self-parking still face? Kevin Tanaka, senior manager of worldwide automotive marketing and product planning at Xilinx, put it this way: "While there is the start of some V2I out there, it's still very limited worldwide at the moment, so no automaker is really relying on that for autonomous parking in its current state."
He said there are trials in Germany right now in which the vehicle communicates with an electronic parking lot that tells it which spaces are open, "but that's, again, very, very early." Google also has self-driving cars, though they "do not utilize V2I but rather a huge range of sensors," including cameras, radar, lidar (light detection and ranging), plus some ultrasonics and mapping programs.
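Conceptually, the German trials Tanaka describes boil down to a simple exchange: the facility broadcasts which bays are free, and the car picks one. The sketch below illustrates that idea only; the message format, field names, and facility name are invented for illustration and do not reflect any real V2I standard.

```python
import math

def choose_bay(broadcast, car_position):
    """Pick the nearest open bay from a (hypothetical) facility broadcast."""
    open_bays = [b for b in broadcast["bays"] if b["free"]]
    if not open_bays:
        return None  # facility is full; the car must wait or leave
    return min(
        open_bays,
        key=lambda b: math.dist(car_position, (b["x"], b["y"])),
    )

# One broadcast frame from the imaginary facility transmitter.
frame = {
    "facility": "P1-Nord",
    "bays": [
        {"id": "A3", "x": 5.0, "y": 12.0, "free": False},
        {"id": "B1", "x": 8.0, "y": 4.0, "free": True},
        {"id": "C7", "x": 30.0, "y": 22.0, "free": True},
    ],
}

bay = choose_bay(frame, car_position=(0.0, 0.0))
print(bay["id"])  # B1 is the nearest open bay
```

The point of the example is the division of labor: the infrastructure supplies occupancy data the car cannot sense on its own, while the driving itself remains the car's job.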
Tanaka further noted that in today's production solutions, automakers use combinations of radar, ultrasonics, and cameras: "There is an incredible amount of parallel processing power that needs to be done to process the sensor data, run algorithms, and ultimately coordinate gearbox, steering, throttle, and braking controls." That, he said, is why programmable SoCs are in such huge demand in this application area, and why they will continue to play a key role.
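The pipeline Tanaka outlines, from sensor data through algorithms to coordinated gearbox, steering, throttle, and brake commands, can be sketched as a single control-loop iteration. Everything below (the class names, sensor fields, and thresholds) is an invented placeholder; real systems run these stages in parallel on dedicated hardware rather than sequentially in one function.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    radar_range_m: float          # distance to nearest obstacle ahead
    ultrasonic_range_m: float     # close-range distance (parking maneuvers)
    camera_lane_offset_m: float   # lateral offset from the bay center line

@dataclass
class Actuation:
    gear: str        # "D", "R", or "P"
    steering_deg: float
    throttle: float  # 0.0 .. 1.0
    brake: float     # 0.0 .. 1.0

def control_step(frame: SensorFrame) -> Actuation:
    """One iteration of the sense -> decide -> actuate loop."""
    nearest = min(frame.radar_range_m, frame.ultrasonic_range_m)
    if nearest < 0.3:  # obstacle practically touching: park and hold the brake
        return Actuation("P", 0.0, 0.0, 1.0)
    # Steer proportionally back toward the bay center line.
    steering = -10.0 * frame.camera_lane_offset_m
    # Creep slowly while clear; brake as the gap closes.
    throttle = 0.1 if nearest > 2.0 else 0.0
    brake = 0.0 if nearest > 2.0 else 0.5
    return Actuation("D", steering, throttle, brake)

cmd = control_step(SensorFrame(5.0, 4.0, 0.2))
print(cmd)  # creeping forward in "D", steering back toward center
```

Each stage here is trivial, but in a real car every sensor stream arrives continuously and the perception algorithms are far heavier, which is exactly the parallel-processing load Tanaka attributes to programmable SoCs.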
Compiled by: Judith Cheng
Original English article: Can a Car Find a Parking Spot by Itself?, by Junko Yoshida
Can a Car Find a Parking Spot by Itself?
Junko Yoshida, Chief International Correspondent
MADISON, Wis. -- Call me old-fashioned, but autonomous driving is a concept that has never fired my loins. I picture myself driving on a street in a big city -- say, Boston. I look at the car in the next lane, only to realize that there's no driver there. I mean, it's bad enough driving in Boston. I want somebody I can yell at.
So, though I'm stubbornly dubious of autonomous driving, I find myself strangely fascinated with the idea of autonomous parking. Imagine how liberating it would be just to tell your car, "Park somewhere," while you jump out of the driver's seat and head for Legal Sea Foods. There'd be no more late arrivals and lame excuses about not finding a spot.
Everyone from Google to Audi and Volvo has been showing off research and development efforts and results for autonomous driving. Audi showed its version of a self-parking car at the International Consumer Electronics Show this year. Volvo released a video clip last month showcasing its new autonomous parking technology. The Swedish automaker says this is the first self-parking prototype that "interacts safely and smoothly with other cars and pedestrians."
Autonomous Parking
Volvo says its prototype enters and navigates a car park and then finds and parks in an available spot.
Of course, when any new technology comes out, nobody takes vendors' claims at face value. For example, Volvo's latest system relies on what it calls Vehicle 2 Infrastructure (V2I) technology. In other words, the system requires a parking structure that comes with embedded transmitters. You can't send your Volvo off to hunt for a parking spot totally on its own.
To me, this is a huge disappointment. I was initially warm to the self-parking concept, because I erroneously assumed that I would no longer have to drive around block after block to find a spot. What was I thinking?
Imagine how long it would take to equip each and every parking lot and street space with a smart infrastructure (like V2I) capable of wirelessly exchanging necessary data with all vehicles. An eternity? Maybe longer.
Though the V2I technology is designed primarily to avoid or mitigate crashes, Volvo says its prototype car doesn't rely on V2I for avoiding pedestrians or rogue shopping carts. Rather, its system comes with in-vehicle sensors to avoid such objects.
I do know that the notion of cars with eyes is a huge deal for the electronics industry serving automotive companies.
The market research firm IHS said in April that automotive applications such as lane departure warnings and self-parking will be among the major growth drivers this year for the embedded vision market -- technology that helps machines see and interpret data from computer vision software.
Revenue from "special-purpose computer vision processors used in under-the-hood automotive applications" rose from $126 million in 2011 to $137 million last year and should reach $151 million this year, IHS said. That revenue will keep expanding and will hit $187 million by 2016, "confirming the solid prospects in store for embedded vision, one of the fastest-growing trends in technology."
But exactly what are the basic technology building blocks involved in making a blind car see? And what challenges does a self-parking car still face?
I popped a few questions on the topic to Kevin Tanaka, senior manager of worldwide automotive marketing and product planning at Xilinx. "While there is the start of some V2I out there, it's still very limited worldwide at the moment, so no automaker is really relying on that for autonomous parking in its current state," he told me. There are some trials in Germany right now in which the vehicle communicates with an electronic parking lot, which tells it what spaces are open. "But it's, again, very, very early."
There's also the Google self-driving car, "but that also does not utilize V2I, but rather a huge range of sensors." They include "cameras, radar, lidar (a remote sensing technology) and some ultrasonics and mapping programs."
In production setups today, OEMs are using combinations of radar, ultrasonic devices, and cameras. "There is an incredible amount of parallel processing power that needs to be done to process the sensor data, run algorithms, and ultimately coordinate gearbox, steering, acceleration, and braking controls."
This obviously is part of the reason why programmable SoCs are in huge demand these days for cars with eyes. Tanaka said Xilinx Automotive FPGAs and Zynq-7000 All Programmable SoCs are being used in many of the radar and camera programs right now, and they will continue to be used into the future.
Editor: Quentin