Autonomous driving has developed rapidly in recent years. Beyond Tesla and Audi, many luxury-car makers have introduced automated driving technology. Here is a look at the automated driving chip industry, vendor by vendor.

1. Intel: Mobileye + Altera + Movidius

Intel built its position in automated driving mainly through acquisitions: in June 2015 it acquired FPGA giant Altera for US$16.75 billion; in September 2016 it acquired the computer-vision chip company Movidius; and in March 2017 it acquired the Israeli self-driving technology company Mobileye for US$15.3 billion. With these acquisitions, Intel's automated-driving processor portfolio became fairly complete: Mobileye's EyeQ series chips (ASICs), Altera's FPGAs, Movidius's vision processing unit (VPU), and Intel's own CPUs together can form an overall hardware solution for autonomous driving.

EyeQ1/EyeQ2: Mobileye's EyeQ chips were originally developed jointly with STMicroelectronics. Development of the first-generation EyeQ1 began in 2004 and it shipped in 2008; the EyeQ2 shipped in 2010. These first two generations support only L1 driver assistance. The EyeQ1 delivers about 0.0044 TOPS and the EyeQ2 about 0.026 TOPS, at 2.5 W power consumption.

EyeQ3: The EyeQ3, mass-produced in 2014, was developed on Mobileye's own ASIC architecture. It uses four MIPS core processors and four VMP chips, delivers 0.256 TFLOPS, and consumes 2.5 W, enough to support L2 advanced driver-assistance computing.

EyeQ4: The fourth-generation EyeQ4 was announced in 2015 and entered volume production in 2018 on a 28 nm process. It uses five core processors (four MIPS i-class cores and one MIPS m-class core), six VMP chips, two MPC cores, and two PMA cores, and can process image data from eight cameras simultaneously. It delivers up to 2.5 TFLOPS at 3 W of power, enough for L3 semi-automated driving.

EyeQ5: Mobileye's next-generation EyeQ5 was planned to produce engineering samples in 2018 and reach mass production by 2020 on a 7 nm FinFET process. The product targets Nvidia's Drive Xavier and is positioned for L4/L5 fully automated driving. A single chip delivers 12 TOPS at a 5 W TDP; the EyeQ5 system uses dual CPUs with 8 core processors and 18 core vision processors, reaching 24 TOPS at a 10 W TDP. Mobileye's chip price is said to exceed US$1,000.

2. Nvidia: Drive PX series

Drive PX: Nvidia's automated-driving chips began with the Drive PX series, introduced in early 2015. Announced in January 2015 and equipped with a Tegra X1 processor and 10 GB of memory, the first-generation Drive PX can simultaneously process images from 12 two-megapixel cameras at 60 frames per second, with 2 TFLOPS of single-precision compute and 2.3 TOPS of deep-learning compute, enough to support L2 advanced driver assistance.

Drive PX2: At CES in January 2016, Nvidia released the next-generation Drive PX2. Built on a 16 nm FinFET process with a 250 W TDP and a water-cooled heat-sink design, it supports 12 camera inputs plus lidar, radar, and ultrasonic sensors. The CPU portion consists of two Nvidia Tegra processors containing, in total, 8 Cortex-A57 cores and 4 Denver cores; the GPU portion uses two GPUs based on Nvidia's Pascal architecture. Single-precision compute reaches 8 TFLOPS and deep-learning compute 24 TOPS, respectively 4 times and 10 times those of the original Drive PX, which can satisfy the computing requirements of L3 automated driving.

Drive Xavier: Drive Xavier is Nvidia's latest-generation automated-driving processor. First previewed at GTC Europe 2016, it was officially released at CES in January 2018, together with the Drive PX Pegasus, billed as the world's first in-vehicle computer built for driverless taxis. Xavier combines a custom 8-core CPU, a new 512-core Volta GPU, a new deep-learning accelerator, a new computer-vision accelerator, and a new 8K HDR video processor. It performs 30 trillion operations per second while consuming only 30 W, 15 times the energy efficiency of the previous-generation architecture, and can meet the computing needs of L3/L4 automated driving. Samples were expected in the first quarter of 2018.

Drive PX Pegasus: Drive PX Pegasus is an AI processor aimed at L5 fully automated taxis, equipped with two Xavier SoCs. The integrated CPU grows from 8 cores to 16, and two discrete GPUs are added. Compute reaches 320 TOPS, roughly 10 times that of Drive Xavier, enough to support an L5 fully automated driving system, but power consumption also reaches 500 W. First samples were expected to reach customers in mid-2018. The Nvidia Drive PX2 is said to cost more than US$10,000.

3. Qualcomm & NXP

As the clear leader in mobile communications, Qualcomm has long hoped to break into automotive electronics with its own mobile processors (adapted to automotive qualification).
At CES in early 2016, Qualcomm released the Snapdragon 820A automotive series, integrating an LTE modem and machine intelligence. The series includes Qualcomm's Zeroth machine-intelligence platform, designed to help automakers use neural networks to build deep-learning-based solutions for ADAS and in-vehicle infotainment. However, current automaker design wins are still limited to infotainment functions. The domestic ADAS maker Inspur demonstrated the first ADAS prototype built on the 820A platform with deep learning at CES 2017 and officially released it in December; the product is reported to have entered pre-production verification, with mass production expected in 2019.

As an automotive electronics manufacturer, NXP's accumulation in autonomous driving runs much deeper than Qualcomm's. In May 2016, NXP announced the BlueBox platform, which integrates the S32V234 automotive vision and sensor-fusion processor, the LS2084A embedded compute processor, and the S32R27 radar microcontroller, giving automakers an L4 automated-driving computing solution. The S32V234, the ADAS processing chip in NXP's S32V series introduced in 2015, handles vision processing, multi-sensor fusion, and machine learning on the BlueBox platform. The chip has a CPU complex (four ARM Cortex-A53 cores and one Cortex-M4), a 3D GPU (GC3000), and a vision acceleration unit (two APEX-2 vision accelerators); it supports four cameras simultaneously, its GPU can build 3D models in real time, and compute reaches 50 GFLOPS. The S32V234 also reserves interfaces for millimeter-wave radar, lidar, and ultrasonic sensors, enabling multi-sensor data fusion, and meets the ISO 26262 ASIL-C standard. NXP also offers a dedicated radar-processing chip, the MPC577xK, a Qorivva 32-bit MCU for ADAS applications. Based on the Power architecture, it supports applications such as adaptive cruise control, intelligent headlight control, lane-departure warning, and blind-spot detection.

4. Renesas

Like NXP, Renesas released an ADAS and autonomous-driving platform, Renesas Autonomy, in April 2017, aiming to attract more Tier 1 suppliers and expand its ecosystem. It also announced the R-Car V3M SoC, which carries two ARM Cortex-A53 cores, dual Cortex-R7 lock-step cores, and an integrated ISP, meeting ASIL-C functional-safety hardware requirements for smart cameras, surround-view systems, radar, and many other ADAS applications. Samples of the R-Car V3M were to be available from December 2017, with production beginning in June 2019.

Judging from Renesas's chip lineup, the R-Car series is its main product line for automated driving. The first generation (R-Car H1/M1A/E1), launched in 2011-12, supports basic cruise functions. The second generation (R-Car H2/M2/E2) roughly doubled first-generation performance and can support ADAS functions such as 360° surround view. The third generation (R-Car H3/M3) arrived after 2015 and meets ASIL-B safety requirements; ASSP processors such as the R-Car V3M and R-Car V2H were introduced at the same time. These products can broadly support L2 automated-driving applications. Besides the R-Car products, Renesas, like NXP, also offers dedicated processors for radar sensors, such as the RH850/V1R-M series, built on a 40 nm embedded-eFlash process with an optimized DSP that performs FFT processing quickly.

5. Texas Instruments (TI)

TI's ADAS processing line is mainly the TDAx series, currently comprising three chips: the TDA2x, TDA3x, and TDA2Eco.

TDA2x: Released in October 2013 and aimed at the mid-to-high-end market, the TDA2x has 2 ARM Cortex-A15 cores, 4 Cortex-M4 cores, 2 TI fixed-point C66x DSP cores, 4 EVE vision-accelerator cores, and a dual-core 3D GPU. It mainly handles front-camera processing, including lane warning, collision detection, adaptive cruise, and automatic parking, and can also handle multi-sensor fusion data.

TDA3x: Released in October 2014 and aimed at the low-to-mid-end market, the TDA3x includes dual Cortex-A15 cores and an SGX544 GPU, retaining the C66x DSP and EVE vision-accelerator cores. Functionally, the TDA3x is mainly used for rear cameras and 2D or 2.5D surround view.

TDA2Eco: The TDA2Eco is another ADAS processor for the low-to-mid-end market, released in 2015. Compared with the TDA2x, it removes the EVE accelerators while retaining one Cortex-A15, four Cortex-M4 cores, the DSP, the GPU, and other cores. The TDA2Eco supports high-definition 3D surround view; since the TDA3x mainly covers 2D/2.5D surround view, the TDA2Eco fills the low-to-mid market's need for high-definition 3D surround-view applications.

6. ADI

Compared with the chip companies above, ADI's ADAS strategy emphasizes cost-effectiveness: for high-, mid-, and low-end vehicles, ADI offers one or several targeted ADAS implementations to reduce cost. In vision ADAS, ADI's Blackfin series processors are widely used: low-end systems are based on the BF592 and implement LDW; mid-range systems are based on the BF53x/BF54x/BF561 and implement LDW/HBLB/TSR; high-end systems are based on the BF60x, whose pipelined vision processor (PVP) implements functions such as LDW/HBLB/TSR/FCW/PD. The integrated vision preprocessor significantly offloads the main processor, reducing the performance required of it.

7. Infineon

Infineon introduced the REAL3 3D chipset for the ADAS market in 2015, enabling driver-fatigue detection and other functions.
The zFAS automated-driving compute unit in the new Audi A8 also uses Infineon's Aurix chip, and the A8's most critical feature, Traffic Jam Pilot, is ultimately realized on this chip.

8. Toshiba

In July 2017, Toshiba announced that it would jointly launch a video-based active safety system with Denso. The system carries Toshiba's latest Visconti4 automated-driving chip, which has 8 multimedia processing cores, can run 8 applications simultaneously, and is optimized for automated-driving video workloads; recognition latency drops from 100 ms to 50 ms. The Visconti4 can implement lane-departure warning, front and rear collision warning, front and rear pedestrian-collision warning, and traffic-sign and signal recognition. Denso began applying the Visconti2 to assisted driving in 2015. Besides doubling the core count, the Visconti4 substantially improves the pedestrian-recognition algorithm over the previous generation: with an enhanced CoHOG recognition algorithm, it is much better at recognizing pedestrians and cyclists in dark scenes.

9. Xilinx

Xilinx's most widely used automotive ADAS product is the Zynq®-7000 All Programmable SoC. The platform helps system vendors accelerate ADAS applications such as surround view, 3D surround view, rear-view cameras, dynamic calibration, pedestrian detection, rear-view lane-departure warning, and blind-spot detection; Zynq completes an ADAS solution on a single chip. Xilinx has also partnered with Mentor, a Siemens business, to launch the DRS360 automated-driving platform.

10. STMicroelectronics (ST)

In 2017, STMicroelectronics introduced the Telemaco3P, the industry's first automotive microprocessor to integrate a dedicated, fully isolated hardware security module (HSM). One of the major obstacles to connected cars is information security: the consequences of hacking or jamming communications on a fast-moving vehicle can be disastrous. So whether for mobile video entertainment, location-based rescue services, or the recently popular over-the-air (OTA) software updates, the precondition for these functions is that the car can transfer information in a timely, effective, and secure manner. The HSM checks and secures incoming external information; unauthenticated information and external devices cannot communicate with the protected module, and information sent by the Telemaco3P is likewise encrypted by the HSM, with dedicated hardware modules handling security management. This will greatly enhance the security of in-vehicle communications. In addition, the EyeQ5 machine-vision chip that ST is developing in cooperation with Mobileye carries eight multithreaded CPU cores and 18 of Mobileye's next-generation vision processors.

11. Horizon Robotics

Horizon's automated-driving AI chip, Journey, was officially released on December 20 last year. On paper, Journey achieves 1 TOPS of compute at 1.5 W of power, processing 4K video at 30 frames per second and recognizing more than 200 objects per frame, enabling high-level driver-assistance functions such as FCW/LDW/JACC and meeting L2 computing needs. By comparison, Nvidia's Drive PX2 uses a 16 nm FinFET process with 8 TFLOPS of single-precision compute, 24 TOPS of deep-learning compute, and an official TDP of 250 W; in performance per watt, Journey still holds a clear advantage. Moreover, because an ASIC is not a general-purpose computing device like a GPU (the algorithms are packaged directly into the silicon and data exchange happens only at the underlying I/O), its compute latency is also lower than a GPU's.
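The performance-per-watt claim above can be sanity-checked from the figures quoted in the text. This is a rough sketch, not a benchmark: the vendor numbers are not strictly comparable (TOPS for Journey versus deep-learning TFLOPS for the Drive PX2), so the ratio is only indicative.

```python
# Performance-per-watt comparison using the figures quoted above:
# Journey: 1 tera-op/s at 1.5 W; DrivePX2: 24 tera-ops/s at 250 W TDP.
chips = {
    "Horizon Journey": (1.0, 1.5),      # (tera-ops per second, watts)
    "NVIDIA DrivePX2": (24.0, 250.0),
}

for name, (tera_ops, watts) in chips.items():
    print(f"{name}: {tera_ops / watts:.3f} tera-ops per watt")

# Ratio of the two efficiencies: Journey comes out roughly 7x ahead.
journey_eff = 1.0 / 1.5
px2_eff = 24.0 / 250.0
print(f"advantage: {journey_eff / px2_eff:.1f}x")
```

On these numbers, Journey delivers about 0.67 tera-ops per watt against the Drive PX2's 0.096, which is the "obvious advantage" the text refers to.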
However, by adopting an ASIC, Horizon sacrifices the chip's programmability in exchange for higher performance; whether it can win enough orders to drive down chip cost remains to be seen.

12. Cambricon

China's Cambricon first unveiled its 1M smart processor IP for intelligent driving at a press conference in early November last year. The 1M's performance is reported to reach 10 times that of the Cambricon 1A. For reference, the theoretical peak performance of the 1A processor, launched in 2016, at a 1 GHz clock is 512 GFLOPS of FP16 half-precision compute and 2 TOPS of sparse-neural-network compute.

13. NavInfo

NavInfo acquired Mayfa Technology, an automotive semiconductor company owned by MediaTek, in May 2016; the latter exhibited its first automotive-grade ADAS chip at CES Asia in June 2017. NavInfo officially released the ADAS chip in July of last year and has reached cooperation agreements with new carmakers including NIO (Weilai), WM Motor (Weimar), and Aiways (Aiqiyiwei). According to public information, the chip adopts a 64-bit quad-A53 architecture with a built-in hardware image-acceleration engine, supports dual-channel HD video output and four-channel HD video input, and can simultaneously support a full-featured advanced in-car audio/video entertainment system plus rich ADAS functions, including: 360° panoramic parking, lane-departure warning (LDW), forward-collision warning (FCW), pedestrian-collision warning (PCW), traffic-sign recognition (TSR), blind-spot detection (BSD), driver-fatigue monitoring (DFM), and rear-collision warning (RCW).

14. Sen Guoke (formerly Shenzhen Guoke Microelectronics)

Sen Guoke also released a self-developed high-performance ADAS chip, the SGKS6802X, in December last year; the product is reported to be officially shipping. The SGKS6802X carries a dual-core ARM Cortex-A7 processor, a high-speed dual-core 8-thread GPU, and a 2D-accelerated GPU. Built on a 40 nm process, the chip typically consumes 1500 mW (1800 mW for the whole system including DDR) and supports up to four-channel codec processing. Integer compute is 7200 MIPS + 3200 MIPS, half-precision floating point is 25.6 GFLOPS, and single-precision floating point is 6.4 GFLOPS; it supports the computing requirements of ADAS functions such as LDW, FCW, PCW, TSR, NV, TFAH, ZCD, CTA, BSD, DFM, and RCW.

On the memory side, Micron is developing GDDR6 to meet automated driving's demands on automotive memory capacity and bandwidth, and is also developing a non-volatile memory with a PCIe interface to meet the non-volatile storage demands of 5G communications, high-definition maps, and automotive black boxes.

Finally, on sensors: in essence, both lidar and millimeter-wave radar construct images of detected objects from echoes, much as humans rely on their eyes while bats rely on ultrasound. Lidar, however, is more susceptible to natural light and thermal radiation: in strong sunlight or high-radiation areas its performance degrades sharply, and its cost and manufacturing-process requirements are high. Millimeter-wave radar, by contrast, has strong anti-interference ability but weaker range and accuracy; in a driving environment with many bands in use, interference with millimeter waves is significant, and their long-range detection ability is quite limited.

Automated driving is the trend of the future, but setbacks and sacrifices along the way are inevitable, as recent self-driving accidents show, so we need to view it rationally.
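The echo-imaging principle shared by lidar and millimeter-wave radar can be sketched in a few lines: a pulse is emitted, reflects off a target, and the measured round-trip time gives the range. This is a minimal illustrative sketch of that principle, not any vendor's implementation.

```python
# Echo ranging, as used by both lidar and millimeter-wave radar:
# a pulse travels out at the speed of light, reflects, and returns,
# so distance = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light in m/s

def echo_range_m(round_trip_s: float) -> float:
    """Distance to the target, given the measured round-trip time."""
    return C * round_trip_s / 2.0

# A return after 1 microsecond puts the target about 150 m away.
print(f"{echo_range_m(1e-6):.1f} m")
```

The division by two, because the pulse covers the distance twice, is the one step both sensor types share; they differ in wavelength, which is what drives the trade-offs in interference, range, and accuracy described above.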
March 24, 2023