Why Automobiles are Like Data Centers on Wheels, and Should Be Tested Accordingly
Given everything going on inside modern automobiles, such as in-car entertainment, advanced driver-assistance systems (ADAS), and electric-vehicle battery management systems, it’s not much of a stretch to call them data centers on wheels. Advanced sensors, such as cameras, LiDAR, radar, and sonar, can generate up to 8 Gbits/sec of uncompressed data per sensor.
In addition, cloud-based services for diagnostics, mapping, and software updates require continuous, high-bandwidth communication with the cloud. This adds to the data load in cars, especially as data sharing grows in importance for improving autonomous driving algorithms, enhancing personalization features, and ensuring vehicle safety.
IEEE standards such as 802.3bw (100 Mbits/sec), 802.3bp (1000 Mbits/sec), and more recently 802.3ch (Multigig-T1) accommodate these high-bandwidth applications. However, such high speeds are overkill for lower-bandwidth applications, such as the sensors and actuators used for infotainment, lighting, and powertrain functions.
The more recent 802.3cg (10 Mbits/sec) standard accommodates these applications while keeping the system on a single technology (Ethernet), which reduces both system-design and bill-of-materials costs. IEEE 802.3ch adds physical layer specifications and management parameters for 2.5 Gbits/sec, 5 Gbits/sec, and 10 Gbits/sec operation over distances up to 15 meters on a single balanced pair of conductors. IEEE 802.3ch also allows for high-speed data transmission in a more compact form factor, which is crucial for automotive applications where space and weight are major considerations.
Why Test?
At such high data rates, even the smallest imperfections in the cables or connectors can result in signal degradation, data loss, and interference. This is especially true for mission-critical applications such as automated driving, where the implications go beyond performance problems to safety concerns. If data from LiDAR sensors or cameras is corrupted or delayed because of poor-quality cables or connectors, the vehicle’s ability to respond in real time could be degraded, potentially endangering the passengers.
Bit-error rate (BER) testing is a commonly used method of validating the performance of cables and connectors. A BER test is performed by transmitting a known sequence of bits and comparing the received bits to the original sequence. The error rate is calculated by dividing the number of errored bits by the total number of bits transmitted. For example, if a million bits were transmitted and three errors were detected, the bit-error rate is 3 out of a million, which is expressed as 3 times 10 to the -6 power, 3x10^(-6), or simply as 3e-6. This number expresses the probability that any particular bit is received in error on the tested system.
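The calculation itself is a single division. As a brief sketch, the Python snippet below restates the example above using the same hypothetical counts:

```python
def bit_error_rate(errored_bits: int, total_bits: int) -> float:
    """Ratio of bits received in error to total bits transmitted."""
    return errored_bits / total_bits

# Hypothetical example from the text: 3 errored bits out of one million transmitted
ber = bit_error_rate(errored_bits=3, total_bits=1_000_000)
print(f"BER = {ber:.0e}")  # BER = 3e-06
```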
Three out of a million may not sound too bad, but even a single errored bit can trigger greater data loss when upper protocol layers are involved. Consider that BER measurements are typically performed at the physical or data link layers of a communication system. On an Ethernet link, a bit error received at the physical layer can result in Layer 2 MAC frame FCS (Frame Check Sequence) errors, which cause the entire MAC frame to be discarded. These lost frames can cause higher layer protocols, such as TCP, to request a data retransmission, which increases bandwidth congestion, reducing overall performance in the system.
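To make the FCS mechanism concrete, here is a rough sketch using Python's zlib.crc32, which uses the same CRC-32 polynomial as the Ethernet FCS. The frame bytes are made up, and real-world details such as bit ordering and hardware checksum offload are ignored; the point is only that one flipped bit invalidates the check for the whole frame.

```python
import zlib

# Hypothetical MAC frame contents (header + payload), without the FCS field
frame = bytes.fromhex("ffffffffffff0011223344550800") + b"example payload " * 4

fcs = zlib.crc32(frame)       # FCS computed by the sender (CRC-32)

# A single bit gets corrupted somewhere in the payload during transit
corrupted = bytearray(frame)
corrupted[20] ^= 0x01         # flip one bit

# The receiver recomputes the CRC and compares it with the transmitted FCS
print(zlib.crc32(bytes(corrupted)) == fcs)   # False -> bad FCS, entire frame discarded
```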
The other issue that crops up when performing a BER test on an Ethernet link is the Ethernet frame itself. The protocol inserts non-random bits that are not part of the test sequence, specifically the inter-frame gap, the preamble, the start frame delimiter, and the frame check sequence. If higher-layer protocols are involved, more bits that are not part of the random pattern are introduced, such as the MAC header, VLAN tag, or IP header. The generator/analyzer must parse out the protocol bits that are not part of the random sequence to accurately calculate the BER.
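As a rough sketch of that bookkeeping, assuming the random test pattern fills only the MAC payload (the helper function and frame size below are illustrative, using standard IEEE 802.3 byte counts):

```python
# Per-frame bytes on the wire that are not part of the random test pattern
PREAMBLE = 7      # preamble
SFD = 1           # start frame delimiter
MAC_HEADER = 14   # destination MAC + source MAC + EtherType
FCS = 4           # frame check sequence
MIN_IFG = 12      # minimum inter-frame gap, in byte times

def pattern_vs_overhead_bits(payload_bytes: int, vlan_tag: bool = False) -> tuple[int, int]:
    """Return (test-pattern bits, non-pattern bits) for one frame slot on the wire."""
    overhead = PREAMBLE + SFD + MAC_HEADER + (4 if vlan_tag else 0) + FCS + MIN_IFG
    return payload_bytes * 8, overhead * 8

pattern, overhead = pattern_vs_overhead_bits(payload_bytes=1500)
print(pattern, overhead)   # 12000 test-pattern bits vs. 304 overhead bits per frame slot
```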
Another important metric for testing data integrity, especially over Ethernet, is the frame error ratio (FER). FER is similar in concept to BER, except that the unit of measurement changes from a bit to an Ethernet MAC frame. Specifically, FER is calculated as the number of errored Ethernet frames received divided by the total number of frames received. A frame is considered errored if it has a bad frame check sequence value, a 32-bit CRC calculated over the entire Ethernet frame.
It is important to note that a bit error anywhere within a MAC frame results in the entire frame having a bad FCS value, which in turn causes an Ethernet network device to drop the entire frame. Therefore, a single bit error can result in 12,144 data bits being lost in the case of a 1518-byte Ethernet frame, something that a pure BER measurement does not capture.
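The FER arithmetic mirrors the BER calculation, and the 12,144-bit figure follows directly from the frame size. The short sketch below restates both with hypothetical frame counts:

```python
def frame_error_ratio(errored_frames: int, total_frames: int) -> float:
    """Errored (bad-FCS) frames divided by total frames received."""
    return errored_frames / total_frames

# Hypothetical capture: 2 bad-FCS frames out of 500,000 frames received
print(f"FER = {frame_error_ratio(2, 500_000):.0e}")   # FER = 4e-06

# A single bit error discards the whole frame; for a 1518-byte maximum-size frame
# that is 1518 * 8 = 12,144 bits of data lost.
print(1518 * 8)   # 12144
```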
What to Look For in a Test System
There are several functions required to test the modern automotive in-vehicle network ecosystem.
You need an endpoint traffic generator and analyzer; an inline analyzer for PCS- and packet-level capture and visibility; and an inline network impairment emulator that can limit bandwidth, insert delay, and add impairments to verify that applications are robust enough to handle real-world automotive network connections.
You also need a system that supports a full range of interface rates, including Base-T, Base-T1, Base-RH, the new multigigabit automotive rates of 2.5G/5G/10GBase-T1 (IEEE 802.3ch), and 10Base-T1S (IEEE 802.3cg).
Aukua is at the forefront of supporting the latest automotive Ethernet standards, ensuring robust and reliable performance for next-generation automotive networking applications. With the rapid evolution of in-vehicle networking, Aukua’s solutions fully align with IEEE and OPEN Alliance standards, making them an essential tool for automotive engineers and manufacturers.