Thursday, January 1, 2026

Tesla Jargon -- Body-In-White -- BIW -- January 1, 2026

Locator49780TESLA.

In an earlier post the Tesla term "body-in-white" (BIW) was mentioned. 

For a nice summary, google: "tesla "body-in-white" concept." 

If you do that, it will lead you to this:  

Tesla's "Body-in-White" (BIW) concept involves creating the car's bare metal skeleton, focusing heavily on mega-casting on (large single-piece castings for front/rear structures) and innovative designs for strength, cost, and simplified assembly, differing from traditional BIWs by integrating structural battery packs and reducing part count for efficiency, as seen in the Model Y and Cybertruck developments. 

Tesla is not alone:

Many major automakers use the Body-in-White (BIW) process, including Ford, GM, BMW, Mercedes-Benz, Volkswagen, Toyota, Hyundai, Volvo, Polestar, Nissan, Kia, Jaguar, Lucid, Rivian, Nio, XPeng, and Porsche. Many integrate advanced techniques like mega-casting (Tesla's signature) for their electric vehicles (EVs) to reduce weight and improve space, while suppliers like Magna also provide BIW solutions for various OEMs. 

For a full discussion, link here to Pall Kornmayer. This is the lede:

Why is it called "body-in-white?"

It's called "Body in White" (BIW) because it refers to the bare, welded metal car body before painting. Historically that body received a white primer coat for rust protection, and the term stuck even as primers became grey or other colors. It signifies the unfinished shell stage before trim, engine, and other components are added. 
The "white" part comes from old practices where the raw body was painted white to allow customization later, or from dipping it in white primer, while "body" refers to the assembled structure.  
Key Meanings of "Body in White": 
  • Manufacturing Stage: It's the completed shell of sheet metal parts (roof, sides, etc.) welded together, forming the vehicle's structure, but before paint, interior, or mechanicals. 
  • Historical Origin: In early car manufacturing, bodies were built separately and often painted white (or a primer that looked white/light) to offer customers color choices later or to protect the bare metal. 
  • Modern Context: Today, it still describes this fundamental stage, even though the primer is often grey or the body is treated with other coatings before final painting. 

Why it's Important: 

  • Structural Integrity: The BIW provides the strength, stiffness, and safety (crashworthiness) for the entire vehicle. 
  • Foundation: It's the base to which all other parts, like the engine, chassis, interior, and exterior trims, are added.

Tech Jargon -- EDA -- January 1, 2026

Locator49779TECH.

See this post, the Google empire.

EDA: the design and verification step that comes before chips go into mass production. Duopoly: Cadence and Synopsys.  

Chip manufacturing: EDA (Electronic Design Automation) -- link to YouTube (https://www.youtube.com/watch?v=4QpQjZuAzBw) -- refers to:

  • specialized software, hardware, and services 
  • that automate the complex process of 
  • designing, verifying, and manufacturing integrated circuits (chips), 
  • enabling engineers to manage billions of transistors and features, 
  • ensuring functionality, performance, and power efficiency before physical production.

Major players like Cadence, Synopsys, and Siemens EDA -- link to Forbes (https://www.forbes.com/sites/karlfreund/2025/04/29/eda-vendors-help-intel-get-the-usa-back-into-chip-manufacturing/) -- dominate the market. 

EDA tools cover the entire chip lifecycle, from initial concept and logical design to physical layout, simulation, and testing, with AI now enhancing productivity and shift-left development.
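
To make the "verifying" bullet concrete, here is a toy sketch in Python (not an actual EDA tool; the function names are made up for illustration): check that a gate-level implementation of a one-bit adder matches a behavioral spec for every possible input. Real EDA verification suites automate the same idea -- with simulation and formal methods rather than brute force -- for designs with billions of transistors.

# Toy illustration only: exhaustively verify a tiny gate-level design
# against a behavioral reference model.

def full_adder_gates(a: int, b: int, cin: int) -> tuple[int, int]:
    """Gate-level 1-bit full adder built from XOR/AND/OR."""
    s1 = a ^ b
    total = s1 ^ cin                  # sum bit
    carry = (a & b) | (s1 & cin)      # carry-out bit
    return total, carry

def full_adder_spec(a: int, b: int, cin: int) -> tuple[int, int]:
    """Behavioral reference: just add the bits."""
    value = a + b + cin
    return value & 1, value >> 1

def verify() -> None:
    for a in (0, 1):
        for b in (0, 1):
            for cin in (0, 1):
                assert full_adder_gates(a, b, cin) == full_adder_spec(a, b, cin), \
                    f"mismatch at a={a}, b={b}, cin={cin}"
    print("all 8 input combinations match the spec")

if __name__ == "__main__":
    verify()

Run it and it prints "all 8 input combinations match the spec"; flip one gate and the assertion pinpoints the failing input.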

Shift-left development: as product development moves from early stages to completion, it is generally drawn as an arrow moving left to right. As the name suggests, shift-left means products are tested earlier in the process -- a step back to the left -- rather than only at the end of the process (far right). A sketch of the idea follows. 
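
A minimal sketch of that shift-left idea, again in Python and with made-up stage names (not a real tool chain): each stage of the flow runs its own check the moment the stage finishes, so a failure surfaces right there instead of at the far right of the arrow.

# Illustrative only: a three-stage flow where each stage is checked
# immediately ("shift-left") rather than only at the end.

from typing import Callable

Stage = tuple[str, Callable[[], bool]]   # (stage name, check run right after it)

STAGES: list[Stage] = [
    ("logic design",    lambda: True),   # e.g., lint and functional simulation
    ("synthesis",       lambda: True),   # e.g., equivalence check vs. the RTL
    ("place and route", lambda: True),   # e.g., timing and design-rule checks
]

def run_flow(stages: list[Stage]) -> None:
    for name, check in stages:
        print(f"running stage: {name}")
        if not check():
            # Shift-left payoff: the failure is caught at this stage,
            # not after the whole flow has completed.
            raise RuntimeError(f"check failed immediately after {name}")
    print("flow complete; every stage was verified as it finished")

if __name__ == "__main__":
    run_flow(STAGES)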

Shay: Five Companies Best Positioned For The Physical AI Economy In 2026 -- Shay Boloor -- January 1, 2026

Locator49778TECH.

Six months ago I was ignoring robots and drones. 

Link here

AI Prompt: six months ago I was skeptical of robots and drones. Now, I'm a believer. With regard to robots, I would think it's hard to compare robot manufacturers. There are so many different kinds of robots, some very simple, some very complex. I think some folk still argue about the definition of a "robot." With all those caveats in place, which company is most advanced when it comes to -- not robots per se -- but the USE / EMPLOYMENT of robots in their core business. Which company is making the best use of robots; will that lead endure? Thoughts on robots in general? 

The short answer:

  • Amazon is the clear global leader in the use of robots as an integrated business system.
  • Tesla is the most ambitious second, but with higher execution risk.
  • Foxconn is the most underappreciated.
  • Ocado is the most elegant niche example.

Yes, Amazon's lead is likely to endure, though not unchallenged

This is most interesting: robots work best when the product is designed for robots -- I would re-phrase that: robots work best when the product is designed with robots in mind. Amazon has this figured out. Tesla, not so much.  

Those excelling in the robotic arena:

  • Amazon: the gold standard for robot employment.
  • Tesla: the boldest bet on robot-native manufacturing --
    • extreme automation in 
      • gigafactories
      • battery production
      • body-in-white.
    • but Tesla robotics has its weaknesses
  • Foxconn: the quiet industrial robot superpower
    • operates some of the most robot-dense factories on earth
    • Foxconn's robots are not flashy; they are relentlessly pragmatic
      • that mindset scales.
  • Ocado: the most beautiful example of robotic coordination
    • Ocado (UK grocery fulfillment) shows what happens when robots are designed as a collective system, not individuals. 
    • their weakness: scale and capital intensity
    • something for BRK to think about?
  • Others worth mentioning (briefly):
    • JD.com
    • Toyota
    • Walmart

A crucial distinction most people miss:

  • there are three different "robotics games":
    • robots as tools (most factories)
    • robots as systems (Amazon, Ocado)
    • robots as labor substitutes (humanoids, still early)

Most companies are stuck at #1.
The winners live at #2.
#3 is coming -- but slower than the hype suggests.

 

The Google Empire -- January 1, 2026

Locator49777TECH.

Updates

January 2, 2026: from below -- 

A key insight most people miss:

  • hyperscalers to not fear competition from each other. 
  • They fear capacity denial at choke points.
 From Shay today, link here

Original Post  

Before we get started: another reason I like ChatGPT -- it points me to articles I otherwise would have missed. In this case, "EDA Vendors," Forbes, April 29, 2025 -- earlier this year, which shows how fast things are moving. Link here

This page needs to be formatted and proofread; there will be a lot of content and typographical errors, but I wanted to get it posted immediately for various reasons.

Note: this was all typed out, one letter / one character at a time. There was absolutely no "cut and paste" on this page (except the one URL).  

Below the fold -- way down below -- is an extensive listing and explanation of the Google support system. 

1. I sent this exact list to ChatGPT to see how/if AWS differs. The answer was interesting.

2. I then asked what suppliers are the true chokepoints vs interchangeable.

The true choke points:

  • ASML (EUV lithography)
  • TSMC (leading-edge foundry)
  • Nvidia (for now) 
  • Synopsys + Cadence (EDA duopoly)
  • ARM (ISA licensing)

Soft choke points:

  • HBM memory suppliers (SK Hynix, Micron, Samsung)
  • Advanced packaging (CoWoS, Amkor, ASE)
  • Optical components (Lumentum, Coherent)
  • High-speed networking silicon (Broadcom, Marvell)

Largely interchangeable (relatively speaking):

  • Storage (Western Digital, Seagate)
  • Data center infrastructure (Vertiv, Schneider, Eaton)
  • Foundry-adjacent services

A key insight most people miss:

  • hyperscalers do not fear competition from each other. 
  • They fear capacity denial at choke points.

That's why

  • everyone courts TSMC
  • everyone invests in custom silicon
  • everyone hoards HBM
  • everyone diversifies optics suppliers even at high cost 

The real power map looks like this:

  • ASML --> TSMC --> EDA --> ARM --> Nvidia/HBM --> Hyperscalers --> Everyone else

Tech jargon: EDA, link here.  

Why this matters (especially for investors and policy makers)

  • pricing power lives at choke points
  • geopolitical risk lives at choke points
  • innovation bottlenecks live at choke points
  • most "AI arms race" headlines are downstream noise

This was my next question: a completely different question because so much of the above relates to the AI LDCs, LLMs, cloud development and management, WHEREAS Apple as a consumer electronics company is completely different. But you mentioned the importance of CUSTOM SILICON. Apple seems to have been the leader in custom silicon. Any thoughts on that?

The short answer: that's a great pivot -- and it actually clarifies the entire custom-silicon story rather than changing it. Apple isn't an exception to the rule: Apple is the proof case.

  • Apple understood custom silicon before "AI infrastructure" existed.
  • Apple's custom silicon succeeded where others failed
    • vertical integration
    • volume certainty
    • long-term patience 

Apple has the same choke points: same list, different leverage. Apple still depends on:

  • TSMC (absolutely)
  • ASML (indirectly)
  • ARM (ISA)
  • Cadence / Synopsys (EDA)

Apple does not escape the chokepoints; Apple addresses them differently:

  • consume them earlier;
  • lock them longer
  • integrate them more deeply

Apple is TSMC's most strategically important customer.

Then a long discussion on Apple vs other hyperscalers with regard to AI. Bottom line, the irony: hyperscalers are becoming more like Apple.

Then a long discussion on why Apple's lead is durable. The reason: it is very, very difficult for others to copy what Apple does.

Then the bottom line: another long discussion but at the end:

  • Apple is not different from the custom silicon story -- Apple is the clearest version of it. 

*********************************
The Alphabet Empire

From Shay: link here. 

The infrastructure behind the Alphabet empire:


AI Chips

  • Broadcom helps Google design custom TPUs so it can lower AI chip costs and avoid Nvidia pricing.
  • TSMC: the only foundry currently capable of producing Google's leading-edge TPUs at scale with acceptable yields.
  • ARM licenses the CPU architecture Google uses alongside TPUs for AI inference and control.
  • Cadence Design Systems sells the software Google uses to design each new generation of AI chips.
  • Synopsys provides chip testing / IP so Google can ship complex TPUs without failures.
  • Amkor Technology packages TPUs + memory together so Google can run them at data-center scale.

AI Networking

  • Astera Labs supports high-speed rack-level connectivity as Google scales TPU pods.
  • Marvell Technology supplies custom networking chips inside Google's AI data hardware.
  • Arista Networks supplies switches that route traffic inside Google's AI clusters.
  • Ciena Corp moves data between Google's data centers over long distances

AI Utility

  • Cipher Mining supplies energy-backed sites supporting large AI workloads.
  • Terawulf operates power-dense infrastructure where large-scale compute can be deployed.

AI Memory

  • Micron adds DRAM + HBM supply as Google expands AI inference.
  • Western Digital stores the massive datasets Google uses to train AI models.

AI Optics

  • Lumentum Holdings supplies optical components used inside Google's AI data centers.
  • Coherent Corp provides lasers needed for high-speed optical data transmission.

AI Power

  • Vertiv Holdings provides cooling infrastructure that keeps Google's AI hardware online.

Day 1 -- From The X-Files -- January 1, 2026

Locator49776ARCHIVES.

January 1, 2026: As I've said before, I would love to have Christmas Day and New Year's Day fall on Thursday every year. Yesterday was really relaxing; today will be even more so. Then tomorrow we get the pleasure of tip-toeing back into the swing of things, before the weekend. Finally, Monday, we may be ready to start the new year. Sophia has until next Wednesday to get back to school. Whoo-hoo!

Nothing to do but wait for the college football games today.

NFL: biggest story in last 24 hours -- Netflix set a new record on Christmas Day with 25.7 million viewers for its livestream of the Minnesota Vikings - Detroit Lions matchup, making it the most-streamed NFL game in history -- Variety via Evan. Thank you. 

Flu: I'll post the links and graphics later, but for now, the CDC is reporting that "seasonal flu" is now surging. Apparently Texas has been hard hit but we haven't noticed a thing. Yet. 

Nvidia: making so much money; generating so much cash flow, can't spend it all; "... getting creative as options to use its cash flood narrow..." -- The WSJ, link here

For the record: the S&P 500 gained 16 percent this past year (2025). AI helped a lot, but dependence on artificial intelligence remains a risk for 2026. Now, who would say that? Yup, The New York Times. Link here

Investing: that's about all I'm following on the x-files now -- 

  • Export licenses: apparently all the "routine" export licenses for China have been renewed;
  • Tariffs: Trump pulling back, fast and furious; 
  • Beth: noted how prices will change with reapproval of exports of Nvidia's H200 GPU to China; the upcoming ramp of Google's TPU v7; and Amazon's Trainium. Link here
  • Beth: it's no bubble. Link here. xAI has bought a third building in Memphis to expand its AI training capacity to nearly 2GW, with the company aiming to scale its Colossus cluster to at least one million GPUs. An observation and a comment:
    • 2GW seems to be the AI standard now; anything less is ho-hum; has to trend toward 5GW before it gets a headline;
    • I wonder where xAI is getting its million GPUs?
  • Beth: margins. Soaring memory prices. The big three (Samsung, SK Hynix and Micron) will easily see gross margins of 63% to 67%, above TSMC's 60% guide. Link here.
  • Beth: Citi estimates that Broadcom's custom AI accelerator revenue will double again to $100 billion in 2027, with Google contributing half at $50.1 billion, along with $20.2 billion in revenue from its deal with OpenAI. Link here
    • Broadcom's custom AI accelerator revenue is projected to be as high as $50.5 billion in fiscal 2026, up 3.5x from 2025, with Google accounting for roughly half of that total at $25.8 billion, followed by Anthropic at $20.9 billion. 
  • Beth: BofA predicts semiconductor sales will rise 30% y/y to surpass the $1 trillion milestone in 2026, ahead of industry expectations for 26% growth to $975 billion in sales, on strong AI and memory demand -- added January 2, 2026. 

Refunds: the IRS generally starts accepting and processing returns in late January, and typically begins sending out most refunds within 21 days of acceptance.

Bubble? What Bubble? -- January 1, 2026

Locator49775BUBBLE.