If you can't get to sleep on a Friday or a Saturday night, I highly recommend you go to the "links" at my "link page," linked below. For what I enjoy, these are incredibly good links.
I could find something to post from each of the links below if I wanted, but there's one link investors need to visit, especially those investors upset with their portfolios right now because of "the war."
Or go directly to the story: link here. The link takes you to a very, very long blog.
This is the lede:
Exporters to consider:
To extended family members: I hold shares in four of the five companies listed. Whoo-hoo! And I've held some of them for decades! Most (all?) of them pay dividends.
Does BRK / Warren Buffett own shares in any of these five companies? No.
I absolutely cannot believe I ended up talking with Russ Bodnyk for about an hour today. Had I known, I would have spent three hours with him, but my wife needed to "move on."
Note: this is for my personal edification and for some members of the extended family. This is not for the general reader of the blog.
I had the highly unexpected opportunity to spend an hour with Russ Bodnyk today. Holy mackerel.
For the date-time stamp.
Russ Bodnyk. Interview. Intellibus. What's important about this video for AI novices: the jargon. The interview lasts almost three hours. Parameters. See also this post. See also this post.
Billions of parameters:
7 billion -- generally all he needs (Llama 1);
13 billion -- occasionally uses;
70 billion -- can access if necessary;
120 billion --
170 - 500 billion --
150 trillion -- about what the human mind is estimated to have.
The current LLMs are roughly equivalent to the mouse brain.
If current LLMs have upwards of 150 billion parameters and are about equivalent to the mouse brain (in synapse count), when is it estimated that LLMs will come close to matching the human brain?
Bodnyk, however, thinks it's more than just counting synapses and parameters.
Genetics: I brought this up. The DNA - RNA dichotomy. DNA is boring; nothing there, in the big scheme of things, compared with RNA. RNA is where all the work is done. DNA is simply storage (DRAM).
LLMs' doubling time: 16 months (notice that beats Moore's Law of 24 months).
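A quick back-of-the-envelope sketch, using only the rough figures above (about 150 billion parameters today, about 150 trillion synapses in the human brain, parameter counts doubling every 16 months): how long until LLMs reach human-brain scale? Interestingly, the answer lands in the same ballpark as the "fourteen years" figure discussed below.

```python
import math

# Rough figures from the notes above -- not precise measurements.
current = 150e9        # ~150 billion parameters in current LLMs
target = 150e12        # ~150 trillion synapses in the human brain
doubling_months = 16   # LLM parameter-count doubling time

doublings = math.log2(target / current)   # ~10 doublings (a factor of 1,000)
months = doublings * doubling_months      # ~159 months
years = months / 12                       # ~13.3 years
print(f"{doublings:.1f} doublings -> {months:.0f} months -> {years:.1f} years")
```

Of course, as Bodnyk notes, this assumes matching the brain is just a matter of counting synapses and parameters, which he doubts.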
Tag: Yann LeCun AMI Kurzweil
Ilya Sutskever: co-founder of OpenAI, the company behind ChatGPT.
Geoffrey Hinton: University of Toronto; Sutskever's doctoral advisor.
Yann LeCun: the Jacob T. Schwartz Professor of Computer Science at the Courant Institute of Mathematical Sciences at New York University. He served as Chief AI Scientist at Meta Platforms before leaving to work on his own startup company.
AI prompt: Russ Bodnyk often states that AI will reach its first major milestone fourteen years from now. Why would he say fourteen years, as opposed to ten, fifteen, or twenty? What's so "special" about fourteen years from now?
Reply:
Book: Why Machines Learn: The Elegant Math Behind Modern AI, Anil Ananthaswamy, c. 2024/2025.
This was first posted back in 2019 / 2021. It took a little time to find. I don't want to lose it again, so I'm re-posting it for the archives and tagging it.