Pages

Saturday, January 27, 2024

Time For A YouTube Interlude -- January 27, 2024

******************************
The Book Page

Checked out from the library this past week; enthralled.

Now, need my own copy. It can't be read in two weeks.

Page after page of history so well told. 

Oh, the book?

The Gulf: The Making of an American Sea, Jack E. Davis, c. 2017.

It just seems that if you live on or along the coast, you need this book on your bookshelf.

**************************
Flashback
1990 - 1991

*********************

David Lynch.

An MRO Chimney Butte Well Looking At A Second Jump In Production -- January 27, 2024

Locator: 46670B.

From this post.

January 27, 2024:

  • 20233, 947, MRO, Marlin 44-12H, Chimney Butte, t6/11; cum 342K 11/23; was off line for about one year; now back on line, 11/23; looks like it might be a second jump in production.
[NDIC monthly production table (Pool, Date, Days, BBLS Oil, Runs, BBLS Water, MCF Prod, MCF Sold, Vent/Flare), 6/2022 -- 11/2023: steady production through 11/2022; off line 12/2022 -- 10/2023; back on line 11/2023. The data columns are garbled here; go to the source for the numbers.]

Much earlier, the first jump in production:

[NDIC monthly production table for the same well, 3/2017 -- 9/2018, covering the first jump in production. The data columns are garbled here; go to the source for the numbers.]

Is Petro-Hunt Getting Ready To Complete Several DUCs In Little Knife? January 27, 2024

Locator: 46669PETROHUNT.

Updates

July 18, 2024: all three wells of interest now back on line and doing very, very well; production updates:

  • 35809, loc/NC-->loc/A, Petro-Hunt, Sabrosky 144-97-7C-6-3H, Little Knife, t6/21; cum 181K 5/24;
  • 35808, loc/NC-->loc/A, Petro-Hunt, Hagen 144-98-12D-1-3H, Little Knife, t6/21; cum 189K 5/24;
  • 35807, conf-->drl/A, Petro-Hunt, Hagen 144-98-12D-1-2H, Little Knife, t6/21; cum 177K 5/24;

Original Post

From this post from June 20, 2021:

Three wells of interest:

  • 35809, loc/NC, Petro-Hunt, Sabrosky 144-97-7C-6-3H, Little Knife,
  • 35808, loc/NC, Petro-Hunt, Hagen 144-98-12D-1-3H, Little Knife,
  • 35807, conf, Petro-Hunt, Hagen 144-98-12D-1-2H, Little Knife,

These three wells have just come off line, suggesting the DUCs on the corresponding pads are about to be completed (or are already completed, with the paperwork yet to catch up). 

Maps:



IEEFA Assessment Of The Bakken -- November, 2019 -- Posted January 27, 2024

Locator: 46668B.

Newer Assessments

Lynn Helms, et al, Advancements and Operational Insights in the Bakken Shale: submitted November 16, 2023; reviewed November 22, 2023; published January 17, 2024: link here.

Novi Labs, through May, 2023: link here.

Original Post

Link here.

IEEFA assessment of the Bakken, dated November, 2021. A quick read of this report suggests the Bakken will be pretty much globally insignificant by 2025, or even earlier.

I can't recall if I've linked this report or posted it before. Whatever.

An "Oil Drum" flashback, first posted in 2013, then again in 2018, and now again in 2023, ten years after the original post.

Production Update On A Spectacular Grayson Mill State Well -- How Do You Like The Bakken Now? January 27, 2024

Locator: 46667B.

From this post.

The well:

  • 37630, 4,419, Grayson Mill, State 35-2 XW 1H, Stony Creek, t--; sited at SESW 25-155-100; 24,440 bbls over 8 days extrapolates to 91,650 bbls crude oil over 30 days; cum 223K 11/23;
[NDIC monthly production table for #37630, 1/2023 -- 11/2023: first production (eight days) in 2/2023, then full months through 11/2023. The data columns are garbled here; go to the source for the numbers.]
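The 30-day extrapolations in these well notes are simple pro-rating of a partial month; a minimal sketch of the arithmetic, using the figures quoted above for #37630:

```python
def extrapolate_30_day(bbls: float, days: float) -> float:
    """Pro-rate a partial month of production to a full 30-day month."""
    return bbls / days * 30

# Figures quoted above: 24,440 bbls over the first 8 days on line.
print(extrapolate_30_day(24_440, 8))   # 91650.0 bbls over 30 days
```

The same arithmetic gives the 12,640-bbl figure quoted for #19216 (3,792 bbls over nine days).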

 The well of interest:

  • 19216, 3,442, Grayson Mill, State 36-1 3H, t6/12; cum 333K 1/23; 3,792 bbls over nine days extrapolates to 12,640 bbls over 30 days; compare to 1,000 bbls over 30 days prior to neighboring fracks; cum 438K 11/23;
[NDIC monthly production table for #19216, 6/2022 -- 11/2023: off line 9/2022 -- 12/2022; back on line 1/2023 at much higher rates. The data columns are garbled here; go to the source for the numbers.]
  • 18307, 3,236, Grayson Mill, State 36-1 1-H, t1/10; cum 408K 1/23; cum 526K 11/23;
[NDIC monthly production table for #18307, 6/2022 -- 11/2023: off line 9/2022 -- 12/2022; back on line 1/2023 with a jump in production. The data columns are garbled here; go to the source for the numbers.]

Family Sports Note -- Nothing About The Bakken -- January 27, 2024

Locator: 46665SPORTS.

The highlight of our world -- with regard to sports -- over the weekend. 

Yesterday in the cold and the rain, our granddaughter Olivia scored the only goal and the winning goal in the girls' soccer game.

Olivia is one of the team's captains, and the only captain returning from last year. The other two captains were seniors and they have moved on. Olivia was the first junior in the school's history to be named a captain -- at least as far as anyone can recall. 

The team won the Texas state soccer championship last year -- and Olivia was a big part of that accomplishment.

So, fast forward to yesterday in the cold and the rain, 0 - 0 at the end of regulation time. The only really, really big moment in regulation time was Olivia stopping an opponent's goal attempt. It was huge and quite remarkable. The goalie was "taken out" and the goal unguarded just as the opposing team took a shot -- and with an incredible run and dive, Olivia stopped the ball at the goal line.

In the five-shot overtime shootout, it was 2 - 1 going into the fourth round. The opposing team's shooter kicked and missed. Olivia, pure serendipity, was our team's fourth of five to shoot. She scored; that ended the shootout at four rounds, and our team won, 1 - 0. 

Whoo-hoo! 

So, one incredible save and the winning goal. The team's captain with the only goal and the winning goal. How fitting.

But, wow it was wet, windy and cold. 

History OF US LNG Export Industry -- January 27, 2024

Locator: 46664LNG.
Locator: 46664DEFLATION.

This is an incredible piece of work by Stephen Stapczynski over at twitter / X.

Two milestones:

  • the Bakken revolution
  • Aubrey McClendon

*****************************
Inflation Watch

Again, at the "most expensive" of the six+ grocery stores in our local area.

Great marketing and loyalty rewards, but at the end of the day it's all about the post-pandemic supply chain. 


 I would argue that the price of Tide detergent is fairly "price inelastic."

The government's market basket when pricing Tide: $7.49.

The consumer's market basket when pricing Tide: $3.99.

Tell me again US inflation is running at 9%. This is classic deflation. Not disinflation but deflation.
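The gap between those two Tide prices works out to nearly half off; a quick check of the arithmetic:

```python
# Price gap on the Tide example above.
list_price = 7.49    # "the government's market basket"
shelf_price = 3.99   # "the consumer's market basket"

pct_change = (shelf_price - list_price) / list_price * 100
print(round(pct_change, 1))   # -46.7 -- a price cut, not 9% inflation
```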

A few places where we won't see much (if any) deflation in the US in 2024: restaurant meals; airline travel; shares of NVDA.

******************************
A Second Language

The most commonly spoken language other than English, by state.

This is why I'm studying Spanish on Duolingo.

Energy Demand -- January 27, 2024

Locator: 46663TECH.

Link here.

The WSJ, December 15, 2023:

Some experts project that global electricity consumption for AI systems could soon require adding the equivalent of a small country’s worth of power generation to our planet. That demand comes as the world is trying to electrify as much as possible and decarbonize how that power is generated in the face of climate change.

Since 2010, power consumption for data centers has remained nearly flat, as a proportion of global electricity production, at about 1% of that figure, according to the International Energy Agency. But the rapid adoption of AI could represent a sea change in how much electricity is required to run the internet—specifically, the data centers that comprise the cloud, and make possible all the digital services we rely on.

This means the AI industry is poised to run the equivalent of a planetary-scale experiment, according to experts both outside and inside the industry. This has some of them wringing their hands, while energy suppliers are practically salivating at the expected increase in demand.

And then this:

Constellation Energy, which has already agreed to sell Microsoft nuclear power for its data centers, projects that AI’s demand for power in the U.S. could be five to six times the total amount needed in the future to charge America’s electric vehicles.

Alex de Vries, a researcher at the School of Business and Economics at the Vrije Universiteit Amsterdam, projected in October that, based on current and future sales of microchips built by Nvidia specifically for AI, global power usage for AI systems could ratchet up to 15 gigawatts of continuous demand. Nvidia currently has more than 90% of the market share for AI-specific chips in data centers, making its output a proxy for power use by the industry as a whole.

That’s about the power consumption of the Netherlands, and would require the entire output of about 15 average-size nuclear power plants.
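A back-of-envelope check of the de Vries projection. The one-gigawatt "average plant" size and the hours-per-year conversion below are my assumptions; the article itself only gives the 15-GW figure:

```python
# Back-of-envelope check of the 15-GW continuous AI demand projection.
ai_demand_gw = 15.0       # de Vries' projected continuous demand
plant_gw = 1.0            # assumed output of an average-size nuclear plant
hours_per_year = 8_760

plants_needed = ai_demand_gw / plant_gw
annual_twh = ai_demand_gw * hours_per_year / 1_000   # GWh -> TWh

print(plants_needed)   # 15.0 -- matches the "about 15 plants" in the article
print(annual_twh)      # 131.4 TWh/yr, roughly Dutch annual electricity use
```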

Story Of The Day? January 27, 2024

Locator: 46661TECH.

Link here.

From Forbes, September 11, 2023:

To understand the significance of Dojo, one needs to examine the existing milieu of AI processors and supercomputing. Conventional supercomputers, typified by NVIDIA's A100 GPUs, IBM's Summit, or HPE’s Cray Exascale, have been vital in scientific research, complex simulations, and big data analytics. However, these systems are primarily designed for a broad array of tasks rather than optimized for a singular purpose like the real-world data-driven AI computer vision that Tesla is designing Dojo for.

Tesla’s Dojo promises to revolutionize the AI processing landscape by focusing solely on improving the company's FSD capabilities. With this vertical integration, Tesla aims to construct an ecosystem that encompasses hardware, data, and practical application—a trifecta that could usher in a new era of supercomputing, explicitly designed for real-world data processing.

Historically, Tesla relied on NVIDIA's GPU clusters to train its neural networks for autopilot systems. Despite the lack of clarity in performance metrics such as single-precision or double-precision floating-point calculations, Tesla claimed to operate a computing cluster that stood as the "fifth-largest supercomputer in the world" as of 2021. Details are hard to come by but various commentators have put Tesla’s hoard of NVIDIA A100 GPUs at over 10,000 units, which by any stretch puts Tesla as having one of the largest training systems globally, and the company has been at this for at least 2 years now.

From wiki:

Tesla Dojo is a supercomputer designed and built by Tesla for computer vision video processing and recognition.
It will be used for training Tesla's machine learning models to improve its Full Self-Driving advanced driver-assistance system. According to Tesla, it went into production in July 2023.
Dojo's goal is to efficiently process millions of terabytes of video data captured from real-life driving situations from Tesla's 4+ million cars.
This goal led to a considerably different architecture than conventional supercomputer designs.
Tesla operates several massively parallel computing clusters for developing its Autopilot advanced driver assistance system.
Its primary unnamed cluster using 5,760 Nvidia A100 graphics processing units (GPUs) was touted by Andrej Karpathy in 2021 at the fourth International Joint Conference on Computer Vision and Pattern Recognition (CCVPR 2021) to be "roughly the number five supercomputer in the world" at approximately 81.6 petaflops, based on scaling the performance of the Nvidia Selene supercomputer, which uses similar components.
However, the performance of the primary Tesla GPU cluster has been disputed, as it was not clear if this was measured using single-precision or double-precision floating point numbers (FP32 or FP64).
Tesla also operates a second 4,032 GPU cluster for training and a third 1,752 GPU cluster for automatic labeling of objects.
The primary unnamed Tesla GPU cluster has been used for processing one million video clips, each ten seconds long, taken from Tesla Autopilot cameras operating in Tesla cars in the real world, running at 36 frames per second. Collectively, these video clips contained six billion object labels, with depth and velocity data; the total size of the data set was 1.5 petabytes. This data set was used for training a neural network intended to help Autopilot computers in Tesla cars understand roads.
By August 2022, Tesla had upgraded the primary GPU cluster to 7,360 GPUs.
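The "roughly the number five supercomputer" claim above rests on linear scaling against Nvidia's Selene machine; a sketch of that estimate (Selene's GPU count and Linpack figure here are my numbers, not from the excerpt):

```python
# Linear-scaling estimate behind the ~81.6-petaflop figure for Tesla's
# primary A100 cluster, using Nvidia's Selene as the reference system.
selene_gpus = 4_480       # assumed: A100 count in Selene
selene_pflops = 63.46     # assumed: Selene's measured Linpack petaflops
tesla_gpus = 5_760        # from the excerpt above

tesla_pflops = selene_pflops * tesla_gpus / selene_gpus
print(round(tesla_pflops, 1))   # 81.6
```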
TOP500 supercomputer list, wiki.

Datacenterdynamics, 80 petaflops.

Bottom line: this is the kind of investment every EV automobile manufacturer will have to make -- and there are no fewer than a dozen EV manufacturers -- or buy the information from Elon Musk.

Ticker:


Disclaimer: this is not an investment site. Do not make any investment, financial, job, career, travel, or relationship decisions based on what you read here or think you may have read here. 

All my posts are done quickly: there will be content and typographical errors. If anything on any of my posts is important to you, go to the source. If/when I find typographical / content errors, I will correct them. 

Again, all my posts are done quickly. There will be typographical and content errors in all my posts. If any of my posts are important to you, go to the source.

Reminder: I am inappropriately exuberant about the US economy and the US market.