Wednesday assorted links

1. Scenario for an independent Greenland.

2. Gender gaps in the Federal Reserve System.

3. Zvi on congestion pricing in NYC.

4. A short (pronunciation) saga from Kearny, NJ, the place of my birth.

5. Milei is artificially slowing the devaluation of the peso (FT).

6. Henry Farrell on America’s plan to control global AI (hint: it is not going to work).

The post Wednesday assorted links appeared first on Marginal REVOLUTION.


YoY Measures of Inflation: Services, Goods and Shelter

Here are a few measures of inflation:

The first graph is the one Fed Chair Powell had mentioned when services less rent of shelter was up around 8% year-over-year.  This declined, but is still elevated, and is now up 4.0% YoY.

Services ex-Shelter: Click on graph for larger image.

This graph shows the YoY price change for Services and Services less rent of shelter through December 2024.

Services were up 4.4% YoY as of December 2024, down from 4.5% YoY in November.

Services less rent of shelter was up 4.0% YoY in December, down from 4.1% YoY in November.

Goods CPI: The second graph shows that goods prices started to increase year-over-year (YoY) in 2020 and accelerated in 2021 due to both strong demand and supply chain disruptions.

Durables were at -1.9% YoY as of December 2024, up from -2.0% YoY in November.

Commodities less food and energy commodities were at -0.5% YoY in December, up from -0.7% YoY in November.

Shelter: Here is a graph of the year-over-year change in shelter from the CPI report (through December) and housing from the PCE report (through November).

Shelter was up 4.6% year-over-year in December, down from 4.8% in November. Housing (PCE) was up 4.8% YoY in November, down from 5.0% in October.

This is still catching up with private new lease data.  

Core CPI ex-shelter was up 2.1% YoY in December.
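All of the figures above are year-over-year percent changes computed from index levels. As a minimal sketch, the arithmetic looks like this (the index values below are illustrative, not the official BLS numbers; real series come from the BLS CPI release or FRED):

```python
# Sketch: computing a year-over-year (YoY) percent change from index levels.
# Index values here are hypothetical, chosen only to land near the headline
# "up 2.9 percent" figure for December.

def yoy_change(current: float, year_ago: float) -> float:
    """Return the year-over-year percent change between two index levels."""
    return (current / year_ago - 1.0) * 100.0

# Hypothetical index levels 12 months apart:
print(round(yoy_change(315.6, 306.7), 1))  # roughly 2.9
```

The same formula applied month-over-month (with seasonally adjusted levels) gives the 0.4% and 0.2% monthly changes quoted below.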

BLS: CPI Increased 0.4% in December; Core CPI increased 0.2%

From the BLS:
The Consumer Price Index for All Urban Consumers (CPI-U) increased 0.4 percent on a seasonally adjusted basis in December, after rising 0.3 percent in November, the U.S. Bureau of Labor Statistics reported today. Over the last 12 months, the all items index increased 2.9 percent before seasonal adjustment.

The index for energy rose 2.6 percent in December, accounting for over forty percent of the monthly all items increase. The gasoline index increased 4.4 percent over the month. The index for food also increased in December, rising 0.3 percent as both the index for food at home and the index for food away from home increased 0.3 percent each.

The index for all items less food and energy rose 0.2 percent in December, after increasing 0.3 percent in each of the previous 4 months. Indexes that increased in December include shelter, airline fares, used cars and trucks, new vehicles, motor vehicle insurance, and medical care. The indexes for personal care, communication, and alcoholic beverages were among the few major indexes that decreased over the month.

The all items index rose 2.9 percent for the 12 months ending December, after rising 2.7 percent over the 12 months ending November. The all items less food and energy index rose 3.2 percent over the last 12 months. The energy index decreased 0.5 percent for the 12 months ending December. The food index increased 2.5 percent over the last year.
emphasis added
The change in CPI was close to expectations. I'll post a graph later today after the Cleveland Fed releases the median and trimmed-mean CPI.

A NASA astronaut may have just taken the best photo from space—ever

People who appreciate good astrophotography will no doubt be familiar with the work of Don Pettit, a veteran NASA astronaut who is closing in on having lived 500 days of his life in space.

Pettit is now in the midst of his third stint on the International Space Station, and the decade he had to prepare for his current stay in orbit was put to good use. Accordingly, he is well stocked on cameras, lenses, and plans to make the most of six months in space to observe the planets and heavens from an incredible vantage point.

Ars has previously written admiringly of Pettit's work, but his latest image deserves additional mention. When I first saw it, I was dazzled by its beauty. But when I looked further into the image, there were just so many amazing details to be found.


Why is achieving financial neutrality in organ donation so hard?

For some years now, many opponents of compensation for kidney donors have come to agree that at least donors should not have to bear large expenses to donate.  But it has been hard to operationalize this apparent agreement.

Here's a paper that (somewhat inadvertently) explains why.  They take the position that it is ethically allowable to compensate donors for out-of-pocket financial expenses, but ethically forbidden to do so in a way that might sometimes pay a donor for an expense they might have incurred even if they hadn't donated. (So, for example, they forbid paying anything towards funeral expenses for deceased donors.)

This was a point of view that I encountered when I was on the advisory board of NLDAC, the federally funded U.S. agency that can pay some expenses for poor donors. Originally NLDAC issued special credit cards to donors who qualified, which could only be used for airfare, hotels, and restaurants, i.e. for travel and meals.  But some donors indicated that they preferred not to eat in restaurants, but to go to a supermarket and bring food back to eat in their hotel room.  NLDAC eventually decided that the special credit card could also be used for grocery stores, even though this meant that some donor might sometime buy a dozen apples, bring the uneaten ones with them when they returned home, and thus have received some compensation in addition to their travel expenses. (When I write it like that it seems that I must be exaggerating, but here's an article that argues that any inadvertent generosity to donors would cross a red line, and must therefore be avoided.)

Supporting Financial Neutrality in Donation of Organs, Cells, and Tissues, by Martin, Dominique E. PhD1; Capron, Alexander M. LLB2; Fadhil, Riadh A. S. MD3; Forsythe, John L. R. MD4; Padilla, Benita MD5; Pérez-Blanco, Alicia PhD6; Van Assche, Kristof PhD7; Bengochea, Milka MD8; Cervantes, Lilia MD9; Forsberg, Anna PhD10; Gracious, Noble MD11; Herson, Marisa R. PhD1; Kazancioğlu, Rümeyza MD12; Müller, Thomas PhD13; Noël, Luc MD14; Trias, Esteve MD15; López-Fraga, Marta PhD16,
Transplantation 109(1):p 48-59, January 2025. | DOI: 10.1097/TP.0000000000005197 

Abstract: "The avoidance of financial gain in the human body is an international ethical standard that underpins efforts to promote equity in donation and transplantation and to avoid the exploitation of vulnerable populations. The avoidance of financial loss due to donation of organs, tissues, and cells is also now recognized as an ethical imperative that fosters equity in donation and transplantation and supports the well-being of donors and their families. Nevertheless, there has been little progress in achieving financial neutrality in donations in most countries. We present here the findings of an international ethics working group convened in preparation for the 2023 Global Summit on Convergence in Transplantation, held in Santander, Spain, which was tasked with formulating recommendations for action to promote financial neutrality in donation. In particular, we discuss the potential difficulty of distinguishing interventions that address donation-related costs from those that may act as a financial incentive for donation, which may inhibit efforts to cover costs. We also outline some practical strategies to assist governments in designing, implementing, and evaluating policies and programs to support progress toward financial neutrality in donation."

" The principle of financial neutrality in donation states that donors of organs, cells, and tissues, or donor families, should neither lose nor gain financially as a result of donation.

...

"we explore concerns regarding the use of financial incentives for donation (see Box 1) and discuss potential difficulties in distinguishing legitimate coverage of costs from practices and policies that may provide financial incentives for donation and thus violate the prohibition of trade in SoHOs. We argue that anxiety regarding the use of financial incentives may, in some countries, deter or undermine efforts to remove financial disincentives from donations. 

...

"Efforts to address the costs of living kidney donation notably became a focus in North America in the wake of a decline in living donor rates in the mid-2000s, disproportionately impacting poorer populations.

...

" Like the removal of financial disincentives to donation, avoidance of financial incentives for donation is essential for achieving financial neutrality ... "Sustained efforts are thus needed to deter use of financial incentives and ensure that efforts to promote financial neutrality in donation do not create actual or perceived incentives

...

"Some proposals intended only to cover donation-related costs may inadvertently incentivize donations or result in inappropriate financial gains.

...

"After more than a decade of expressed support for the principle of financial neutrality in donation, it is time for policymakers in all countries to act in pursuit of the goal of financial neutrality."

#############

Earlier: Thursday, March 31, 2022  National Living Donor Assistance Center (NLDAC) support for lost wages and dependent care

Nick Wingfield on Sonos as an Acquisition Target (and a Juicy Tidbit Regarding a Former Apple Exec Who Wanted to Acquire Them)

Nick Wingfield, writing today’s The Briefing column for the paywalled (alas) The Information:

Sonos has always been a bit of an odd duck. There aren’t that many consumer electronics startups of its size created in the last quarter century (Sonos was founded in 2002) that have survived as independent companies. Its products are expensive relative to the wireless speakers that have flooded the market from big-name tech rivals like Amazon and no-name competitors from China. And yet, Sonos held on partly because it had a commitment to high-quality sound and an Apple-like dedication to user experience, both of which gave it a passionate fan base.

The events of the last year seem to have ruptured that relationship with many of its customers. Today, Spence’s replacement — Tom Conrad, a Sonos board member, who is now interim CEO — reportedly told staff he’s focused on repairing those relations. If he’s unsuccessful, it’s fair to wonder whether Sonos — whose market capitalization is around $1.7 billion — might be better off selling itself to a bigger rival like Amazon, Google or Apple.

Years ago, a former senior Apple executive told me he once begged Steve Jobs, who was then Apple’s CEO, to buy Sonos. Jobs wasn’t interested. A lot has changed since then, but the Sonos brand still might have enough cachet to interest a more powerful suitor.

A tidbit like that immediately set my mind racing as to who that “former senior Apple executive” was. It took me only a few seconds to make my guess: Scott Forstall. There are a few other senior Apple executives who I can imagine might have pushed Jobs to pursue an acquisition of Sonos, but none of them are “former”. They’re all still at Apple.

The only other possibility I can think of is Tony Fadell, but Fadell is a hardware guy and a builder by nature. He even titled his book Build. I think he’d have wanted to spearhead the creation of Apple’s own lineup of Sonos-like audio kit under the iPod brand, not acquire them. But Fadell is a maybe.

No one else really fits the bill. Bob Mansfield? Bertrand Serlet? Nah. Jony Ive? Doesn’t sound right. Jon Rubinstein? He left Apple in April 2006, which I think predates Wingfield’s time covering Apple for The Wall Street Journal.

Update: Well, my first guess was wrong, but my second was right. I asked Tony Fadell and he confirmed to me it was him, saying it was back in the very earliest days of Sonos, when Sonos was set to debut with a device featuring an obviously iPod-like scroll wheel for input. Jobs wanted to sue (of course). But Fadell, after meeting with the founders, wanted to buy them, and made his case to Jobs, to no avail, several times circa 2003. Fadell said his pitch was basically “Seriously, we are all about music. Customers want this. I want this.” And Jobs’s response was, according to Fadell, “No one wants what they are selling.”

(Here in 2025, there are an awful lot of Apple users who also own an awful lot of Sonos devices who would disagree with Jobs on that.)

 ★ 

‘47 Years Later, the Palisades Disappeared Overnight’

Mike Davidson:

I grew up on Iliff Street, right in the middle of the ashes that up until a few nights ago, was a sunkissed neighborhood known as Pacific Palisades.

It was 1978, and I remember my dad climbing up on our roof with a garden hose. Every couple of hours, he would wet the house down, top-to-bottom, and everything surrounding it. I don’t remember everybody doing this, but my Dad is a Meteorologist, and back then he worked at the SCAQMD, the regional agency charged with studying, regulating, and improving air quality in Los Angeles, Orange, Riverside and San Bernardino Counties. Because of his specific remit and where we lived, he had a deep understanding of the Santa Ana winds and their effect on the Palisades.

When my dad explained what he was doing, he would point northeast to the hills behind us and tell us that if the winds didn’t die down, the fire miles in the distance would come towards our tiny little house and there would be trouble. As a small child, I don’t actually remember being scared about any of this. Every year there was a fire, the smoke was always so far away and so barely visible that it just seemed like anything else in life at the time. And besides, dads are superheroes to their children, so of course there was no danger.

What a remarkable piece of writing this is. Part memoir, part call to action, entirely engaging.

 ★ 

Volume 1 of Jack Smith’s Special Counsel Report Flatly States Trump Would Have Been Convicted in Election Case

Alan Feuer and Charlie Savage, reporting for The New York Times:

Jack Smith, the special counsel who indicted President-elect Donald J. Trump on charges of illegally seeking to cling to power after losing the 2020 election, said in a final report released early Tuesday that the evidence would have been sufficient to convict Mr. Trump in a trial, had his 2024 election victory not made it impossible for the prosecution to continue.

“The department’s view that the Constitution prohibits the continued indictment and prosecution of a president is categorical and does not turn on the gravity of the crimes charged, the strength of the government’s proof or the merits of the prosecution, which the office stands fully behind,” Mr. Smith wrote.

He continued: “Indeed, but for Mr. Trump’s election and imminent return to the presidency, the office assessed that the admissible evidence was sufficient to obtain and sustain a conviction at trial.”

The Times includes a link to the full 174-page first volume of the report. Without having read it yet, I’ll just say this. It should go without saying that Trump’s actions are Trump’s responsibility. Trump is the already convicted felon of far lesser crimes, and he should have been (and perhaps, years from now, will be) the convicted felon of the grave crimes against the nation itself that Jack Smith’s special counsel team investigated and charged him with.

But Joe Biden deserves blame for the fact that Trump wasn’t tried before the 2024 election. I take no pleasure in saying it because I like Biden, a lot, and in most other ways I agreed with his policies and his numerous accomplishments over the last four years. But with regard to Donald Trump, Biden just fucking blew it. It’s that simple. Biden wrongly believed that after the 2020 election, and exacerbated by Trump’s embarrassing refusal to accept defeat and his ham-fisted attempt at a coup-by-morons on January 6, that Trump was finished, politically. Like Nixon after Watergate, but with even deeper ignominy. Biden thought his own election was proof that the MAGA fever had broken, and the American electorate had returned to some sort of pre-Trump “normalcy”. So Biden appointed Merrick Garland, a feckless cowardly fool, as Attorney General, and under Biden and Garland’s direction the Justice Department slow-walked the pursuit of justice against Trump for his crimes, thinking it would be better for the nation — a nation, again, that Biden plainly but wrongly assumed was ready to put Donald Trump in the ash heap of history — not to aggressively prosecute Trump as though time was of the essence, so as to remove any possible appearance that they were pursuing his prosecution for political reasons.

What a grave mistake. I hope it winds up not mattering much in the grand scheme of history, but there’s a pit in my stomach telling me it will. Jack Smith wasn’t appointed Special Prosecutor by Garland until 18 November 2022. Smith was the right man for the job, but he should have been appointed at the very start of the Biden administration in early 2021. That year and a half of abject dithering was the difference between putting Trump on trial and convicting him of the crimes we literally watched him commit on TV, and seeing Trump run out the clock with procedural delays until he had the chance to be reelected, which he was. And now here we are on the cusp of Trump serving a second term in the White House without ever standing trial for his serious crimes against the nation. The urgency was dire, but Biden and Garland acted as though they had all the time in the world, until they realized their mistake far too late.

If Trump 2.0 goes mostly like Trump 1.0 — a daily stream of chaotic talk, but very little chaotic action to speak of, leaving an electorate to quickly tire of his antics and turn against him in the midterms two years hence — the dark mark on Biden’s historical legacy will likely be his stubborn refusal (and/or cognitive inability) to recognize that age had caught up to him, leading him to run for a reelection and drop out only after embarrassing himself in a debate, leaving Democrats no time to hold a proper primary election to choose a popularly-elected candidate for 2024. But if Trump 2.0 is unlike Trump 1.0, and is filled with actions that leave a lasting mark on the nation and the world, history will remember Biden for allowing Trump to get off the hook for obvious crimes against democracy itself.

Biden is like the protagonist in a horror movie who defeats the villain but doesn’t finish him off, congratulates himself, and turns his back on his foe and starts walking off into the sunset. All the while, with the audience screaming, “Finish him off! He’s getting back up! Turn around! Oh god, I can’t watch...”

 ★ 

Uncanceled Units

Speed limit c arcminutes^2 per steradian
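The gag works because arcminutes squared per steradian is a dimensionless ratio, so "c arcmin²/sr" is just the speed of light scaled down. A quick sanity check (my own arithmetic, not part of the original strip) shows it lands right in highway-speed-limit territory:

```python
import math

# Dimensional-analysis check of "speed limit c arcminutes^2 per steradian":
# arcmin^2 / steradian is dimensionless, so the whole expression is a speed.
# 1 steradian = 1 rad^2, and 1 rad = (180 * 60 / pi) arcminutes.

C = 299_792_458.0                      # speed of light, m/s
ARCMIN_PER_RAD = 180.0 * 60.0 / math.pi
ARCMIN2_PER_SR = ARCMIN_PER_RAD ** 2   # arcmin^2 in one steradian

speed_m_s = C / ARCMIN2_PER_SR         # c * (1 arcmin^2 / 1 sr)
print(round(speed_m_s, 1))             # about 25.4 m/s
print(round(speed_m_s * 3600 / 1609.344, 1))  # about 56.7 mph
```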

MBA: Mortgage Applications Increased in Weekly Survey

From the MBA: Mortgage Applications Increase in Latest MBA Weekly Survey
Mortgage applications increased 33.3 percent from one week earlier, according to data from the Mortgage Bankers Association’s (MBA) Weekly Mortgage Applications Survey for the week ending January 10, 2025. Last week’s results included an adjustment for the New Year’s holiday.

The Market Composite Index, a measure of mortgage loan application volume, increased 33.3 percent on a seasonally adjusted basis from one week earlier. On an unadjusted basis, the Index increased 52 percent compared with the previous week. The Refinance Index increased 44 percent from the previous week and was 22 percent higher than the same week one year ago. The seasonally adjusted Purchase Index increased 27 percent from one week earlier. The unadjusted Purchase Index increased 48 percent compared with the previous week and was 2 percent lower than the same week one year ago.

“Bond yields in the U.S. and abroad continued to move higher in response to concerns over a sticky inflation outlook and still too-high budget deficits, which pushed mortgage rates higher for the fifth consecutive week. The 30-year fixed rate is now at 7.09 percent – its highest level since May 2024,” said Joel Kan, MBA’s Vice President and Deputy Chief Economist. “This time of the year is a particularly volatile time for application volumes, so it can be more helpful to focus on the level rather than the percent change. Purchase applications were 2 percent lower, and refinances were 22 percent higher compared to a year ago. Total applications were up by 33.3 percent, the highest level in a month, as both purchase and refinance applications saw large percentage increases over the week.”
...
The average contract interest rate for 30-year fixed-rate mortgages with conforming loan balances ($766,550 or less) increased to 7.09 percent from 6.99 percent, with points decreasing to 0.65 from 0.68 (including the origination fee) for 80 percent loan-to-value ratio (LTV) loans. The effective rate increased from last week.
emphasis added
Mortgage Purchase Index: Click on graph for larger image.

The first graph shows the MBA mortgage purchase index.

According to the MBA, purchase activity is down 2% year-over-year unadjusted. 

Red is a four-week average (blue is weekly).  

Purchase application activity is up about 29% from the lows in late October 2023 and is now 7% above the lowest levels during the housing bust.  
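The red four-week average line on the graph smooths the noisy weekly readings; it is a simple trailing moving average. A minimal sketch, using made-up weekly index values:

```python
# Sketch: the "four-week average" on the purchase-index graph is a trailing
# moving average of the weekly index. Values below are hypothetical.

def four_week_average(weekly: list[float]) -> list[float]:
    """Trailing 4-week moving average; defined from the 4th observation on."""
    return [sum(weekly[i - 3:i + 1]) / 4 for i in range(3, len(weekly))]

weekly_index = [130.0, 128.0, 135.0, 127.0, 161.0]  # hypothetical weekly readings
print(four_week_average(weekly_index))  # [130.0, 137.75]
```

This is why, as Joel Kan notes above, a single holiday-distorted week can swing the weekly line dramatically while the averaged line moves much less.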

Mortgage Refinance Index
The second graph shows the refinance index since 1990.

The refinance index is very low.

Three Cheers for Blue Origin … No Really

Here’s a sort of update from the world of billionairedom. Today Blue Origin, Jeff Bezos’ space company, was set to attempt its first launch of its hulking “New Glenn” rocket. But they’ve now scrubbed that attempt because of some technical issues and they’re going to try again on Thursday. Blue Origin is either 100% owned or near 100% owned by Bezos. It’s unclear whether some very limited equity may have gone to some early employees. But big picture: it’s Jeff Bezos’ company. It’s not part of Amazon or some public company. It’s his.

The company now seems to be Bezos’ main focus and he’s apparently relocated to Florida to give the company his especial attention. While all space technology is of interest to me, normally I wouldn’t be rooting for a new Bezos business venture. I have no particular beef with Bezos. But as we’ve seen repeatedly in recent months and years, what we might call the super-billionaires have way, way too much power. But in this case I’m really hoping this launch succeeds and that Blue Origin makes big strides in general.

I’m doing this post to explain why.

This is because right now the U.S. government and, in a way, the world is insanely dependent on SpaceX, Elon Musk’s space company. As you know, SpaceX got off the ground with a lot of government subsidies and contracts. But now the U.S. government is wildly dependent on it and it’s become a critical private-sector player in the near-earth-orbit delivery business. It might not have been possible without the U.S. government’s start-up money and obviously it’s not like Elon Musk developed anything himself. But what’s important to understand, as far as I’ve been able to understand it, is that the technology is genuinely transformative. I don’t think there’s any one key invention. Transformative in this sense is taking a lot of 21st century technology and getting it to work consistently and economically at scale. In addition to now having a very reliable and cost-competitive product that governments and companies around the world want to use, it’s also allowed Musk to put thousands of his own satellites into orbit. As The New York Times explained in this article back in 2023, more than half the functioning satellites in orbit today are owned and controlled by SpaceX and Elon Musk.

Think about that: more than half the working satellites in space (so not counting the derelict dead ones still floating around) are controlled by this one guy. That’s what Starlink is based on. Starlink plus the space delivery service isn’t just producing an insane amount of wealth for Musk, it’s also producing a kind of power that transcends wealth, though wealth on his scale creates a kind of power far greater than any one person should have.

When you’re a U.S. military contractor, you never have total control of your technology. The U.S. government could in various ways dictate what SpaceX can and can’t do. The same goes for Starlink. And I think in the coming years we’ll need to be thinking about doing that. But in practice those aren’t muscles the U.S. government is used to flexing. There are already various examples of Musk operating in ways that no other defense contractor with a security clearance would be allowed to act. We saw this when Musk was playing footsie with Vladimir Putin over Starlink and Ukraine. I have thought for the last year that we should see Musk’s acquisition of Twitter and heavy role in the presidential election as part of this — making himself too big to touch as it were. We’re seeing an example of this right now in the UK. Musk is really, really disliked in the UK. People hate him — way, way more than in the U.S., where he’s more a 50-50 thing. But as the Times notes here, that hasn’t prevented him from basically creating a governing crisis over a years-old, anti-immigrant-infused grooming scandal.

This is just one example of the way that Musk has become something unique even among super-billionaires: his mix of ultra-wealth combined with key holds over communications and national security-technology has made him function more like a state than perhaps any individual in human history.

Which brings us back to Blue Origin. We need to be trimming back, not expanding, the power of the super billionaires. But as far as I can tell, Bezos is the only other player in any position to compete with Musk in space, certainly the only other American. I followed SpaceX and Blue Origin back when they were both still in the aspirational and testing stages. And my impression over the last couple years was that Musk had basically won that battle. All we’ve seen from Blue Origin are those pay-for-a-seat suborbital flights. But Bezos and Blue Origin have by no means given up. SpaceX clearly got there first, but from my admittedly cursory knowledge of the two companies’ technology there’s no clear reason Blue Origin can’t get there too. Amazon has something called Project Kuiper, which is essentially a competitor to Starlink. For the moment they’re having to get started using other launch systems, including SpaceX’s, to get their satellites in orbit.

On the question of technology, I don’t think SpaceX has any breakout technology that they’ve developed or over which they have sole possession. The triumph of SpaceX is taking something that contemporary technology should make possible and actually getting it to work consistently and economically in real life. And in case anyone is thinking that’s a knock, in the real world that critical step is what separates the winners from the also-rans. That’s a big, big deal. But I don’t get the sense that there’s any obvious reason that Blue Origin can’t accomplish the same thing. Jeff Bezos is one of the very few people in the world who has enough money to keep at it as long as it takes. And it seems like it’s his number one focus. Remember that he retired from being CEO of Amazon a few years ago.

In any case, having two super billionaires who have a dominant position in putting things into space isn’t great. But it’s vastly better than having a single super billionaire who does. And that’s especially the case when the single guy is Elon Musk.

Confirmation Theater and Press Credulity

As the Hegseth hearings unfold, I wanted to give you a view into a small part of the story which, while perhaps not terribly consequential in itself, sheds some additional light on the Trump team’s effort to lock down details about Hegseth’s background as well as general press credulity about the same. This morning’s Axios reports that the Trump transition’s “red line” is that only Armed Services Committee Chair Roger Wicker (R-MS) and Ranking Member Jack Reed (D-RI) should be briefed on Hegseth’s FBI background check, not the rest of the committee. “The Trump transition team is demanding the president-elect’s nominees be treated the same way they insist Joe Biden’s were,” it reads.

The Axios piece doesn’t say so explicitly, or at least it doesn’t in its current version. But it certainly gives the impression that Biden negotiated this restrictive standard and that it is now being used against Democrats. The Daily Beast, for instance, summarized Axios’ reporting by claiming this rule would follow “a precedent which [Trump transition officials] say was set by Biden’s transition after the last election.” (The Axios piece says it’s been revised. But I wasn’t able to tell whether that language has been changed.)

In any case, what we’re seeing here is that the Trump team is trying to push the idea that this is some kind of binding precedent and one that Biden’s team created. Fair is fair after all. But Senate veterans tell me that’s not the case. This isn’t something the Biden people insisted on or negotiated. It’s just standing practice. Why would the Biden folks do that anyway given their generally squeaky clean nominees? If the Biden transition had pushed for something like that I have little doubt we would have heard about it at the time from Republicans. And we didn’t. Again, longtime Senate hands tell me there’s no rule in any case — just default practice and a question that is treated on a case by case basis. If there’s a problem, more people get to see it. Obviously a senator can say, if you want my vote I want to see more.

What we’re seeing here is that the Trump team is trying to rope in a fairness or grievance issue vis a vis Biden to make their case. Because, as we know, in Trump world everything is about purported grievances and unfairness to Trump. Everything gets put through the Trump grievance machine.

I mentioned up above that whatever this “precedent” is probably isn’t terribly consequential in the big picture. As we’ve learned in recent years, and as this new story by Jane Mayer in The New Yorker makes clear, on a few fronts these FBI background checks aren’t quite all they’re cracked up to be or what many civilians, including myself, used to assume. They’re fairly cursory and even more cursory if the people requesting them want to keep them that way.

Breakfast With Pete Hegseth

Jane Mayer, reporting for The New Yorker:

As recently as the spring of 2023, according to an account shared last week with The New Yorker, Hegseth ordered three gin-and-tonics at a weekday breakfast meeting with an acquaintance in Manhattan. “It was an extremely strange experience,” his companion that morning told me. “We met at Fox News in New York for breakfast, and he suggested we go across the street to a bar. It was, like, ten in the morning. Then he ordered two gin-and-tonics at the same time for himself. To be polite, I ordered one, too. But it was so strong I couldn’t drink it, so I ordered coffee. Then he had a third gin-and-tonic. I don’t know how he could pass a security clearance. But they’re trying to create a culture where whistle-blowers are uncomfortable coming forward.”

Until now, most Americans would have agreed that the secretary of defense ought not imbibe like the president of Russia on a state visit to Washington. A new wrinkle for the 2.0 kakistocracy.

 ★ 

While CTO at Pandora, Tom Conrad Had Them Building Their iPhone App Before the iPhone SDK Was Released

Speaking of new (“interim”) Sonos CEO Tom Conrad and Scott Forstall, here’s an interesting anecdote from Tyler Hall’s terrific piece for Motherboard in 2021, “How Pandora Won Its Royalty Battle but Lost the War to Spotify”:

After pushback on only allowing web apps for the iPhone, Steve Jobs announced that native apps would be coming to the iPhone. In the interim, Apple Senior Vice President Scott Forstall invited Tim Westergren and his CTO, Tom Conrad, over to a local Cupertino lunch spot. The trio talked for hours about what Pandora had learned about streaming audio from putting apps on flip phones, like Motorola’s RAZR, for wireless carriers. The meeting ended with a question for Forstall.

“What, if anything, can we do at Pandora to get ready for the next generation of iPhone that includes an app store and native APIs?” asked Conrad. “Forstall said, it wouldn’t be a waste of your time to jailbreak some iPhones and use the kind of back door toolkits that were being distributed by other people to build a native Pandora app while we get our act together at Apple on something more formal.”

So, Conrad, designer Dan Lythcott-Haines, and many others on the team got to work jailbreaking iPhones and building a Pandora iPhone app ahead of the official SDK release. Then, on day one of the App Store launch, Pandora was the first internet radio app available. Nine months later the Pandora app was installed on 21 percent of iPhones.

I first linked to this article back in 2021, when it was published, but it seemed perfect for a re-link now in light of Conrad’s new role at Sonos. The more I learn about Conrad, the more he sounds like the right man for the job there.

(Via Tyler Hall himself, on Bluesky, which you should join if you haven’t already.)

 ★ 

Sonos’s Reboot Continues: Chief Product Officer Maxime Bouvat-Merlin Is Out Too

Sonos interim CEO Tom Conrad, in a company-wide memo obtained by The Verge:

With my stepping in as CEO, the Board, Max, and I have agreed that my background makes the Chief Product Officer role redundant. Therefore, Max’s role is being eliminated and the Product organization will report directly to me. I’ve asked Max to advise me over the next period to ensure a smooth transition and I am grateful that he’s agreed to do that. [...]

I shared this news openly with the Sonos leaders yesterday with the intention that these leaders would share the update as needed with their teams. Unfortunately this news quickly made its way outside the organization. While this is frustrating for all of us, I will not let the possibility of a leak change our ability to communicate openly with one another. So I’m going to keep telling you the truth.

I know this is a lot of change to absorb in two days and I want to thank you for your resilience, continued commitment to Sonos and support of each other during this time.

Starting to sound like Conrad is as much “interim” CEO as Steve Jobs was in 1997.

 ★ 

Thinking About the Confirmations

There are a few things that are critical to understanding the Trump cabinet nominations and how Senate Democrats should approach them. The first and most important is that in the case of every nomination the question is entirely up to Republicans. Republicans have a three-seat majority. They have the vote of the Vice President in a tie. What happens or doesn’t happen is entirely a matter decided within the Republican caucus. It is totally out of Democrats’ control. What follows from that is that everything Democrats do, inside the hearing room or outside, is simply and solely a matter of raising the stakes of decisions Republicans make and raising those stakes for the next election. The aim isn’t for any Democratic senator to try to claw their way through the steel wall of Republican loyalty to Donald Trump. It’s to do everything they can to illustrate that Donald Trump staffs his administration with unqualified and/or dangerous toadies and that Senate Republicans are fine with this because they put loyalty to Trump over loyalty to country.

This all sounds obvious. And it is obvious. But people struggle to see the obvious as obvious. I’m seeing headlines and comments that Democrats failed to change the dynamic or knock any Republicans free. That’s a crazy standard since the dynamic is set. None of this is about whether Hegseth gets confirmed. Republicans control that. It’s about establishing the record Republicans will be running on in 2026 and the stakes for every Senate Republican in a competitive election.

Yutu-2 rover likely immobile on the moon after historic lunar far side mission

The Yutu-2 rover, photographed by the Chang’e-4 lander on the floor of Von Kármán crater, its tracks visible in the regolith of the Moon’s far side.

HELSINKI — China’s Yutu-2 rover, part of the first ever mission to land on the far side of the moon, may have made its final tracks, NASA lunar orbiter images […]

The post Yutu-2 rover likely immobile on the moon after historic lunar far side mission appeared first on SpaceNews.

Loft Orbital raises $170 million to expand space infrastructure service

Loft Orbital has raised $170 million to expand manufacturing facilities and streamline operations with more artificial intelligence.

The post Loft Orbital raises $170 million to expand space infrastructure service appeared first on SpaceNews.

Varda’s second mission launches with U.S. Air Force payload

The company’s W-2 capsule launched aboard SpaceX’s Transporter-12 rideshare mission.

The post Varda’s second mission launches with U.S. Air Force payload appeared first on SpaceNews.

Tomorrow.io NextGen offers high-resolution rain forecasts

Tomorrow.io satellites feed NextGen, a tool that offers global precipitation forecasts with a resolution of 2.5 kilometers.

The post Tomorrow.io NextGen offers high-resolution rain forecasts appeared first on SpaceNews.

Rosotics pivots to focus on orbital transport vehicles

Rosotics seeks to build a propellant depot at Earth-moon Lagrange Point 5.

The post Rosotics pivots to focus on orbital transport vehicles appeared first on SpaceNews.

Defense Department’s new tool to investigate on-orbit anomalies

Cloud-based tool to help military satellite operators determine if space weather caused on-orbit anomalies.

The post Defense Department’s new tool to investigate on-orbit anomalies appeared first on SpaceNews.

Falcon 9 launches American and Japanese commercial lunar landers

F9 launch Firefly ispace

A Falcon 9 on Jan. 15 successfully launched two landers built by American and Japanese companies that are taking different paths to the surface of the moon.

The post Falcon 9 launches American and Japanese commercial lunar landers appeared first on SpaceNews.

SpaceX launches 131 payloads on Transporter-12 rideshare mission

Transporter-12 liftoff

The latest SpaceX rideshare mission Jan. 14 deployed satellites ranging from dozens of imaging satellites to reentry vehicles, tugs and even a “selfie sat.”

The post SpaceX launches 131 payloads on Transporter-12 rideshare mission appeared first on SpaceNews.

Phishing False Alarm

A very security-conscious company was hit with a (presumed) massive state-actor phishing attack with gift cards, and everyone rallied to combat it—until it turned out it was company management sending the gift cards.

Upcoming Speaking Engagements

This is a current list of where and when I am scheduled to speak:

  • I’m speaking on “AI: Trust & Power” at Capricon 45 in Chicago, Illinois, USA, at 11:30 AM on February 7, 2025. I’m also signing books there on Saturday, February 8, starting at 1:45 PM.
  • I’m speaking at Boskone 62 in Boston, Massachusetts, USA, which runs from February 14-16, 2025.
  • I’m speaking at the Rossfest Symposium in Cambridge, UK, on March 25, 2025.

The list is maintained on this page.

Since you arrived, my heart stopped belonging to me

Photo of three women viewed from behind near a lake with trees. One holds a megaphone and wears a colourful embroidered shirt.

‘We share and feel the same pain’: the mothers looking for their children who disappeared in Mexico en route to the US

- by Aeon Video

Watch at Aeon

The civilization survival scale: A biological argument for space settlement

Some space advocates have argued that space settlement is vital to ensure the survival of humanity. Thomas Matula describes a scale for measuring the abilities of civilizations to survive that could be useful for space advocacy and for astrobiology.

Review: Manned and Unmanned Flights to the Moon

There is renewed interest in lunar exploration, including the launch this week of two commercial lunar landers. Jeff Foust reviews a book that provides an overview of the history of lunar exploration, but focuses on many missions that never attempted to go to the Moon.

A Manifesto in Defense of Courtship

I can’t hide it—my romance credentials are a bit rusty.

The last time I went on a first date was, let’s see. . . [Ted checks his calendar]. . . okay, it was April 14, 1990.

Gas was 78 cents per gallon back then. The Berlin Wall was still standing. And “The Humpty Dance” kept playing on the radio (but never for long—because I’d change the station).

And on that chilly April evening, I was sitting (for the first time) with my future wife at a restaurant in Manhattan.


If you want to support my work, please take out a premium subscription (just $6 per month).

Subscribe now


That was a long, long time ago….

So if I possess any high-level game in dating, it would be like those other forgotten games—cribbage, backgammon, Parcheesi—gathering dust in the attic.

But I must have done something right in the romance game. Tara and I hit it off, and we got married 16 months later.

We have continued to go out on romantic dates during the intervening 34 years.

Here we are, Tara and I, back in the day.

And I do have some theoretical expertise in the subject, as well.

I devoted a decade, more or less, to researching a multidisciplinary book on love songs.

This forced me to learn the history of dating, marriage, and sexuality—going back thousands of years.

So I read all the experts on the subject—Stendhal’s Love, Ovid’s Ars Amatoria, Plato’s Symposium, Dante’s La Vita Nuova, Michel Foucault’s The History of Sexuality, Erich Fromm’s The Art of Loving, José Ortega y Gasset’s On Love, Andreas Capellanus’s The Art of Courtly Love, C.S. Lewis’s The Allegory of Love, and lots of others.

I’ve read too many books about love.

I learned stuff on love you wouldn’t even dream of.

I memorized the naughty details of the Sacred Marriage Ritual of ancient Sumer (2300 BC). I kept careful notes on what Socrates learned about love from a clever woman named Diotima—although it didn’t seem to help his marriage. I studied the great seducers, and found that they often play the lute. And…

Well, I could go on and on.

Watch out for lute players!

But all that history is beside the point. That’s because I want to talk about the state of romance today.

It must be in short supply, because I hear lots of complaints from singles. They tell me it’s not easy to find a good partner for a relationship.

I don’t pry. But people want to tell me things. The latest incident was yesterday evening.

Tara was out of town, so I went to a restaurant with only a novel for company—Sally Rooney’s Intermezzo. A young guy took my order, but then he started up this conversation.

WAITER: So what’s the book about?

TED: About young people in relationships. But they always have problems—otherwise it wouldn’t be much of a story.

WAITER: You mean like dating apps and texting and that kind of stuff?

TED: Yes, that’s it. I don’t know much about any of it—I’ve been married for decades. The only thing I swipe is the towel when I’m drying dishes.

WAITER: Let me tell you, it’s really a nightmare. People from your generation have no idea.

TED: I’d think all the apps would make it easier.

WAITER: No, no, no. It just creates so many distractions. You had a much simpler situation, with just a small network of people.

TED: Yes, I met my wife on a blind date—we were set up by a mutual friend.

WAITER: It’s different now. Just too many distractions….

I could tell that he had strong feelings about this subject. I think he wanted to talk about it more, but his boss gestured for him to take care of another table. Otherwise he would have had a lot more to say.

I should add that he is a young, good-looking guy. It’s hard to imagine him having much trouble on the dating scene. But he clearly was.

I’ve heard similar stories from so many other people lately. They hate dating apps, but they don’t know any better way of finding somebody.

And how do you break out of the app cycle? Some people are starting to take desperate measures.

I keep coming back to the paradox. Apps make things easier—so why are relationships getting harder?

As I mull this over, I keep coming back to something I studied while researching the history of love songs. But it’s never mentioned nowadays.

It’s called courtship.

When was the last time you heard somebody use that word?

Maybe you hear it in a movie about romances from the distant past. But you can watch a whole season of The Bachelor or The Real Housewives of Beverly Hills, and courtship won’t make a single appearance.

I bet the spellcheck on my iPhone won’t even recognize it.

Courtship? Did you mean to write Court-Issued Restraining Order?

It seems such a silly and old-fashioned concept. Why bother with courtship when it’s faster to do a hookup?

Hey, I’m no prude. I’m not doing abstinence training here. Nobody’s saying you can’t do what you’ve got to do.

But let’s give courtship its due.

  1. It brought couples together for more than a thousand years. We wouldn’t be here today if it wasn’t for those courting couples.

  2. The rules of courtship embrace ritual and gracious behavior. These “rules of the game” provide guidance and impart structure (as well as add spice) in an otherwise unstable process.

  3. You become better and more attractive yourself by following these rules—because they impose a discipline and aura of courtliness on your own actions.

  4. They also provide a sense of safety for both participants.

  5. The notion behind courtship is that love is stronger when creating a relationship is harder. That’s true in other spheres of life—sports training, musicianship, education, etc. And it’s easy to understand why: We get stronger at anything by avoiding shortcuts and taking on challenges.

  6. Courtship moves step-by-step, and thus provides a chance for ongoing reflection and learning, as well as an easy exit path, before things get too complicated.

  7. But here’s the most important reason for courtship: It fosters an attitude of respect, appreciation, and courtesy between the two people.

And that’s exactly what they will need if they decide to build a lasting relationship. You set the foundation for the future with this respect.


You might notice that I have not mentioned religion or morals here.

I could easily do that—but I think it’s useful to remember that courtship has intrinsic value. So I don’t need to cite sacred texts for validation.

You practice these rules because you benefit from them. When you bypass them, you often hurt yourself—and the other person, too.

Sure, there’s a bit of fantasy involved here. In a courtship you view the other person in an idealized way—up on a pedestal—magnifying their good qualities, and forgetting, perhaps, about their flaws.

But that’s not a bad basis for a solid relationship. You will need to do some of that idealizing on a regular basis if you’re looking for a successful marriage.

There’s a good reason why this kind of idealized courtship still shows up in Hollywood movies. It’s because it creates an intense atmosphere of romance.

By the way, I note that the word courtesy comes from the same root as courtship—deriving from the courts of nobles. When you participate in a courtship, your behavior is elevated, and you are literally acting like a King or Queen.

The troubadours of the late medieval period invented the rules of courtship. And this was the most exciting thing that had happened in Western culture in a thousand years.

It changed everything.

Songs were different after courtship was invented—both more romantic and more realistic. Stories were also different, and so were poems. Even religion changed in the aftermath—the cult of the Virgin Mary in Christendom was a deliberate merging of courtship and spirituality.

It was such an important discovery that people continued to imitate the courtships of brave knights and fine ladies long after court society disappeared.

When the novel was invented, Cervantes had to convince readers that the stories of courtly love were no longer believable. That was a key reason why he wrote Don Quixote.

But readers didn’t want to give up these stories of courtship. Two hundred years after Don Quixote, Jane Austen was still creating new variations on the old formula.

Many contemporary romance novels are still working from this playbook.

Nobody uses a dating app in these books.

The appeal of these tales is obvious.

I think we all crave a little more of this in our relationships.

Photo of tweet

Maybe this is just one more example of apps creating shortcuts, when we might be better off with a longer, more ceremonial process.

We all know that slow food tastes better than fast food. And that’s true in many other pursuits—good things take time. If you care about results, you don’t rush.

So why not try the same in relationships?

We don’t have to slow down to Dante’s pace—nine years elapsed between Dante falling in love with Beatrice and the first time she greeted him. But I note that Dante mentions this fact in the most casual manner, as if the slow pace of his courtship were the most normal thing in the world.

This step-by-step process happened everywhere back then, not just in Europe. Consider this account of courtship among the Omaha tribe, from researcher Alice Fletcher:

A young man of the tribe watches his beloved from afar….

and at the dawn his love-song may be heard echoing over the hills. Sometimes he sings in the evening to let the maiden know of his presence. Girls find ways of learning who are the young men seeking them, and they also in their turn watch these lovers secretly and either flirt a little or entertain a serious regard for the young wooer. All this little drama takes place covertly, no elder is made a confidant.

This sounds like it’s straight out of the Romeo and Juliet playbook.

You can see the same thing in The Tale of Genji, written a thousand years ago in Japan by Lady Murasaki. Men schemed endlessly just to get a glimpse of the beloved.

It’s probably wise to speed things up a bit nowadays. We don’t all have nine years to wait for a greeting from Beatrice.

But swiping through partners on an app, with a couple seconds devoted to each profile, is probably too fast. Let’s give this serious matter the time it deserves.

Am I out of touch? Probably.

Is my dating game rusty? Certainly.

But I don’t think I’m unrealistic. It’s still possible to have courtship in the 21st century.

We don’t need an act of Congress. We don’t need a Supreme Court ruling. We don’t need to change the world.

Like so many problems in society now, our solution is at hand—but only for those who are willing to operate in defiance of the prevailing dysfunctional trends.

That’s the new reality. In the year 2025, the group mindset fails, but the indie mindset prevails.

After all, it only takes two people to start a courtship. What could be more realistic than that?

Two (or more) ways to get samples back from Mars

Last week, NASA announced it would study two different ways to pick up the samples the Perseverance rover is collecting on Mars and return them to Earth. Jeff Foust reports on the two approaches as well as interest by at least one company in an alternative.

Returning humans to the Moon without SLS and NRHO

There is speculation the Trump Administration may attempt to cancel the Space Launch System. Ajay Kothari offers an alternative architecture that could get humans back to the Moon without either SLS or Starship.

Anonymous Attention and Abuse

There is a new paper by Florian Ederer, Paul Goldsmith-Pinkham, and Kyle Jensen. It concerns what happens on EJMR, and abusive rhetoric in online economic discourse. I have not yet read it, but very likely it is of interest. Here is the direct paper link.

The post Anonymous Attention and Abuse appeared first on Marginal REVOLUTION.

Kent Beck: Illuminating the Pattern Language of Modern Life

My art has become an increasingly important part of my life. I’ve never taken it seriously, though. I asked an art-knowledgeable friend to help me explain what I do in terms their community might understand. Here’s the result.

In an era of ironic detachment, Kent Beck's work stands as a testament to earnest exploration and technical virtuosity. Working primarily in acrylic on glass and mirrors, Beck reinterprets Art Deco's techno-optimism for our digital age, while maintaining a profound connection to traditional artistic processes.

Cityscapes

Beck's cityscapes, rendered through an innovative twist on églomisé, strip urban vistas to their essential element: light. These works capture moments of precarious impermanence – each point of illumination representing both presence and potential absence.

Abstracts

His abstracts begin with a singular gestural impulse, expanding into intricate systems of pattern and color. Maps of the artist’s inner world, these pieces operate under an internal logic: a self-imposed rule where identical colors never touch across different patterns, creating a visual harmony that emerges as naturally as a mathematical proof.

This interplay of pattern and color reaches its full potential when executed on mirrors. The patterns create multiple layers of visual information – the painted surface, the viewer's reflection, and the space behind the viewer. The viewer cannot escape becoming part of the image viewed, collaborating, consciously or not, with the artist. Art is transformed from a static picture into a dynamic dance—art as verb, not noun.

Process

Working in glass and mirror demands absolute commitment – there is no "undo", no ability to revise or erase. This irreversibility stands in stark contrast to Beck's background in software development, yet draws upon the same deep understanding of how complex systems evolve from simple beginnings. The results are works that delineate space, compressing the aesthetic whirlwind behind modern software constructs into two dimensions, while the mirror surface adds a third dimension of real-time human interaction.

Beck's work troubles the artificial division between digital and analog realms, highlighting the already-augmented nature of reality and perception. His cityscapes, reduced to points of light against darkness, become meditations on human presence in an increasingly technological world. His abstracts expose patterns of thought as misleadingly precise lines, colors, and shapes. His work suggests that our attempts to separate human from machine, analog from digital, viewer from art, are themselves patterns we impose on a more complex reality.

Conclusion

In both his nightscapes and abstracts, Beck presents art that provides multiple levels of engagement, from immediate visual pleasure to deeper contemplation of how we perceive and organize our modern world. His work stands as a bridge between technological precision and artistic intuition, offering viewers not just a view into, but active participation in the patterns that underlie contemporary existence.

Thinkie: The Real Question

Pattern: Someone asks you for information that makes no sense.

Transformation: Ask, “What’s the real question?”

Fred George suggested Thinkie The Real Question after reading Thinkie Legibility. It’s another way to get to the actual issue.

Here’s the thing, though. Sometimes people don’t want to get to the real issue. They know their fears are absurd, maybe…

Read more

Remembrance of Red State Bailouts Past

Remember this? The picture happens to be from Baltimore, but Texas was the epicenter of the crisis:

I received a lot of good feedback over yesterday’s post about how much California has contributed to the U.S. economy over the years, starting with the way California has in effect subsidized poorer, less productive (and yes, generally Republican-voting) states through the federal tax-and-transfer system. It seems that many generally well-informed readers weren’t aware just how big these subsidies are.

I hope that my post made it clear that I’m fine with this cross-subsidization; it’s part of what being a nation is all about. It was right and appropriate for California to aid red states in the past; what’s wrong and shameful is the push by politicians from those states to deny California aid in its own hour of need.

One thing I didn’t point out, however — to be honest, because it slipped my mind — is that our system also provides a safety net for states facing economic or financial stress.

For example, bank deposits are insured at a national level, so a state that for whatever reason experiences a wave of bank failures is effectively compensated for its losses by taxpayers nationwide. If a state experiences a recession, the impact of that slump is diminished in part by the fact that it pays less in taxes while Social Security and Medicare money continues to flow in, in part by the fact that the state gets extra aid from unemployment insurance and means-tested programs like Medicaid and food stamps.

In short, our system more or less automatically bails out states when they run into financial trouble. And as it happens, the biggest such bailouts I’m aware of involved red states.

First up, Texas in the 1980s, which was ground zero for the savings and loan crisis. I found myself thinking about that crisis in 2012, the peak year of the European debt crisis, as a benchmark. Here’s what I wrote:

Something I’ve been looking at: Texas after the savings and loan crisis of the 1980s.

The cleanup from that crisis cost taxpayers about $125 billion, back when that was real money. As best I can tell, around 60 percent of the losses were in Texas. So that’s around $75 billion in aid — not loans, outright transfer.

Texas GDP was about $300 billion in 1987. So this was equivalent to giving — not lending, not even taking an equity stake — Spain 25 percent of its GDP to bail out its banks.

And in the US it wasn’t even treated as an interstate political issue.

Back then, as you can see, I was thinking about Europe. But put it in the current context. California’s GDP is more than $4 trillion. So giving CA a Texas 1987 scale bailout would mean giving it roughly one trillion dollars. [/Dr. Evil]

Then there’s Florida after the 2000s housing bubble. Florida was at the heart of that bubble, and was hit hard when it burst. But not as hard as it would have been if it hadn’t benefited from the de facto safety net our system provides. At the time I found it useful to compare Florida with Spain — two economies with warm climates, with large numbers of holiday homes built near the sea. Housing prices first soared, then crashed, in both places; even the numbers were broadly similar.

Spain, however, experienced a severe post-bubble recession and went through many years of very high unemployment. Florida was hurt too, but the slump was both much shallower and much shorter. The main reason, I argued (gated; sorry) was the fact that Florida had the Federal safety net. It paid a lot less in federal taxes during its slump, but Social Security and Medicare checks kept coming, while unemployment and food stamp checks actually got bigger.

Oh, and the FDIC spent $9.7 billion compensating depositors at failed Florida banks; Fannie Mae and Freddie Mac, the government-sponsored lending agencies, also absorbed large losses on Florida mortgages, although I haven’t managed to put a number to them.

Putting all of this together, I’m pretty sure that post-bubble Florida received de facto federal aid of at least $50 billion and probably considerably more; scaling that up by the size of the state economy, that would be the equivalent of giving California today at least $250 billion.
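The scaling in these comparisons is simple enough to check directly. Here is a rough sketch in Python: the Texas aid, Texas GDP, and California GDP figures come from the text above, while Florida's late-2000s GDP of roughly $750 billion is my own assumption for the Florida scaling.

```python
# Back-of-envelope scaling: express past federal aid as a share of the
# recipient state's GDP, then apply that same share to California's economy.
def scale_bailout(aid: float, state_gdp: float, target_gdp: float) -> float:
    """Aid as a fraction of state GDP, scaled to the target economy."""
    return aid / state_gdp * target_gdp

CA_GDP = 4_000e9  # "more than $4 trillion," per the text

# Texas, 1987: ~$75B in S&L cleanup aid against ~$300B of GDP (25%)
texas_equiv = scale_bailout(75e9, 300e9, CA_GDP)

# Florida, post-bubble: ~$50B in de facto aid; the ~$750B GDP is assumed
florida_equiv = scale_bailout(50e9, 750e9, CA_GDP)

print(f"Texas-1987-scale bailout for CA today: ${texas_equiv / 1e9:,.0f}B")
print(f"Florida-scale aid for CA today: ${florida_equiv / 1e9:,.0f}B")
```

Under those assumptions the arithmetic reproduces the rough numbers in the text: about $1 trillion for a Texas-scale bailout, and a bit over $250 billion for a Florida-scale one.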

Now, providing relief for states experiencing natural disasters isn’t a legal obligation like compensating insured depositors, nor is it an automatic mechanism like the way a slumping state pays lower taxes while receiving increased benefits. But the principle is the same, and it would be a break with both traditional practice and fundamental American values to deny aid to California because it voted Democratic, or make that aid contingent on accepting G.O.P. policy demands.

All indications, however, are that Republicans intend to exploit the tragedy in Los Angeles, and in general turn the federal government into an extortion racket. Let’s not pretend otherwise.

MUSICAL CODA

Most readers seemed to like Molly Tuttle and Golden Highway, but a few asked for more of a Los Angeles sound. Will Linda Ronstadt do?

Should the U.S. recognize Somaliland?

I do not myself have a position on this issue, but I found this analysis by Ken Opalo interesting:

The main argument below is that while the people of Somaliland deserve and have a strong case for international recognition, such a development at this time would very likely take away the very incentives that have set them apart from the rest of Somalia over the last 33 years.

To be blunt, achieving full sovereignty with de jure international recognition at this time would do little beyond incentivizing elite-level pursuit of sovereign rents at the expense of continued political and economic development. What has made Somaliland work is that its elites principally derive their legitimacy from their people, and not the international system. Stated differently, full sovereignty runs the risk of separating both the Somaliland state and ruling elites from the productive forces of society; which in turn would free politicians (and policymakers) from having to think of their people as the ultimate drivers of their overall economic wellbeing. Just like in the rest of the Continent, the resulting separation of “suspended elites” from the socio-economic foundations of Somaliland society and inevitable policy extraversion would be catastrophic for Somalilanders.

The last thing the Horn needs is another Djibouti — a country whose low-ambition ruling elites are content with hawking their geostrategic location at throwaway prices while doing precious little to advance their citizens’ material well-being (Djibouti’s poverty rate is a staggering 70%).

There is much more at the link.

The post Should the U.S. recognize Somaliland? appeared first on Marginal REVOLUTION.

There was a straight shot from Earth to the Moon and Mars last night

I almost missed it. Amid a bout of prime-time doomscrolling, a social media post reminded me there was something worth seeing in the sky. Mars disappeared behind the full Moon for a little more than an hour Monday night, an event visible across most of North America and parts of Africa.

So I grabbed my camera, ran outside, and looked up just as Mars was supposed to emerge from the Moon's curved horizon. Seen with the naked eye, the Moon's brightness far outshined Mars, casting soft shadows on a cold winter evening in East Texas.

Through binoculars, the red planet appeared just above several large, partially shadowed craters at the edge of the Moon's curved limb. I quickly snapped dozens of photos with my handheld Canon 80D fitted with a 600 mm lens. Within a few minutes, Mars rose farther above the Moon's horizon. Because the Moon is so much closer to Earth, its orbital motion carries it across the sky far faster than Mars appears to move in its orbit around the Sun.
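The roughly one-hour duration of the occultation is easy to sanity-check from the Moon's motion alone. A minimal sketch, assuming a mean sidereal month of 27.32 days and a lunar angular diameter of about 0.52 degrees, and ignoring Mars's own apparent motion and the observer's location on Earth:

```python
# How long should the Moon take to pass in front of a nearly stationary
# point like Mars? Use its mean sidereal motion and angular diameter.
SIDEREAL_MONTH_DAYS = 27.32   # mean sidereal month
MOON_DIAMETER_DEG = 0.52      # approximate apparent diameter of the Moon

# Mean eastward motion of the Moon against the background stars
moon_deg_per_hour = 360.0 / (SIDEREAL_MONTH_DAYS * 24)  # ~0.55 deg/hour

# Time to cross a central (diameter-length) chord; off-center chords are shorter
crossing_hours = MOON_DIAMETER_DEG / moon_deg_per_hour

print(f"Moon's mean motion: {moon_deg_per_hour:.2f} deg/hour")
print(f"Central crossing time: {crossing_hours:.1f} hours")
```

That works out to about an hour, consistent with the event described above; the exact duration depends on the chord Mars traced behind the disk and on the observer's position.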

Read full article

Comments

SpaceX is superb at reusing boosters, but how about building upper stages?

On any given day, SpaceX is probably launching a Falcon 9 rocket, rolling one out to the launch pad or bringing one back into port. With three active Falcon 9 launch pads and an increasing cadence at the Starbase facility in Texas, SpaceX's teams are often doing all three.

The company achieved another milestone Friday with the 25th successful launch and landing of a single Falcon 9 booster. This rocket, designated B1067, launched a batch of 21 Starlink Internet satellites from Cape Canaveral Space Force Station, Florida.

The rocket's nine kerosene-fueled Merlin 1D engines powered the 21 Starlink satellites into space, then separated from the Falcon 9's upper stage, which accelerated the payload stack into orbit. The 15-story-tall booster returned to a vertical propulsive landing on one of SpaceX's offshore drone ships in the Atlantic Ocean a few hundred miles downrange from Cape Canaveral.

Read full article

Comments

American, Japanese robotic landers share rocket launch to the Moon

Firefly Aerospace’s Blue Ghost lunar lander pictured atop a bespoke payload canister, which encased ispace’s Resilience lunar lander prior to encapsulation inside SpaceX’s Falcon 9 payload fairings. Image: SpaceX

For the first time in lunar exploration, two robotic landers from two different nations launched to the Moon on one rocket.

But despite Texas-based Firefly Aerospace and Tokyo-based ispace sharing one SpaceX Falcon 9 rocket, the two missions are taking very different paths and timelines to reach the lunar surface.

Liftoff from Launch Complex 39A at NASA’s Kennedy Space Center happened Jan. 15 at 1:11 a.m. EST (0611 UTC). The flight was the 100th orbital launch for SpaceX from the historic pad formerly used by Apollo and the Space Shuttle.

Minutes after deployment, Firefly confirmed acquisition of signal from its lander, officially allowing its 45-day trek to the Moon to proceed.

Launch weather officers at the 45th Weather Squadron put the odds of favorable conditions for liftoff at 90 percent, stating that winds could be an issue at launch time.

“Rain showers and overcast conditions will clear the Space Coast by early this afternoon. Wind speeds will decrease throughout the day today,” meteorologists wrote. “By early Wednesday morning and the primary launch window, winds will be at 15-20mph with occasional gusts to 25mph. This will cause a small chance for liftoff winds and a Cumulus Cloud Rule violation.”

SpaceX used the Falcon 9 first stage booster designated B1085 on this mission, which launched for a fifth time. Its previous launches were Crew-9, GPS 3 SV07, Starlink 10-5 and Starlink 6-77.

Nearly 8.5 minutes into the flight, B1085 landed on the droneship ‘Just Read the Instructions,’ marking the 107th landing for JRTI and the 398th booster landing to date.

The SpaceX design of the mission patch for the flight of Firefly Aerospace’s Blue Ghost lunar lander and ispace’s Resilience lunar lander. Graphic: SpaceX

‘Ghost Riders in the Sky’

The Wednesday morning launch marked the first Moon-bound mission for Firefly Aerospace. Its Blue Ghost lunar lander was conceived following the company’s selection as part of NASA’s Commercial Lunar Payload Services (CLPS) Program.

The objective of CLPS is to get NASA science to the surface of the Moon without the agency having to build the landers or procure launches. NASA has multiple contracts with a variety of CLPS providers, with Astrobotic’s Peregrine Mission 1 and Intuitive Machines’ IM-1 flights occurring in early 2024.

Blue Ghost has a dry mass of 469 kg (1,034 lbs) and weighs roughly 1,500 kg (3,300 lbs) when fueled. It uses a combination of MMH hypergolic propellant and MON-3 oxidizer to power the main engine and thrusters during its journey.

It’s designed to carry ten NASA science payloads to the surface of the Moon, which so far is the most manifested on a single lander as part of CLPS.

Firefly Aerospace’s Blue Ghost lunar lander as seen inside a clean room in its fully integrated form. Image: Firefly Aerospace

Joel Kearns, the Deputy Associate Administrator for Exploration within NASA’s Science Mission Directorate, said that once the 10 instruments proved small enough to fly on one lander, the agency looked for a company that could execute all the science operations over 14 days (one lunar daylight period).

“Firefly and several other bidders took up that challenge. They’ve come up with a really credible mission plan to conduct all the experiments we want on our instruments,” Kearns said.

In a prelaunch interview with Spaceflight Now, Brigette Oakes, the vice president of Engineering for Firefly, said the company incorporated learnings from previous lunar missions.

“We really also took a lot of lessons learned from previous missions. I mean, we did a full, thorough review of every lunar mission that went up, whether it was commercial or NASA and took a lot of lessons learned from that and then essentially just kind of fine tuned and adapted for Firefly’s model with the additional product lines and then took the best of what previous companies have done before us.”

Firefly also took learnings and hardware from its Alpha rocket and folded those into Blue Ghost as well.

“There’s a lot of great wisdom and experience and lessons learned at this company. We have rockets and satellites at our company. So, there’s a lot of commonality between the two different parts of our company and there’s a lot of lessons learned that get shared,” said Firefly CEO Jason Kim.

“As we go to cadence on our Alpha rocket, a lot of those lessons learned, even the reaction control propulsion, that’s stuff that’s lessons learned for our Blue Ghost lander because we have ACS and RCS thrusters on our Blue Ghost lander that have heritage from the Alpha rocket. So, there’s a lot of crosstalk within our company. So that really helps programs, like Blue Ghost have confidence.”

Firefly Aerospace’s Alpha FLTA005 rocket stands at Space Launch Complex 2 (SLC-2) in support of the “Noise of Summer” mission. Image: Firefly Aerospace / Sean Parker

As Firefly goes in for its first landing attempt, set to take place on March 2, Kim said one of the key tools on this lander is a quartet of cup-shaped ends on the landing legs.

“Those landing pads are designed carefully with crumple zones,” he said. “If you think of honeycomb and how crunchy it is, it’s got that built into the actual structure. And so, when it lands, it’s going to – kind of like your car when you get into an accident – it crumples deliberately. That’s what that design entails.”

The mission, called ‘Ghost Riders in the Sky,’ will take slightly longer to reach the surface of the Moon, compared to the last CLPS mission from Houston-based Intuitive Machines. The IM-1 flight took about seven days from liftoff to landing, while the Blue Ghost lander is taking roughly 45 days to make its journey.

Once on the surface, it will operate for about two weeks with instruments including a sample collection tool called the Lunar PlanetVac (LPV) from Honeybee Robotics; a navigational demonstration called the Lunar GNSS Receiver Experiment (LuGRE) from the Italian Space Agency and NASA Goddard Space Flight Center; and the Regolith Adherence Characterization (RAC) from Aegis Aerospace, which will study how lunar regolith sticks to a variety of materials.

The lander is also designed to survive for a few hours in the lunar night to capture sunset and other data in lunar darkness.

‘Never Quit the Lunar Quest’

Beneath the Blue Ghost lunar lander, inside a specially designed payload canister, was ispace’s lander, called Resilience. This was the second time the Japan-based arm of the company has launched a lander to the Moon.

Its first mission, Hakuto-R Mission 1 (M1), launched as a dedicated flight on a Falcon 9 in December 2022 and made a failed landing attempt in April 2023.

In a prelaunch interview with Spaceflight Now, former NASA Astronaut and current CEO of ispace-US, Ron Garan, said it was a software glitch that prevented the first landing. He said the radar altimeter saw a big jump in altitude as they approached the crater they were aiming for, which caused the lander to misinterpret where it was in the mission profile.

It then made what it thought was a soft landing, but was actually about 5,000 meters above the bottom of the crater and hovered there until it ran out of fuel and crashed.

“We’ve obviously fixed all that software, we’re not landing in the bottom of a deep crater this time and so, our confidence level is a lot higher on this one,” Garan said.

Japan-based ispace’s Resilience lunar lander, pictured in a clean room prior to the launch of the Hakuto-R Mission 2 flight. Image: ispace

For Hakuto-R Mission 2, with the mission name ‘Never Quit the Lunar Quest,’ the Resilience lander will target a touchdown in a region called Mare Frigoris – the ‘Sea of Cold’ – which lies in the northern part of the Moon.

The mission will take considerably longer to reach the Moon than Firefly’s Blue Ghost. While Firefly’s lander will be dropped off in a highly elliptical Earth orbit and take 25 days for a phased orbital approach before performing a translunar injection burn, Resilience will take a slower path to the Moon using the upper stage of the Falcon 9 rocket to put it on a path for a low-energy transfer to the Moon.

Essentially, it will do a flyby of the Moon, go out about a million miles into deep space and then synch up with the Moon again for its landing.

“What the low-energy transfer allows is us to trade fuel for payload capacity margin,” Garan explained. “It just leads to more capacity for us to bring to the lunar surface.”

The lander carries several science instruments, including a food production experiment and one designed to demonstrate electrolysis.

“The electrolysis is really exciting because of the implications. If we’re able to really do electrolysis on the Moon, then we’re able to produce rocket fuel on the Moon,” Garan said.

The Tenacity micro rover will fly alongside ispace’s Resilience lander during its journey to the Moon. Image: ispace

The mission will also take a small rover, called Tenacity, which will be deployed to operate on its own after landing. It features an HD camera that will be used to capture, among other things, imagery of an art installation called the ‘Moon House,’ which is a replica of a Swedish home that will be placed on the surface.

Garan said the rover comes from the European division of ispace.

“The rover itself is really critical to the future of our company. That the rover is efficient and the data that’s going to come off the rover is going to be really valuable to us as we continue to hone our design on the surface mobility aspect of the business,” Garan said. “And so, that’s really exciting too.”

Both the rover and the lander will operate on the surface of the Moon for about two weeks, until the Moon slips into lunar night. Garan said ispace is looking at a variety of methods for eventually surviving the night, from orbiting solar concepts to nuclear options and beyond.

“To start a cislunar economy, you have to be able to survive the night. There’s millions and millions and millions of dollars that are put into these missions and if they only operate for two weeks, that’s not a very good return on investment,” Garan said. “So we want to be able to do surface operations for months or years at a time and in order to do that, you have to be able to survive the night.”

Wednesday: CPI, NY Fed Mfg, Beige Book

Mortgage Rates Note: Mortgage rates are from MortgageNewsDaily.com and are for top tier scenarios.

Wednesday:
• At 7:00 AM ET, The Mortgage Bankers Association (MBA) will release the results for the mortgage purchase applications index.

• At 8:30 AM, The Consumer Price Index for December from the BLS. The consensus is for a 0.3% increase in CPI and a 0.2% increase in core CPI.  The consensus is for CPI to be up 2.9% year-over-year and core CPI to be up 3.3% YoY.

• At 8:30 AM, The New York Fed Empire State manufacturing survey for January. The consensus is for a reading of -2.0, down from 0.2.

• At 2:00 PM, the Federal Reserve Beige Book, an informal review by the Federal Reserve Banks of current economic conditions in their Districts.

Tuesday 14 January 1661/62

All the morning at home, Mr. Berkenshaw by appointment yesterday coming to me, and begun composition of musique, and he being gone I to settle my papers and things in my chamber, and so after dinner in the afternoon to the office, and thence to my chamber about several businesses of the office and my own, and then to supper and to bed. This day my brave vellum covers to keep pictures in, come in, which pleases me very much.

Read the annotations

Planet Population around Orange Dwarfs

Last Friday’s post on K-dwarfs as home to what researchers have taken to calling ‘superhabitable’ worlds has caught the eye of Dave Moore, a long-time Centauri Dreams correspondent and author. Readers will recall his deep dives into habitability concepts in such essays as The “Habitability” of Worlds and Super Earths/Hycean Worlds, not to mention his work on SETI (see If Loud Aliens Explain Human Earliness, Quiet Aliens Are Also Rare). Dave sent this in as a comment but I asked him to post it at the top because it so directly addresses the topic of habitability prospects around K-dwarfs, based on a quick survey of known planetary systems. It’s a back of the envelope overview, but one that implies habitable planets around stars like these may be more difficult to find than we think.

by Dave Moore

To see whether K dwarfs make good targets for habitable planets, I decided to look into the prevalence and types of planets around them, and I got carried away, examining the specs for 500 systems of dwarfs between 0.6 and 0.88 solar masses.

Some points:

i) This was a quick and dirty survey.

ii) Our sampling of planets is horribly skewed towards the massive and the close-in, but that being said, we can tell if certain types of planets are not in a system. For instance, Jupiter- and Neptune-sized planets at approximately 1 au show up, so if a system doesn’t show them after a thorough examination, it won’t have them.

iii) I had trouble finding a planet list that was configurable to my needs. I finally settled on the Exoplanets Data Explorer, configured in reverse order of stellar mass. This list is not as comprehensive as the Extrasolar Planets Encyclopaedia.

iv) I concocted a rough table of the inner and outer HZ for the various classes of K dwarfs. Their HZs vary considerably. A K8 star’s HZ is between 0.26 au and 0.38 au while a K0’s HZ is between 0.72 au and 1.04 au. This means that you can have two planets orbiting at the same distance around a star and one I will classify as outside the HZ and the other inside the HZ.

v) Planets below 9 Earth mass I classified as Super-Earth/Sub-Neptune. Planets between 9 Earth masses and 30 are classified as Neptunes. Planets over that size are classified as Jupiters.
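For intuition, the HZ ranges in (iv) roughly track the inverse-square falloff of stellar flux: scale the Sun's HZ bounds by the square root of the star's luminosity. A minimal sketch; the luminosities and solar HZ bounds below are my own illustrative assumptions, so the output only approximates the table's numbers:

```python
import math

# Illustrative assumptions (not from the survey): rough luminosities in
# solar units for two K-dwarf subtypes, and solar HZ bounds in au.
LUMINOSITY = {"K0": 0.55, "K8": 0.10}
SUN_HZ_AU = (0.95, 1.37)

def habitable_zone(l_star):
    """Scale the solar HZ by sqrt(L): flux falls off as 1/d**2, so the
    distance receiving a given flux scales as sqrt(L)."""
    f = math.sqrt(l_star)
    return (SUN_HZ_AU[0] * f, SUN_HZ_AU[1] * f)

for spectral_type, lum in LUMINOSITY.items():
    inner, outer = habitable_zone(lum)
    print(f"{spectral_type}: {inner:.2f}-{outer:.2f} au")
```

With these assumed numbers, K0 comes out near 0.70 to 1.02 au and K8 near 0.30 to 0.43 au, the same ballpark as the quoted 0.72 to 1.04 au and 0.26 to 0.38 au; real boundary models also fold in the star's temperature, which shifts the edges.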

Image: An array of planets that could support life are shown in this artist’s impression. How many such worlds orbit K-dwarf stars, and are any of them likely to be ‘superhabitable’? Credit: NASA, ESA and G. Bacon (STScI).

What did I find:

By far the most common type are hot Super-Earths/Sub-Neptunes (SE/SNs). These are planets between 3 EM (Earth mass) and 6 EM, and the consistency of their sizes is striking. They are mostly in close (sub-10-day) orbits. There also appears to be a subtype of sub-2 EM planets in very tight orbits (some quoted in hours); given that some of these were in multi-planet systems of SE/SNs, I would say they are SE/SNs that have been evaporated down to their cores.

I also found 7 in the HZ and 2 outside the HZ.

I found 52 hot Jupiters and 43 of what I classified as elliptical-orbit Jupiters: Jupiter-sized planets in elliptical orbits under 3 au.

There were also 10 Jupiter-class planets in circular orbits under 3 au, and three outside that limit in what could be thought of as rough analogs of our system.

There were also 46 hot Neptunes and 14 in circular orbits further out, only one outside the habitable zone.

Trends:

At the lower mass end of the scale, K dwarf systems start off looking very much like M dwarfs except that everything, even those in multi-planet systems, is inside the habitable zone.

As you work your way up the mass scale, there is a slight increase in the average mass of the SE/SNs with 7-8 EM planets becoming more prevalent. More and more Jupiters appear, and Neptune-sized planets appear and become much more frequent. Also, you get the occasional monster system of tightly packed Jupiters and Neptunes like 55 Cancri.

An interesting development begins at about the mid mass range. You start getting SE/SNs in nice circular longer period orbits but still inside the HZ (28 in 20-100d orbits.)

Conclusions:

If we look at the TRAPPIST-1 system around an M-dwarf, its high percentage of volatiles (20% water/ice) implies that there is a lot of migration in from the outer system. If a planet has migrated in from outside the snow line, then there’s a good chance that even if it’s in the habitable zone, it will be a deep ocean planet.

Signs of migration are not hard to find. Turning back to the K dwarfs: of the Jupiters, only three show little sign of migration (the analogs of our system). Ten migrated in smoothly but sit at a distance likely to have disrupted a habitable planet. Forty-three are in elliptical orbits, which are considered a sign of planet-planet scattering.

Hot Jupiters can be accounted for by either extreme scattering or migration. As to inward migration, Martin Fogg did a series of papers showing that as Jupiter mass planets march inwards they scatter protoplanets, but these can reform behind the giant, and so Earth-like planets may occur outside of the hot Jupiter.

Neptunes in longer period circular orbits and the longer period SE/SNs all point to migration. These last groups are intriguing as they point to a stable system with the possibility of smaller planets further out. I would include the 7 planets in the habitable zone in this group. But if these planets all migrated inwards they may well be ocean planets.

K dwarfs have an interesting variety of systems, so they’d be useful to study, but I don’t see them as the major source of Earth analogs—at least not until we learn more.

Links 1/14/25

Links for you. Science:

Adaptation in the Alleyways: Candidate genes under potential selection in urban coyotes
Inferring the demographic history of aye-ayes (Daubentonia madagascariensis) from high-quality, whole-genome, population-level data
Not the “Chinese flu” label thing again…please
System to auto-detect new variants will inform better response to future infectious disease outbreaks (paper here)
Canadian teen with bird flu was on life support, new report reveals
COVID 5 years later: Learning from a pandemic many are forgetting

Other:

Poke The Bear: Donald Trump is a lame duck (not a bear)—an unusually weak one, in some ways—and he knows it.
I’m the Governor of Hawaii. I’ve Seen What Vaccine Skepticism Can Do.
Mark Zuckerberg lies about content moderation to Joe Rogan’s face: The head of Meta cracked under Republican pressure.
Did DOJ Officials Try to Sway the 2020 Election for Trump? A new report by the Inspector General found that three senior officials improperly shared details with the media about plans to collect data on Covid-19 deaths in nursing homes in states with Democratic governors.
Winning coalitions aren’t always governing coalitions
Heart of Zuckness: New slur pack unlocked on Facechan.
President Dragon wants more gold
Opening the DNC’s Black Box: Why we’re publishing a previously undisclosed list of all 448 members of the Democratic National Committee
The real reason that Trump won
Jesus H. Christ on an H-1B Visa, This Immigration Bill Is Unfortunately Moving Ahead. And Ken Paxton may soon be able to sue the federal government!
Texas’ War on Drug Users: A mass overdose event in Austin reveals the state’s backward approach to the ongoing crisis spurred by fentanyl and other super-potent substances.
The Coming Assault on Birthright Citizenship: The Constitution is absolutely clear on this point, but will that matter?
The Simple Truth About Trump’s Felony Sentencing. It was a rueful moment for many reasons, and of course the president-elect wallowed in self-pity.
9 predictions for Trump’s second term
RFK Jr. faces fresh scrutiny over alleged ties to deadly measles outbreak
State to probe why Pacific Palisades reservoir was offline, empty when firestorm exploded
Over 17,000 doctors warn Senate: RFK Jr. is ‘actively dangerous’
Massachusetts Road Signs
Congressional Report Accuses Jordan, Musk Of Weaponizing Gov’t To Silence Critics
I Am a Passionate Mid-Level University Administrator, and I’m Gonna Administrate the Shit Out of This Place
Did power lines help start the L.A. fires? What we know.
L.A. fire chief meets with mayor after saying the city failed her agency
Apple CEO Pay Rises 18%; Company Opposes Anti-Diversity Measure
We’ve Never Been Here Before: The Zero-Accountability Presidency
Their houses burned down. Now, they are fighting for the few homes left on the market
Kim Jong Un Just Banned Hot Dogs in North Korea

Catching memory leaks with your test suite

Resource leaks are an unpleasant type of bug. Little by little your program uses more memory, or more file descriptors, or some other limited resource. Everything seems fine—until you run out, and now your program is dead.

In many cases you can catch these sorts of bugs in advance by tweaking your test suite. Or, after you’ve discovered such a bug, you can use your test suite to identify what is causing it. In this article we’ll cover:

  • An example of a memory leak.
  • When your test suite may be a good way to identify the causes of leaks.
  • How to catch leaks using pytest.
  • Other types of leaks.
Read more...

Two Bank Failures in 2024

There were two bank failures in 2024. The median number of failures since the FDIC was established in 1933 is 7, so two failures in 2024 was below the median.

There were five bank failures in 2023; however, three of the failures were larger banks: First Republic Bank, San Francisco, CA; Signature Bank, New York, NY; and Silicon Valley Bank, Santa Clara, CA.

The first graph shows the number of bank failures per year since the FDIC was founded in 1933.

FDIC Bank Failures Click on graph for larger image.

Typically about 7 banks fail per year.

Note: There were a large number of failures in the '80s and early '90s. Many of these failures were related to loose lending, especially for commercial real estate.  Also, a large number of the failures in the '80s and '90s were in Texas, which had loose regulation.

Even though there were more failures in the '80s and early '90s than during the financial crisis, the financial crisis was much worse (larger banks failed and were bailed out).

Pre-FDIC Bank Failures The second graph includes pre-FDIC failures. In a typical year - before the Depression - 500 banks would fail and the depositors would lose a large portion of their savings.

Then, during the Depression, thousands of banks failed. Note that the S&L crisis and recent financial crisis look small on this graph.

LA musicians and the fires

A huge number of Los Angeles-based musicians have lost their homes in the Palisades and Eaton fires. Among them are the oboist Marion Kuszyk, the trumpeter Christopher Still, and the violinist Aroussiak Baltaian, all members of the LA Philharmonic. A fundraiser for Baltaian has been organized here. Four members of the UCLA faculty – Suzy Hertzberg, Ray Ingersoll, Jens Lindeman, and Peter Golub — have also lost their homes and studios. An ever-expanding list of musicians and composers affected by the fires, among them the noted experimental composer and saxophonist Steve Lehman, can be found on this page; fundraising links are available in many cases. The Trade School, a community arts center that opened this fall and survived the Eaton Fire, has launched an Altadena Mutual Aid page.

Comet ATLAS is really bright now, but also really close to the Sun.


Some game theory of Greenland

It is commonly assumed that the U.S. “acquiring” Greenland, whatever that might mean, will result in greater U.S. control of the territory.  Along some dimensions that is likely.  But it is worth pondering the equilibrium here more seriously.

I observe, in many locations around the world, that indigenous groups end up with far more bargaining power than their initial material resources might suggest. For instance, in the United States Native Americans often (not always) can exercise true sovereignty.  The AARP cannot (yet?) say the same.  In Mexico, indigenous groups have blocked many an infrastructure project.

One reason for these powers is that, feeling outmatched, the indigenous groups cultivate a temperament of “orneriness” and “being difficult.”  Some of that may be a deliberate strategic stance, some of it may be heritage from having been treated badly in the past and still lacking trust, and some of it may, over time, be acquired culture as the strategic stance gets baked into norms and behavior patterns.

Often, in these equilibria, the more nominal power you have over the indigenous group, the more orneriness they will have to cultivate.  If you only want a few major concessions, sometimes you can get those better as an outsider.  A simple analogy is that sometimes a teenager will do more to obey a grandparent than a parent.  Fewer issues of control are at stake, and so more concessions are possible, without fear of losing broader autonomy.

So a greater American stake in Greenland, however that comes about, may in some regards end up being counterproductive.  And these factors will become more relevant as more resource and revenue control issues come to the table.  For some issues it may be more useful having Denmark available as “the baddie.”

It is worth thinking through these questions in greater detail.

The post Some game theory of Greenland appeared first on Marginal REVOLUTION.

       

Comments

Related Stories

 

Tuesday assorted links

1. Stephen Miran makes the case for twenty percent tariffs (WSJ).  You can run this one through o1 pro yourself.

2. The history of export controls on IP.

3. Jacob Trefethen, on reforming science funding.

4. In ex-Soviet States, the mental health of the young seems fine.

5. New Ashlee Vance sci-tech media venture.  Site here.

6. How igloos keep people warm.

7. Might Mark Carney replace Justin Trudeau?

The post Tuesday assorted links appeared first on Marginal REVOLUTION.

       

Comments

 

SpaceX launches 131 spacecraft on Transporter-12 Falcon 9 rideshare mission from California

A Falcon 9 rocket roars away from Vandenberg Space Force Base on Jan. 14, 2025, on the Transporter-12 mission. Image: SpaceX

SpaceX launched 131 payloads Tuesday on the company’s 12th smallsat rideshare mission to date.

The Transporter-12 mission flew onboard a Falcon 9 rocket lifting off from pad 4E at Vandenberg Space Force Base at 11:09 a.m. PST (2:09 p.m. EST, 1909 UTC).

The Falcon 9 booster used on Tuesday’s flight, tail number B1088 in the SpaceX fleet, was launching for a second time. It previously launched the NROL-126 mission, which was a combination of 20 Starlink V2 Mini satellites and an undisclosed number of Starshield satellites for the National Reconnaissance Office.

About 7.5 minutes after liftoff, B1088 touched down at Landing Zone 4, located near the launch pad. It marked the 23rd booster landing at LZ-4 and the 397th booster landing to date.

Exolaunch technicians integrate a payload for the Germany-based research university Technische Universität Berlin (TU Berlin), one of dozens of customers flying on SpaceX’s Transporter-12 smallsat rideshare mission. Image: Exolaunch

The dozens of payloads came from a myriad of customers around the world, from research institutes and aerospace companies to other governments’ space agencies, like the United Arab Emirates’ Mohammed Bin Rashid Space Centre.

The MBRSC’s MBZ-SAT, an Earth-observation satellite, is named after the country’s president, H.H. Sheikh Mohamed Bin Zayed Al Nahyan. The agency said the satellite is designed to process and transmit images within two hours to “provide insights for applications such as environmental monitoring, disaster relief, and infrastructure management, enabling decision-makers to act swiftly and effectively.”

Other Earth-observing satellites include Planet Labs PBC’s high-resolution Pelican-2 satellite along with 36 of its SuperDoves.

“This Pelican satellite is designed to provide up to 40 cm class resolution imagery across 6 multispectral bands optimized for cross-sensor analysis,” Planet wrote in a pre-launch statement. “Additionally, Planet has collaborated with NVIDIA to equip Pelican-2 with the NVIDIA Jetson platform to power on-orbit computing—with the aim of vastly reducing the time between data capture and value for customers.”

A rendering of the Pelican-2 satellite. Graphic: Planet

Also hitching a ride to space are Spire Global’s two Low Earth Multi-Use Receiver (LEMUR) 3 satellites along with four other satellites. The LEMUR 3 CubeSats are designed to help improve weather forecasting, maritime monitoring and augment Internet-of-Things connectivity, according to the company.

“Two LEMUR 3 satellites, developed in collaboration with Myriota, will expand global IoT coverage with cutting-edge direct-to-orbit communications, enabling Myriota’s IoT solutions to operate seamlessly and more effectively,” Spire said in a pre-launch statement. “This network will enhance connectivity for critical sectors such as agriculture, defense, and logistics across regions like the US, Europe, and Latin America, promoting sustainability and efficient resource management.”

Spire’s satellites are among the 35 satellites being flown by satellite deployment and hosting company, Exolaunch.

“Transporter-11 was a landmark mission for us, and as we look ahead to Transporter-12, we’re excited to keep the momentum going,” said Robert Sproles, Exolaunch CEO, in a pre-launch statement. “We deeply appreciate the trust our customers place in us and extend our thanks to SpaceX for their outstanding support.

“Our long-standing partnership with SpaceX has been a cornerstone of Exolaunch’s growth, and it’s an honor to be part of every Transporter mission.”

Satellite deployment began about 54 minutes after liftoff with the deployment of the GESat and GEN1 satellites by Exolaunch and concluded more than two hours and 22 minutes after liftoff with the deployment of Firefly-2 by Pixxel, also via Exolaunch.

Part 1: Current State of the Housing Market; Overview for mid-January 2025

Today, in the Calculated Risk Real Estate Newsletter: Part 1: Current State of the Housing Market; Overview for mid-January 2025

A brief excerpt:
This 2-part overview for mid-January provides a snapshot of the current housing market.

I always focus first on inventory, since inventory usually tells the tale! I’m watching months-of-supply closely.
...
New home inventory, as a percentage of total inventory, is still very high. The following graph uses Not Seasonally Adjusted (NSA) existing home inventory from the National Association of Realtors® (NAR) and new home inventory from the Census Bureau (only completed and under construction inventory).

New vs existing Inventory It took a number of years following the housing bust for new home inventory to return to the pre-bubble percent of total inventory. Then, with the pandemic, existing home inventory collapsed and now the percent of new homes is 22.4% of the total for sale inventory, down from a peak of 27.2% in December 2022.

The percent of new homes of total inventory should continue to decline as existing home inventory increases. However, the percent of new home inventory will increase seasonally over the Winter as existing homes are withdrawn from the market.
There is much more in the article.
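The new-home share quoted above is a simple ratio of the two inventory series; a minimal sketch, using hypothetical inventory counts (the real figures come from NAR and Census data):

```python
def new_home_share(new_inventory: int, existing_inventory: int) -> float:
    """Percent of total for-sale inventory that is new homes."""
    return 100 * new_inventory / (new_inventory + existing_inventory)

# Hypothetical counts for illustration only
share = new_home_share(new_inventory=290_000, existing_inventory=1_000_000)
```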

Links 1/13/25

Links for you. Science:

Parents question federal response after son’s near-death encounter with E. coli (mostly of interest because most people don’t realize the U.S. does have longstanding successful surveillance programs)
The Caribbean has been unusually warm. That’s not a good thing.
Surgeon general calls for alcohol to carry cancer warning
More children are getting kidney stones. Experts think it’s their diet.
How much alcohol is safe to drink?
Multi-omic profiling a defined bacterial consortium for treatment of recurrent Clostridioides difficile infection

Other:

Inside Trump’s Search for a Health Threat to Justify His Immigration Crackdown. President-elect Donald J. Trump’s advisers have spent months trying to identify a disease that will help them build their case for closing the border. (but why don’t people trust public health officials something something)
The Democratic Party’s Blue MAGA Problem: Acknowledge It And Fix It
The Year Democrats Lost the Internet
Kash Patel Believes the FBI Planned Jan. 6th
How to take on the billionaires — and win
Why Mike Johnson’s fake “Jefferson prayer” matters
How Crazy Was The Las Vegas Cybertruck Bomber?
The R-Word’s Comeback Is a Grim Sign of Our Political Moment
As academic Bluesky grows, researchers find strengths—and shortcomings
Infamous ‘Pizzagate’ gunman shot dead after pulling gun during traffic stop
Facebook Is Censoring 404 Media Stories About Facebook’s Censorship
Donald Trump Jr.’s Ridiculous Greenland Trip Just Took a Dark Turn. It appears the whole thing was staged.
The Crisis of Gender Relations
University of West Florida braces for leadership changes (“He concluded the talk by suggesting that “the effort to erase the old standard of public men and private woman has been a mistake.””)
The New Rasputins: Anti-science mysticism is enabling autocracy around the globe.
Anita Bryant Dead. The LGBTQ community outlives another one.
Creation Museum/Ark Encounter Ex-Staffer And Church Worship Leader Arrested On Child Sex Charges
RFK isn’t big enough for a stadium with NFL-sized parking
California’s Fires Show How Climate Will Destabilize Our Politics and Daily Life
Oligarch farmers and the fires in Los Angeles
‘Entirely foreseeable’: The L.A. fires are the worst-case scenario experts feared
As a Climate Scientist, I Knew It Was Time to Leave Los Angeles
“Little Fish”
Meta Deletes Trans and Nonbinary Messenger Themes
Elon Musk is the new emperor of misinformation
Where the Left Went Wrong on Homelessness

CPI Preview

The Consumer Price Index for December is scheduled to be released tomorrow. The consensus is for a 0.3% increase in CPI and a 0.2% increase in core CPI. The consensus is for CPI to be up 2.9% year-over-year and core CPI to be up 3.3% YoY.

From Goldman Sachs economists:
We expect a 0.25% increase in December core CPI (vs. 0.2% consensus), corresponding to a year-over-year rate of 3.27% (vs. 3.3% consensus). We expect a 0.40% increase in December headline CPI (vs. 0.3% consensus), reflecting 0.35% higher food prices and 2.3% higher energy prices. Our forecast is consistent with a 0.21% increase in CPI core services excluding rent and owners’ equivalent rent and with a 0.18% increase in core PCE in December.
From BofA:
We forecast both headline and core CPI inflation to remain at 0.3% m/m in December, although there is a risk that the core could round down to 0.2%. The y/y rate should tick up a tenth to 2.8% for headline and remain unchanged at 3.3% for core.
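The m/m and y/y rates quoted above are simple percent changes in the index level; a minimal sketch with hypothetical CPI index values (chosen only for illustration):

```python
def pct_change(current: float, previous: float) -> float:
    """Percent change between two index levels."""
    return 100 * (current / previous - 1)

# Hypothetical CPI index levels for illustration
dec_2024, nov_2024, dec_2023 = 315.6, 314.4, 306.7
mom = pct_change(dec_2024, nov_2024)  # month-over-month rate
yoy = pct_change(dec_2024, dec_2023)  # year-over-year rate
```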

Organ transplant ban by Taliban in Afghanistan (BBC)

Afshin Nikzad forwards me this report from the BBC Persian service (in Farsi, but automatically translated by Chrome):

Kidney transplant halted in Afghanistan, by Sajjad Mohammadi

"A number of private hospitals in Kabul and Herat told the BBC that the Taliban government has banned kidney transplants in Afghan hospitals for a month now.

"The Taliban government's Ministry for the Propagation of Virtue and the Prevention of Vice announced about a month ago that, according to the seventh paragraph of Article 18 of the ministry's law, the sale and use of human body parts such as kidneys, liver, eyes, and hair is prohibited.

"The ministry said: "The basis and purpose of this decision is to preserve human dignity and respect, and the human body has special sanctity, and its organs should not under any circumstances be used as a means of commercialization or profiteering."

Technology Trends for 2025

Welcome to our annual report on the usage of the O’Reilly learning platform. It’s been an exciting year, dominated by a constant stream of breakthroughs and announcements in AI, and complicated by industry-wide layoffs. Generative AI gets better and better—but that trend may be at an end. Now the ball is in the application developers’ court: Where, when, and how will AI be integrated into the applications we build and use every day? And if AI replaces the developers, who will be left to do the integration? Our data shows how our users are reacting to changes in the industry: Which skills do they need to brush up on? Which do they need to add? What do they need to know to do their day-to-day work? In short: Where have we been in the past year, and where are we going?

We aren’t concerned about AI taking away software developers’ jobs. Ever since the computer industry got started in the 1950s, software developers have built tools to help them write software. AI is just another tool, another link added to the end of that chain. Software developers are excited by tools like GitHub Copilot, Cursor, and other coding assistants that make them more productive.

That’s only one of the stories we’re following. Here are a few of the others:

  • The next wave of AI development will be building agents: software that can plan and execute complex actions.
  • There seems to be less interest in learning about programming languages, Rust being a significant exception. Is that because our users are willing to let AI “learn” the details of languages and libraries for them? That might be a career mistake.
  • Security is finally being taken seriously. CEOs are tired of being in the news for the wrong reasons. AI tools are starting to take the load off security specialists, helping them get out of “firefighting” mode.
  • “The cloud” has reached saturation, at least as a skill our users are studying. We don’t see a surge in “repatriation,” though there is a constant ebb and flow of data and applications to and from cloud providers.
  • Professional development is very much of interest to our users. Specifically, they’re focused on being better communicators and leading engineering teams.

All of these trends have been impacted, if not driven, by AI—and that impact will continue in the coming year.

Finally, some notes about methodology. Skip this paragraph if you want; we don’t mind. This report is based on the use of O’Reilly’s online learning platform from January 1, 2024, to September 30, 2024. Year-over-year comparisons are based on the same period in 2023. The data in each graph is based on O’Reilly’s “units viewed” metric, which measures the actual use of each item on the platform. It accounts for different usage behavior for different media: text, courses, and quizzes. In each graph, the data is scaled so that the item with the greatest units viewed is 1. That means items within a graph are comparable to each other, but you can’t compare an item in one graph to an item in another. And all percentages are reported with two significant digits.
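The scaling described above is straightforward to sketch; the topic names and counts here are hypothetical, chosen only to show the within-graph normalization:

```python
def scale_to_max(units_viewed: dict[str, float]) -> dict[str, float]:
    """Scale so the most-used item in a graph is 1. Items are then
    comparable within a graph, but not across graphs."""
    top = max(units_viewed.values())
    return {topic: round(count / top, 2) for topic, count in units_viewed.items()}

# Hypothetical units-viewed counts for one graph
graph = scale_to_max({"Machine Learning": 5000, "NLP": 2000, "Prompt Engineering": 1250})
```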

Skills

When we look at how our customers use the O’Reilly learning platform, we always think in terms of skills. What skills are they trying to gain? And how are they trying to improve their knowledge? This year, one thread that we see across all of our platform is the importance of artificial intelligence. It’s all about upskilling in the age of AI.

Artificial Intelligence

It will surprise absolutely nobody that AI was the most active category in the past year. For the past two years, large models have dominated the news. That trend started with ChatGPT and its descendants, most recently OpenAI’s o1. But unlike 2022, when ChatGPT was the only show anyone cared about, we now have many contenders. Claude has emerged as a favorite among programmers. After a shaky start, Google’s Gemini models have become solid performers. Llama has established itself as one of the top models and as the matriarch of a rich ecosystem of open models. Many of the open models can deliver acceptable performance when running on laptops and phones; some are even targeted at embedded devices.

So what does our data show? First, interest in almost all of the top skills is up: From 2023 to 2024, Machine Learning grew 9.2%; Artificial Intelligence grew 190%; Natural Language Processing grew 39%; Generative AI grew 289%; AI Principles grew 386%; and Prompt Engineering grew 456%. Among the top topics, the most significant decline was for GPT itself, which dropped by 13%—not a huge decline but certainly a significant one. Searches for GPT peaked in March 2023 and have been trending downward ever since, so our search data matches our usage data.

We’re used to seeing interest move from a more general high-level topic to specific skills as an industry sector matures, so this trend away from GPT in favor of more abstract, high-level topics is counterintuitive. But in context, it’s fairly clear what happened. For all practical purposes, GPT was the only game in town back in 2023. The situation is different now: There’s lots of competition. These other models don’t yet show up significantly in search or usage data, but the users of our platform have figured out what’s important: not learning about GPT or Claude or Gemini or Mistral but getting the background you need to make sense of any model. Discovering a workflow that fits your needs is important, and as Simon Willison points out, your ideal workflow may actually involve using several models. Recent models are all good, but they aren’t all good in the same way.

AI has had a great year, but will it continue to show gains in 2025? Or will it drop back, much as ChatGPT and GPT did? That depends on many factors. Gartner has generative AI slipping into the “trough of disillusionment”—and whatever you think of the technology’s promise, remember that the disillusionment is a sociological phenomenon, not a technical one, and that it happens because new technologies are overhyped. Regardless of generative AI’s long-term promise, we expect some disillusionment to set in, especially among those who haven’t properly understood the technology or its capabilities.

Prompt Engineering, which gained 456% from 2023 to 2024, stands out. A 456% gain isn’t as surprising as it seems; after all, people only started talking about prompt engineering in 2023. Although “prompt engineering” was bandied about as a buzzword, it didn’t become a skill that employers were looking for until late in 2023, if that. That may be an early warning signal for AI disillusionment. Searches for “prompt engineering” grew sharply in 2023 but appeared to decline slightly in 2024. Is that noise or signal? If disillusionment in Prompt Engineering sets in, we’ll also see declines in higher-level topics like Machine Learning and Artificial Intelligence.

There’s a different take on the future of prompt engineering. There have been a number of arguments that the need for prompt engineering is temporary. As generative AI improves, this line of reasoning contends, we will no longer need to write complex prompts that specify exactly what we want the AI to do and how to do it. Prompts will be less sensitive to exactly how they’re worded; changing a word or two will no longer give a completely different result. We’ll no longer have to say “explain it to me as if I were five years old” or provide several examples of how to solve a problem step-by-step.

Some recent developments point in that direction. Several of the more advanced models have made the “explain it to me” prompts superfluous. OpenAI’s o1 has been trained in a way that maximizes its problem-solving abilities, not just its ability to string together coherent words. At its best, it eliminates the need to write prompts that demonstrate how to solve the problem (a technique called few-shot prompting). At worst, it “decides” on an inappropriate process, and it’s difficult to convince it to solve the problem a different way. Anthropic’s Claude has a new (beta) computer use feature that lets the model use browsers, shells, and other programs: It can click on links and buttons, select text, and do much more. (Google and OpenAI are reportedly working on similar features.) Enabling a model to use the computer in much the same way as a human appears to give it the ability to solve multistep problems on its own, with minimal description. It’s a big step toward a future full of intelligent agents: linked AI systems that cooperate to solve complex problems. However, Anthropic’s documentation is full of warnings about serious security vulnerabilities that remain to be solved. We’re thrilled that Anthropic has been forthright about these weaknesses. But still, while computer use may be a peek at the future, it’s not ready for prime time.

AI will almost certainly slide into a trough of disillusionment; as I’ve said, the trough has more to do with sociology than with technology. But OpenAI and Anthropic are demonstrating important paths forward. Will these experiments bear fruit in the next year? We’ll see.

Artificial intelligence

Many skills associated with AI also showed solid gains. Use of content about Deep Learning is up 14%, Generative Models is up 26%, and GitHub Copilot is up 471%. Use of content about the major AI libraries was up slightly: PyTorch gained 6.9%, Keras increased 3.3%, and Scikit-Learn gained 1.7%. Usage of TensorFlow content declined 28%; its continued decline indicates that PyTorch has won the hearts and minds of AI developers.

These gains—particularly Copilot’s—are impressive, but a more important story concerns two skills that came out of nowhere: Usage of content about LangChain is on a par with PyTorch, and RAG is on a par with Keras. Neither of these skills were in last year’s report; in 2023, content usage for LangChain and RAG was minimal, largely because little content existed. They’ve caught on because both LangChain and RAG are tools for building better applications on top of AI models. GPT, Claude, Gemini, and Llama aren’t the end of the road. RAG lets you build applications that send private data to a model as part of the prompt, enabling the model to build answers from data that wasn’t in its training set. This process has several important consequences: It minimizes the probability of error or “hallucination”; it makes it possible to attribute answers to the sources from which they came; and it often makes it possible to use a much smaller and more economical model.
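The retrieval step is easy to illustrate without any framework. A minimal sketch in plain Python, where a toy keyword-overlap scorer stands in for a real embedding-based vector search (every name here is hypothetical):

```python
def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by word overlap with the query (a toy stand-in
    for embedding similarity) and keep the top k."""
    q_words = set(query.lower().split())
    ranked = sorted(documents,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return ranked[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Prepend retrieved context so the model answers from private data
    rather than its training set, and can cite numbered sources."""
    context = "\n".join(f"[{i}] {d}" for i, d in enumerate(retrieve(query, documents)))
    return f"Answer using only the sources below.\n{context}\n\nQuestion: {query}"
```

Because the prompt carries only the top-k documents, the model doing the answering can be much smaller than one that would need the facts in its weights.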

LangChain is the first of many frameworks for building AI agents. (OpenAI has Swarm; Google has an Agent Builder that’s part of Vertex; Salesforce and other vendors also have offerings.) Agents are software that can plan and execute multistage actions, many of which are delegated to other AI models. Claude’s computer use API is another facet of this trend, along with whatever products OpenAI and Google may be building. Saying that usage has increased 26 million percent isn’t to the point—but realizing that LangChain has grown from near zero to a platform on a par with PyTorch is very much so. Agentic applications are certainly the next big trend within AI.
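The plan-and-execute pattern those frameworks implement can be sketched without any of them; in this toy agent loop the “planner” is hard-coded and the tools are plain functions, both hypothetical stand-ins for model calls:

```python
def plan(goal: str) -> list[tuple[str, str]]:
    """A real agent would ask a model to produce this step list;
    it is hard-coded here for illustration."""
    return [("search", goal), ("summarize", "search results")]

def run_agent(goal: str, tools: dict) -> list[str]:
    """Execute each planned step by delegating to the named tool."""
    results = []
    for tool_name, arg in plan(goal):
        results.append(tools[tool_name](arg))
    return results

# Toy tools; in practice these might be other AI models or APIs
tools = {
    "search": lambda q: f"3 documents found for '{q}'",
    "summarize": lambda t: f"summary of {t}",
}
trace = run_agent("quarterly report", tools)
```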

Skills needed for AI

Data

Artificial intelligence relies heavily on what we used to call (and perhaps still call) data science. Building AI models requires data at unprecedented scale. Building applications with RAG requires a portfolio of data (company financials, customer data, data purchased from other sources) that can be used to build queries, and data scientists know how to work with data at scale.

Therefore, it’s not surprising that Data Engineering skills showed a solid 29% increase from 2023 to 2024. SQL, the common language of all database work, is up 3.2%; Power BI was up 3.0%, along with the more general (and much smaller) topic Business Intelligence (up 5.0%). PostgreSQL is close to edging ahead of MySQL, with a 3.6% gain. Interest in Data Lake architectures rose 59%, while the much older Data Warehouse held steady, with a 0.3% decline. (In our skill taxonomy, Data Lake includes Data Lakehouse, a data storage architecture that combines features of data lakes and data warehouses.) Finally, ETL grew 102%. With the exception of ETL, the gains are smaller than the increases we saw for AI skills, but that makes sense: AI is an exciting new area, and data is a mature, stable category. The number of people who need specialized skills like ETL is relatively small but obviously growing as data storage becomes even more important with AI.

It’s worth understanding the connection between data engineering, data lakes, and data lakehouses. Data engineers build the infrastructure to collect, store, and analyze data. The data needed for an AI application almost always takes many forms: free-form text, images, audio, structured data (for example, financial statements), etc. Data often arrives in streams, asynchronously and more or less constantly. This is a good match for a data lake, which stores data regardless of structure for use later. Because data receives only minimal processing when it arrives, it can be stored in near real time; it’s cleaned and formatted in application-specific ways when it’s needed. Once data has been stored in a data lake, it can be used for traditional business analytics, stored in a vector or graph database for RAG, or put to almost any other use. A data lakehouse combines both structured and unstructured data in a single platform.
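The store-raw-now, clean-later pattern described above can be sketched in a few lines; the records and schema here are hypothetical:

```python
import json
import time

data_lake: list[dict] = []  # raw records, stored regardless of structure

def ingest(raw: str) -> None:
    """Store arriving data in near real time with minimal processing."""
    data_lake.append({"received_at": time.time(), "raw": raw})

def transform_for_analytics(record: dict) -> dict:
    """Application-specific cleaning, applied only when the data is needed."""
    payload = json.loads(record["raw"])
    return {"customer": payload["customer"].strip().title(),
            "amount": float(payload["amount"])}

# Raw data arrives messy; it is cleaned on the way out, not the way in
ingest('{"customer": " acme corp ", "amount": "1250.00"}')
rows = [transform_for_analytics(r) for r in data_lake]
```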

Data analysis (including databases)

Software Development

What do software developers do all day? They write software. Programming is an important part of the job, but it’s not the whole thing; best estimates are that programmers spend roughly 20% of their time writing code. The rest of their time is spent understanding the problems they’re being asked to solve, designing appropriate solutions, documenting their work, updating management on the status of their projects, and much more.

Software architecture, which focuses on understanding a customer’s requirements and designing systems to meet those requirements, is an important part of the overall software development picture. It’s a skill to which many of our software developers and programmers aspire.

Architecture

This year’s data shows that software architecture continues to be one of the most desirable skills in the industries we serve. Usage of material about Software Architecture rose 5.5% from 2023 to 2024, a small but significant increase. But it’s more important to ask why it increased. A position in software architecture may be perceived as more secure in a time of layoffs, and it’s often perceived as another step forward in a career that moves from junior programmer to senior to lead. In addition, the rise of AI presents many architectural challenges: Do we try to build our own model? (The answer is usually “no.”) Should we use an AI service provider like OpenAI, Anthropic, Microsoft, or Google, or should we fine-tune and host our own model on our own infrastructure? How do we build applications that are safe (and how do we define “safe”)? How do we evaluate performance? These questions all have a bearing on software architecture. Furthermore, AI might provide tools to help software architects, but so far, these tools can do little for the substance of the job: understanding customers’ needs and helping them define what they want to build. With AI in the picture, we’re all building new kinds of applications—and those applications require architects to help design them.

In this context, it’s no surprise that Enterprise Architecture is up 17% and Distributed Systems is up 35%. Enterprise architecture is a staple: As Willie Sutton said about banks, “That’s where the money is.” It’s a good bet that many enterprises are trying to integrate AI into their systems or update legacy systems that are no longer scalable or maintainable. We can (and do) make the same argument about distributed systems. Modern enterprises work on a scale that was unimaginable a few decades ago. Scale isn’t just for companies like Amazon and Google. To survive, even small businesses need to develop an online presence—and that means building systems in the cloud that can handle surges in demand gracefully. It means building systems that can withstand outages. Distributed systems aren’t just massive deployments with hundreds of thousands of nodes. Your business may only require a dozen nodes, but regardless of the scale, it still faces the architectural challenges that come with distributed systems.

Some of the more significant ideas from the past decade seem to be falling out of favor. Microservices declined 24%, though content use is still substantial. Domain-Driven Design, which is an excellent skill for designing with microservices, is down 22%. Serverless is down 5%; this particular architectural style was widely hyped and seemed like a good match for microservices but never really caught on, at least based on our platform’s data.

What’s happening? Microservice architectures are difficult to design and implement, and they aren’t always appropriate—from the start, the best advice has been to begin by building a monolith, then break the monolith into microservices when it becomes unwieldy. By the time you reach that stage, you’ll have a better feel for which microservices need to be broken out from the monolith. That’s good advice, but the hype got ahead of it. Many organizations that would never need the complexity of microservices were trying to implement them with underskilled staff. As an architectural style, microservices won’t disappear, but they’re no longer getting the attention they once did. And new ideas, like modular monoliths, may catch on in the coming years; modularity is a virtue regardless of scale or complexity.

Software architecture and design

Programming languages

Last year’s report showed that our users were consuming less content about programming languages. This year’s data continues that trend. We see a small drop for Python (5.3%) and a more significant drop for Java (13%). And even C++, which showed healthy growth from 2022 to 2023, is down 9% in 2024.

On the other hand, C is up (1.3%), and so is C# (2.1%). Rust is up 9.6%. The small increases in C and C# may just be noise. C is well-entrenched and isn’t going anywhere fast. Neither is C++, despite its drop. Rust’s increase continues a growth trend that stretches back several years; that’s an important signal. Rust is clearly winning over developers, at least for new projects. Now that the US government is placing a priority on memory safety, Rust’s emphasis on memory safety serves it well. Rust isn’t the first programming language to claim memory safety, nor will it be the last. (There are projects to add memory safety to C++, for example.) But right now, it’s the best positioned.

Aside from Rust, though, we need to ask what’s happening with programming skills. A few forces are applying downward pressure. Industry-wide layoffs may be playing a role. We’ve downplayed the effect of layoffs in the past, but we may have to admit that we were wrong: This year, they may be taking a bite out of skills development.

Could generative AI have had an effect on the development of programming language skills? It’s possible; shortly after GPT-3 was released, Simon Willison reported that he was learning Rust with the help of ChatGPT and Copilot, and more recently that he’s used Claude to write Rust code that he has in production, even though he doesn’t consider himself a skilled Rust developer.

It would be foolish to deny that generative AI will help programmers to become more productive. And it would be foolish to deny that AI will change how and what we learn. But we have to think carefully about what “learning” means, and why we learn in the first place. Programmers won’t have to remember all the little details of programming languages—but that’s never been the important part of programming, nor has rote memorization been an important part of learning. Students will never have to remember a half dozen sorting algorithms, but computer science classes don’t teach sorting algorithms because committing algorithms to memory is important. Every programming language has a sort() function somewhere in its libraries. No, sorting is taught because it’s a problem that everyone can understand and that can be solved in several different ways—and each solution has different properties (performance, memory use, etc.). The point is learning how to solve problems and understanding the properties of those solutions. As Claire Vo said in her episode of Generative AI in the Real World, we’ll always need engineers who think like engineers—and that’s what learning how to solve problems means. Whether lines end in a semicolon or a colon or whether you use curly braces, end statements, or tabs to delimit blocks of code is immaterial.
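The point about properties rather than memorization is easy to see with two classic sorts: insertion sort is quadratic in the worst case but nearly linear on almost-sorted input, while merge sort is O(n log n) regardless but allocates extra memory. A minimal sketch:

```python
def insertion_sort(xs: list) -> list:
    """O(n^2) worst case, but close to O(n) on nearly-sorted input."""
    out = list(xs)
    for i in range(1, len(out)):
        j = i
        while j > 0 and out[j - 1] > out[j]:
            out[j - 1], out[j] = out[j], out[j - 1]
            j -= 1
    return out

def merge_sort(xs: list) -> list:
    """O(n log n) on any input, but allocates extra lists while merging."""
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged = []
    while left and right:
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right
```

Both return the same answer; choosing between them is an engineering judgment about the input and the environment, and that judgment, not the code, is what the classes teach.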

Programming languages

The perception that generative AI minimizes the need to learn programming languages may limit the use of language-oriented content on our platform. Does that benefit the learners? If someone is using AI to avoid learning the hard concepts—such as solving a problem by dividing it into smaller pieces, as quicksort does—they are shortchanging themselves. Shortcuts rarely pay off in the long term; coding assistants may help you to write some useful code, but those who use them merely as shortcuts rather than as learning tools are missing the point. Unfortunately, the history of teaching—going back centuries if not millennia—has stressed memorization. It’s time for both learners and teachers to grow beyond that.

Learning is changing as a result of AI. The way we teach, and the way our users want to be taught, is changing. Building the right kind of experiences to facilitate learning in an AI-enabled environment is an ongoing project for our learning platform. In the future, will our users learn to program by completing AI-generated tutorials that are customized in real time to their needs and abilities? That’s where we’re headed.

Web programming

Use of content about web programming skills is down, with few exceptions. A number of factors might be contributing to this. First, I can’t think of any significant new web frameworks in the past year; the field is still dominated by React (down 18%) and Angular (down 10%). There is some life near the bottom of the chart. The Svelte framework had significant growth (24%); so did Next.js (8.7%). But while these frameworks have their adherents, they’re far from dominant.

PHP (down 19%) still claims to have built the lion’s share of the web, but it’s not what developers reach for when they want to build something new, particularly if that “new” is a complex web application. The PHP world has been rocked by a bitter fight between the CEOs of Automattic (the developers of WordPress, by far the most important PHP framework) and WP Engine (a WordPress hosting platform). That fight started too late to affect this year’s results significantly, but it might weigh heavily next year.

A more significant development has been the movement away from complex platforms and back toward the simplicity of the earlier web. Alex Russell’s “Reckoning” posts summarize many of the problems. Our networks and our computers are much, much faster than they were 20 or 25 years ago, but web performance hasn’t improved noticeably. If anything, it’s gotten worse. We still wait for applications to load. Applications are hard to develop and have gotten harder over the years. There are several new frameworks that may (or may not) be lighter-weight, such as HTMX, Ludic, Glitch, and Cobalt. None of them have yet made a dent in our data, in part because none have built enough of a following for publishers and trainers to develop content—and you can’t have any units viewed if there isn’t anything to view. However, if you want an experience that isn’t dominated by heavyweight frameworks, doesn’t require you to become a JavaScript expert, and puts the fun back into building the web, this is where to look.

Web development

Web dev is a discipline that has been ill-served by shortcuts to learning. We hear too often about boot camp graduates who know a few React tricks but don’t understand the difference between React and JavaScript (or even know that JavaScript exists, let alone other programming languages). These programmers are very likely to lose their jobs to AI, which can already reproduce all the basic React techniques they’ve learned. Learning providers need to think about how AI is changing the workplace and how their students can partner with AI to build something beyond what AI can build on its own. Part of the solution is certainly a return to basics, ensuring that junior developers understand the tools with which they’re working.

IT Operations

Operations is another area where the trends are mostly downward. It may be small consolation, but the drops for several of the most important topics are relatively small: Linux is down 1.6%, Terraform is down 4.0%, and Infrastructure as Code is down 7.3%. As a skill, Terraform seems little hurt by the fork of Terraform that created the open source OpenTofu project, perhaps because the OpenTofu developers have been careful to maintain compatibility with Terraform. How this split plays out in the future is an open question. It’s worth noting the precipitous drop in Terraform certification (down 43%); that may be a more important signal than Terraform itself.

Kubernetes is down 20%. Despite that drop, which is sharper than last year’s 6.9% decrease, content teaching Kubernetes skills remains the second most widely used group in this category, and Kubernetes certification is up 6.3%. Last year, we said that Kubernetes needed to be simpler. It isn’t. There are no viable alternatives to Kubernetes yet, but there are different ways to deploy it. Kubernetes as a service managed by a cloud provider is certainly catching on, putting the burden of understanding every detail of Kubernetes’s operation on the shoulders of the provider. We also pointed to the rise of developer platforms; this year, the buzzword is “platform engineering” (Camille Fournier and Ian Nowland’s book is excellent), but as far as Kubernetes is concerned, it’s the same thing. Platform engineers can abstract knowledge of Kubernetes into a platform, minimizing software developers’ cognitive overhead. The result is that the number of people who need to know about Kubernetes is smaller.

Both DevOps (down 23%) and SRE (down 15%) dropped. There’s certainly some frustration with DevOps: Has it paid off? We ask a different question: Has it ever been tried? One problem with DevOps (which it shares with Agile) is that many companies “adopted” it in name but not in essence. They renamed a few positions, hired a few DevOps engineers, maybe created a DevOps group, never realizing that DevOps wasn’t about new job titles or new specialties; it was about reducing the friction between software development teams and operations teams. When you look at it this way, creating new groups and hiring new specialists can only be counterproductive. And the result is predictable: You don’t have to look far to find blogs and whitepapers claiming that DevOps doesn’t work. There’s also frustration with ideas like “shift left” and DevSecOps, which envision taking security into account from the start of the development process. Security is a different discussion, but it’s unclear how you build secure systems without taking it into account from the start. We’ve spent several decades building software and trying to fold security in at the last minute—we know how well that works.

Infrastructure and operations

In any case, the industry has moved on. Platform engineering is, in many ways, a natural outgrowth of both DevOps and SRE. As I’ve argued, the course of operations has been to increase the ratio of computers to operators. Is platform engineering the next step, allowing software developers to build systems that can handle their own deployment and routine operations without the help of operations staff?

IT certifications

General IT certifications, apart from security, trended downward. Use of content to prepare for the CompTIA A+ exam, an entry-level IT certification, was down 15%; CompTIA Network+ was down 7.9%. CompTIA’s Linux+ exam held its own, with a decline of 0.3%. On our platform, we’ve seen that Linux resources are in high demand. The slight decline for Linux-related content (1.6%) fits with the very small decrease in Linux+ certification.

For many years, Cisco’s certifications have been the gold standard for IT. Cisco Certified Network Associate (CCNA), a fairly general entry-level IT certification, showed the greatest usage and the smallest decline (2.2%). Usage of content to prepare for the Cisco Certified Network Professional (CCNP) exams, a cluster of related certifications on topics like enterprise networking, data centers, and security, dropped 17%. The Cisco Certified Internetwork Expert (CCIE) exams showed the greatest decline (36%). CCIE has long been recognized as the most comprehensive and in-depth IT certification. We’re not surprised that the total usage of this content is relatively small. CCIE represents the climax of a career, not the start. The number of people who attain it is relatively small, and those who do often include their CCIE number with their credentials. But the drop is surprising. It’s certainly true that IT is less focused on heavy-duty routing and switching for on-prem data centers (or even smaller machine rooms) than it was a few years ago. That work has largely been offloaded to cloud providers. While routers and switches haven’t disappeared, IT teams don’t need to support as wide a range of resources: they need to support office WiFi, some databases that need to remain on-premises, and maybe a few servers for office-related tasks. They’re very concerned about security, and as we’ll see shortly, security certifications are thriving. Is it possible that Cisco and its certifications aren’t as relevant as they used to be?

As we mentioned above, we also saw a drop in the relatively new certification for HashiCorp’s Terraform (43%). That’s a sharp decline—particularly since use of content about Terraform itself declined only 4.0%, showing that Terraform skills remain highly desirable regardless of the certification. A sudden drop in certification prep can be caused by a new exam making older content out-of-date, but that isn’t the case here. Terraform certification certainly wasn’t helped by HashiCorp’s switch to a Business Source License or the subsequent fork of the Terraform project. IBM’s pending acquisition of HashiCorp (set to close before the end of 2024) may have introduced more uncertainty. Is the decline in interest in Terraform certification an indicator of dissatisfaction in the Terraform community?

Certifications for IT

The Kubernetes and Cloud Native Associate (KCNA, up 6.3%) was a bright spot in IT certification. Whether or not Kubernetes is overly complex (perhaps because it’s overly complex) and whether or not companies are moving out of the cloud, KCNA certification is a worthwhile asset. Cloud native applications aren’t going away. And whether they’re managing Kubernetes complexity by building developer platforms, using a Kubernetes provider, or using some other solution, companies will need people on their staff who can demonstrate that they have Kubernetes skills.

Cloud and cloud certifications

Content use for the major cloud providers and their certifications was down across all categories, with one exception: Use of content to prepare for Google Cloud certifications is up 2.2%.

What does that tell us, if anything? Are we looking at a “cloud repatriation” movement in full swing? Are our customers moving their operations back from the cloud to on-prem (or hosted) data centers? Last year, we said that we see very little evidence that repatriation is happening. This year? An article in The New Stack argues that cloud repatriation is gathering steam. While that might account for the decline in the use of cloud-related content, we still see little evidence that repatriation is actually happening. Two case studies (37signals and GEICO) don’t make a trend. The ongoing expense of operating software in the cloud probably is greater than the cost of running it on-premises. But the cloud allows for scaling on demand, and that’s important. It’s true, few businesses have the sudden usage peaks that are driven by events like retail’s Black Friday. But the cloud providers aren’t just about sudden 10x or 100x bursts of traffic; they also allow you to scale smoothly from 1x to 1.5x to 2x to 3x, and so on. It saves you from arguing that you need additional infrastructure until the need becomes a crisis, at which point, you don’t need to grow 1.5x; you need 5x. After moving operations to the cloud and experiencing a few years of growth—even if that growth is moderate—moving back to an on-premises data center will require significant capital expense. It will probably require gutting all the infrastructure that you haven’t been using for the past year and replacing it with something up-to-date.

Does this mean that cloud providers are “roach motels,” where you can move in but you can’t move out? That’s not entirely untrue. But the ease of scaling by allocating a few more servers and seeing a slightly higher bill the next month can’t be ignored, even if those slightly higher bills sound like the proverbial story of boiling the frog. Evaluating vendors, waiting for delivery, installing hardware, configuring hardware, testing hardware—that’s effort and expense that businesses are offloading to cloud vendors. The ability to scale fluidly is particularly important in the age of AI. Few companies have the skills needed to build on-premises infrastructure for AI, with its cooling and power requirements. That means either buying AI services directly from cloud providers or building infrastructure to host your own models. And of course, the cloud providers have plenty of help for companies that need to use their high-end GPUs. (Seriously—if you want to host your AI application on-premises, see how long it will take to get delivery of NVIDIA’s latest GPU.) The reality, as IDC concluded in a survey of cloud use, is that “workload repatriation from public cloud into dedicated environments goes hand in hand with workload migration to public cloud activities, reflecting organizations’ continuous reassessment of IT environments best suited for serving their workloads.” That is, there’s a constant ebb and flow of workloads to and from public clouds as companies adapt their strategies to the business environment.

Cloud providers and certifications

The buzzword power of “the cloud” lasted longer than anyone could reasonably have expected, but it’s dead now. However, that’s just the buzzword. Companies may no longer be “moving to the cloud”; that move has already happened, and their staff no longer need to learn how to do it. Organizations now need to learn how to manage the investments they’ve made. They need to learn which workloads are most appropriate for the cloud and which are better run on-premises. IT still needs staff with cloud skills.

Security

Security Governance drove the most content use in 2024, growing 7.3% in the process and overtaking Network Security (down 12%). The rise of governance is an important sign: “Security” is no longer an ad hoc issue, fixing vulnerabilities in individual applications or specific services. That approach leads to endless firefighting and eventually failure—and those failures end up in the major news media and result in executives losing their jobs. Security is a company-wide issue that needs to be addressed in every part of the organization. Confirming the growing importance of security governance, interest in Governance, Risk, and Compliance (GRC) grew 44%, and Compliance grew 10%. Both are key parts of security governance. Security architecture also showed a small but significant increase (3.7%); designing a security architecture that works for an entire organization is an important part of looking at the overall security picture.

The use of content about Application Security also grew significantly (17%). That’s a very general topic, and it perhaps doesn’t say much except that our users are interested in securing their applications—which goes without saying. But what kinds of applications? All of them: web applications, cloud applications, business intelligence applications, everything. We get a bigger signal from the increase in Zero Trust (13%), a particularly important strategy for securing services in which every user, human or otherwise, must authenticate itself to every service that it uses. In addition, users must have appropriate privileges to do what they need to do, and no more. It’s particularly important that zero trust extends authentication to nonhuman users (other computers and other services, whether internal or external). It’s a response to the “hard, crunchy outside, but soft chewy inside” security that dominated the 1990s and early 2000s. Zero trust assumes that attackers can get through firewalls, that they can guess passwords, and that they can compromise phones and computers when they’re outside the firewall. Firewalls, good passwords, and multifactor authentication systems are all important—they’re the hard, crunchy outside that prevents an attacker from getting in. Zero trust helps keep attackers outside, of course—but more than that, it limits the damage they can do once they’re inside.

Security skills

We’re puzzled by the drop in use of content about Network Security, which corresponds roughly to the drop in Cisco certifications. Network Security is still the second most widely used skill, but it’s down 12% from 2023 to 2024. Perhaps network security isn’t deemed as important when employees wander in and out of company networks and applications are distributed between in-house servers and the cloud. We hope that our users aren’t making that mistake. A bigger issue is that networks haven’t changed much in the past few years: We’re still using IPv4; we’re still using routers, switches, and firewalls, none of which have changed significantly in recent years. What has changed is the way security is implemented. Cloud computing and zero trust have moved the focus from big-iron networking devices to interactions between systems, regardless of how they are connected.

Security certifications

Security certification has been one of the biggest growth areas on our platform. As I’ve said elsewhere, security professionals love their certifications. There’s a good reason for that. In most other specialties, it’s possible to build a portfolio of programs you wrote, systems you architected, sites you’ve designed. What can a security person say in a job interview? “I stopped 10,000 people from logging in last year?” If you’ve ever monitored a public-facing Linux system, you know that claim means little. Security is cursed with the problem that the best news is no news: “Nothing bad happened” doesn’t play well with management or future employers. Neither does “I kept all the software patched, and spent time reading CVEs to learn about new vulnerabilities”—even though that’s an excellent demonstration of competence. Certification is a way of proving that you have certain skills and that you’ve met some widely recognized standards.

The CISSP (up 11%) and CompTIA Security+ (up 13%) certifications are always at the top of our lists, and this year is no exception. Our State of Security in 2024 report showed that CISSP was the certification most commonly required by employers. If there’s a gold standard for security skills, CISSP is it: It’s a thorough, comprehensive exam for people with more than five years of experience. CompTIA Security+ certification has always trailed CISSP slightly in our surveys and in platform performance, but its position in second place is uncontested. Security+ is an entry-level certification; it’s particularly desirable for people who are starting their security careers.

Security certification was especially important for government users. For most industry sectors, usage focused on programming skills in Java or Python, followed by artificial intelligence. The government sector was a strong outlier. Security and IT certifications were by far the most important topics. CompTIA Security+ and CISSP (in that order) led.

Moving beyond CISSP and Security+, many of the other security certifications also showed gains. Certified Ethical Hacker (CEH) was up 1.4%, and the less popular CompTIA PenTest+ certification gained 3.3%. Certified Cloud Security Professional was up 2.4%, somewhat less than we’d expect, given the importance of the cloud to modern IT, but it’s still a gain. ISACA’s Certified in Risk and Information Systems Control (CRISC) was up 45%, Certified Information Security Manager (CISM) grew 9.3%, and Certified Information Systems Auditor (CISA) was up 8.8%; these three certifications are strongly associated with security governance. The most significant declines were for the CompTIA Cybersecurity Analyst (CySA+) certification (down 13%) and CCNA Security (down 55%). The drop in CCNA Security is extreme, but it isn’t unexpected given that none of the Cisco certifications showed an increase this year.

We’re missing one important piece of the security certification puzzle. There’s no data on AI security certifications—and that’s because there aren’t any. Software that incorporates AI must be built and operated securely. That will require security experts with AI expertise (and who can demonstrate that expertise via certifications). We expect (or maybe a better phrase is “we hope”) that lack will be addressed in the coming year.

Security certifications

Professional Development

Professional development continues to be an important growth area for our audience. The most important skill, Professional Communication, grew 4.5%—not much but significant. We saw a 9.6% increase in users wanting to know more about Engineering Leadership, and a 21.5% increase in users using content about Personal Productivity.

Project Management was almost unchanged from 2023 to 2024 (up 0.01%), while the use of content about the Project Management Professional (PMP) certification grew 15%. Interest in Product Management declined 11%; it seems to be a skill that our users are less interested in. Why? For the past few years, product manager has seemed to be a trendy new job title. But in last year’s report, Product Management only showed a small gain from 2022 to 2023. Is interest in Product Management as a skill or as a job title fading?

Professional development and skills

We also saw a 7.9% decline in Leadership (aside from Engineering Leadership), and a huge 35% decline for IT Management. Are we to blame these on the corporate layoff cycle? That’s possible, but it’s too easy. IT may be affected by a general trend toward simplification and platform engineering, as we’ve discussed: A platform engineering group can do a lot to reduce cognitive overhead for developers, but it also reduces the need for IT staff. A platform engineering group doesn’t have to be large; is the need for IT staff shrinking? The decline in Leadership may be because it’s a vague, nonspecific term, unlike Engineering Leadership (which is up). Engineering Leadership is concrete and it’s something our engineering-oriented audience understands.

New Initiatives

In 2024, we introduced several new features on the O’Reilly learning platform, including badges, quizzes, and a new version of O’Reilly Answers. What are they telling us?

Badges and Quizzes

We started a badging program late in 2023: Users from business accounts can earn badges for taking courses and completing quizzes. We won’t go into the program details here, but since the program started, users have earned nearly 160,000 badges. We’re still building the program, but we’re encouraged by its first year.

Badges can give us more insight into what our users are learning. The most popular badges are for Python skills, followed by GPT and prompt engineering. Generative AI and machine learning are also high on the list. Kubernetes, despite its decline in units viewed, was the fourth-most-frequently-acquired badge, with almost the same number of badges earned as software architecture. Linux, SQL, professional communication, and Java rounded out the top 11. (Yes, 11—we wanted to include Java). The difference between Java and Python is striking, given that the use of content about these skills is similar. (Python leads Java, but not by much.) Oracle has a highly regarded Java certification program, and there’s really no equivalent for Python. Perhaps our users recognize that obtaining a Java badge is superfluous, while obtaining badges for Pythonic skills is meaningful?

Quizzes are closely tied to badges: If a final quiz exists for a course or for a book, students must pass it to earn their badge. Quiz usage appears to follow the same trends as badging, though it’s premature to draw any conclusions. While a few legacy quizzes have been on the platform for a long time (and aren’t connected to badging), the push to develop quizzes as part of the badging program only began in June 2024, and quiz usage is still as much a consequence of the time the quiz has been available on the platform as it is of the skill for which it’s testing.

Top badges earned (relative to Python)

We can also look at the expertise required by the badges that were earned. All of our content is tagged with a skill level: beginner, beginner-intermediate, intermediate, intermediate-advanced, or advanced. 42% of the badges were earned for content judged to be intermediate, 33% for beginner content, and only 4.4% for advanced content. It’s somewhat surprising that the largest share of badges was earned for intermediate-level content, though perhaps that makes sense given the badge program’s B2B context: For the most part, our users are professionals rather than beginners.

Badges earned by expertise level (percent)

Answers

One of our most important new features in 2024 was an upgrade to O’Reilly Answers. Answers is a generative AI-powered tool that allows users to enter natural language questions and generates responses from content in our platform. Unlike most other generative AI products, Answers always provides links to the original sources its responses are based on. These citations are tracked and used to calculate author royalties and payments to publishing partners.

So the obvious question is: What are our users asking? One might guess that the questions in Answers would be similar to the search terms used on the platform. (At this point, Answers and search are distinct from each other.) That guess is partly right—and partly wrong. There are some obvious differences. Common search terms include book titles, author names, and even ISBNs; titles and author names rarely appear in Answers. The most common searches are for single words, such as “Python” or “Java.” (The average length of the top 5,000 searches in September 2024 was two words, for instance.) There are few single word questions in Answers (though there are some); most questions are well-formed sentences like “How many ways can you create a string object in Java?” (The average question length was nine words.)

To analyze the questions from O’Reilly Answers, we essentially turned them back into single-word questions. First, we eliminated questions from a “question bank” that we created to prime the pump, as it were: Rather than requiring users to write a new question, we offered a list of prewritten queries they could click on. While there’s undoubtedly some useful signal in how the question bank was used, we were more interested in what users asked of their own volition. From the user-written questions, we created a big “bag of words,” sorted them by frequency, and eliminated stopwords. We included a lot of stopwords that aren’t in most lists: words like “data” (what does that mean by itself?) and “chapter” (yes, you can ask about a chapter in a book, but that doesn’t tell us much).
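The pipeline described above—drop question-bank questions, tokenize into a bag of words, remove stopwords, sort by frequency—can be sketched in a few lines of Python. The questions, question bank, and stopword list here are illustrative placeholders, not our actual data:

```python
from collections import Counter
import re

# Illustrative data; the real question log and stopword list are far larger.
questions = [
    "How many ways can you create a string object in Java?",
    "How do you flatten a list of lists in Python?",
]
question_bank = {"What is Kubernetes?"}  # prewritten prompts to exclude
stopwords = {"how", "many", "ways", "can", "you", "a", "in", "of", "do",
             "is", "what", "the", "data", "chapter"}  # includes domain-specific stopwords

# Keep only the questions users wrote themselves.
user_written = [q for q in questions if q not in question_bank]

# Build a lowercased bag of words, dropping stopwords.
words = []
for q in user_written:
    words.extend(w for w in re.findall(r"[a-z+#]+", q.lower())
                 if w not in stopwords)

# Sort by frequency.
print(Counter(words).most_common(5))
```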

With that background in mind, what were the most common words in Answers and in searches? In order:

Answers        Search Queries
-------        --------------
Python         Python
Java           Machine learning
Management     Kubernetes
Key            Java
Model          Rust
Security       React
File           AWS
Architecture   CISSP
AI             C++
System         Linux
Service        Docker
Project        SQL
Learning       JavaScript

There’s an obvious difference between these two lists. The Answers list consists mostly of words that could be part of longer questions. The Search list is made up of topics and skills about which one might want information. That’s hardly surprising or insightful. We’ve said most searches on the platform are single-word searches, which means that those words have to be stand-alone skills or topics, like Python or Java. Likewise, Answers was built to allow users to ask more detailed, in-depth questions and get focused answers from the content on our platform—so rather than seeing single word searches, we’re seeing common words from longer questions. Maybe that’s a self-fulfilling prophecy, but it’s also showing that Answers is working the way we intended.

There’s a little more signal here. Python and Java are the two top programming languages on both lists, but if we look at search queries, machine learning and Kubernetes are sandwiched between the two languages. That may just be a result of our users’ experiences with services like ChatGPT. Programmers quickly learned that they can get reasonable answers to questions about Java and Python, and the prompts don’t have to be very complex. My personal favorite is “How do you flatten a list of lists in Python?,” which can be answered by most chatbots correctly but isn’t meaningful to our search engine.
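As an aside, that flattening question has a one-line idiomatic answer, which may be part of why chatbots handle it so reliably:

```python
# Flatten a list of lists with a nested comprehension.
nested = [[1, 2], [3], [4, 5]]
flat = [x for sublist in nested for x in sublist]
print(flat)  # [1, 2, 3, 4, 5]
```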

Kubernetes raises a different question: Why is it the third-most-common search engine query but doesn’t appear among the top words on Answers? (It’s the 90th-most-common word on Answers, though the actual rank isn’t meaningful.) While Kubernetes is a topic that’s amenable to precise questions, it’s a complex tool, and coming up with precise prompts is difficult; writing a good question probably requires a good understanding of your IT infrastructure. You might need to understand how to solve your problem before you can ask a good question about how to solve your problem. A search engine doesn’t face problems like this. It doesn’t need additional information to return a list of resources.

Then what about words like Rust and Linux, which are high on the list of common searches, but not in the top 13 for Answers? It’s relatively easy to come up with specific questions about either of these—or, for that matter, about SQL, AWS, or React. SQL, AWS, and Linux are reasonably close to the top of the Answers word list. If we just concern ourselves with the order in which words appear, things start to fall into place: AWS (and cloud) follow learning; they are followed by Linux, followed by SQL. We’re not surprised that there are few questions about CISSP on Answers; it’s a certification exam, so users are more likely to want test prep material than to ask specific questions. Rust and React are still outliers, though; it’s easy to ask precise and specific questions about either of them. Rust is still unfamiliar to many of our users—could the explanation be that our customers want to learn Rust as a whole rather than ask specific questions that might only occur to someone who’s already learned the language? But if you accept that, React still remains an outlier. We may know the answers next year, at which time we’ll have a much longer track record with Answers.

The Coming Year

That wraps up last year. What will we see this year? We’ve given hints throughout this report. Let’s bring it all together.

AI dominated the news for 2024. It will continue to do so in 2025, despite some disillusionment. For the most part, those who are disillusioned aren’t the people making decisions about what products to build. While concern about jobs is understandable in a year that’s seen significant layoffs, we don’t believe that AI is “coming for your job.” However, we do believe that the future will belong to those who learn how to use AI effectively—and that AI will have a profound impact on every profession, not just IT and not just “knowledge workers.” Using AI effectively isn’t just about coming up with clever prompts so you can copy and paste an answer. If all you can do is prompt, copy, and paste, you’re about to become superfluous. You need to figure out how to work with AI to create something that’s better than what the AI could do by itself. Training employees to use AI effectively is one of the best things a company can do to prepare for an AI-driven future. Companies that don’t invest in training will inevitably fall behind.

In the coming year, will companies build AI applications on top of the giant foundation models like GPT-4, Claude, and Gemini? Or will they build on top of smaller open models, many of which are based on Meta’s Llama? And in the latter case, will they run the models on-premises (which includes the use of hosting and colocation providers), or will they rent use of these open AI models as a service from various providers? In the coming year, watch carefully what happens with the small open models. They already deliver performance almost as good as the foundation models and will undoubtedly be the basis for many AI applications. And we suspect that most companies will run these models in the cloud.

Security is the other significant growth area. Companies are waking up to the need to secure their data before their reputations—and their bottom lines—are compromised. Waking up has been a long, slow process that has sunk the careers of many CEOs and CIOs, but it’s happening. Our users are studying to gain security certifications. We see companies investing in governance and putting in company-wide policies to maintain security. In this respect, AI cuts both ways. It’s both a tool and a danger. It’s a tool because security professionals need to watch over huge streams of data, looking for the anomalies that signal an attack; it’s a tool because AI can digest sources of information about new threats and vulnerabilities; it’s a tool because AI can automate routine tasks like report generation. But it’s also a danger. AI-enabled applications increase an organization’s threat surface by introducing new vulnerabilities, like prompt injection, that we’re only now learning how to mitigate. We haven’t yet seen a high-profile attack against AI that compromised an organization’s ability to do business, but that will certainly happen eventually—maybe in 2025.

Whatever happens this year, AI will be at the center. Everyone will need to learn how to use AI effectively. AI will inevitably reshape all of our professions, but we don’t yet know how; we’re only starting to get glimpses. Is that exciting or terrifying? Both.


Footnotes

  1. The definition of “open” and “open source” for AI is still controversial. Some open models don’t include access to weights, and many don’t include access to training data.

New Glenn to make another launch attempt early Thursday

Blue Origin announced late on Monday afternoon that it planned to make a second attempt to launch the New Glenn rocket at 1 am ET (06:00 UTC) on Tuesday. But then, a couple of hours later, the company said it would move the launch to Thursday.

Although the company provided no information about why it was slipping the launch two more days, it likely involved both technical work after an initial launch scrub on Monday morning and concerns about weather early on Tuesday.

In its short update on Monday afternoon, Blue Origin confirmed earlier reporting by Ars that the first launch attempt on Monday morning was scrubbed due to ice buildup on a vent line. "This morning’s scrub was due to ice forming in a purge line on an auxiliary power unit that powers some of our hydraulic systems," the company said.

Farnborough International Space Show & ISRSE-40 announce first wave of speakers

Leaders from NASA, Babcock International Group (Babcock), Lockheed Martin, BAE Systems, the UK Ministry of Defence, and the US government have been confirmed as speakers for the Farnborough International Space Show.

The post Farnborough International Space Show & ISRSE-40 announce first wave of speakers appeared first on SpaceNews.

The Trump administration should leverage private space stations to counter China

The International Space Station, photographed in 2021. Credit: NASA

We stand on the brink of a transformative era in space exploration: a shift from government-led to commercial-led activities off-planet. With this shift comes the need to recognize that the […]

The post The Trump administration should leverage private space stations to counter China appeared first on SpaceNews.

The Borda Count is the Best Method of Voting

It’s well known that the voting methods we use are highly defective: they fail to meet fundamental criteria like positive responsiveness, the Pareto principle, and stability. Positive responsiveness (monotonicity) means that if a candidate improves on some voters’ ballots, this should not reduce the candidate’s chances of winning. Yet many voting methods, including runoffs and ranked-choice voting, fail positive responsiveness. In other words, a candidate who becomes more preferred by voters can end up losing an election that they would have won when they were less preferred! It’s even more shocking that some voting systems can fail the Pareto principle, which simply says that if every voter prefers x to y, then the voting system should not rank y above x. Everyone knows that in a democracy a candidate may be elected whom a minority ranks below another possible candidate. But how many know that there are democratic voting procedures in which a candidate may be elected whom the majority ranks below another possible candidate, or even procedures that elect a candidate whom every voter ranks below another possible candidate? That is the failure of the Pareto principle, and the chaos results of McKelvey–Schofield show that this kind of outcome should be expected.

Almost all researchers in social choice understand the defects of common voting systems, and indeed they tend to agree that the most common system, first-past-the-post voting, is probably the most defective! But, as no system is perfect, there has been less consensus on which methods are best. Ranked-choice voting, approval voting, and the Borda Count all have their proponents. In recent years, however, there has been a swing toward the Borda Count.

Don Saari, for example, whose work on voting has been a revelation, has made strong arguments in favor of the Borda Count. The Borda Count has each voter rank the n candidates from most to least preferred and assigns points by rank: a voter’s top-ranked candidate gets n-1 points, the next gets n-2, and so on down to 0 for the last-ranked candidate. For example, if there are 3 candidates, a voter’s top-ranked candidate gets 2 points, the second-ranked candidate gets 1 point, and the last-ranked candidate gets 0 points. The candidate with the most points overall wins.
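A minimal sketch in Python (the candidate names and ballots are hypothetical) shows how little machinery the scoring rule needs:

```python
from collections import defaultdict

def borda_winner(ballots):
    """Each ballot ranks all n candidates, most preferred first.
    The candidate in position i earns n - 1 - i points."""
    scores = defaultdict(int)
    for ballot in ballots:
        n = len(ballot)
        for i, candidate in enumerate(ballot):
            scores[candidate] += n - 1 - i
    return max(scores, key=scores.get), dict(scores)

# Three candidates: top rank earns 2 points, middle 1, last 0.
ballots = [
    ["a", "b", "c"],
    ["a", "b", "c"],
    ["b", "a", "c"],
]
winner, scores = borda_winner(ballots)
print(winner, scores)  # a {'a': 5, 'b': 4, 'c': 0}
```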

The Borda Count satisfies positive responsiveness, the Pareto principle, and stability. In addition, Saari points out that the Borda Count is the only positional voting system that always ranks a Condorcet winner (a candidate who beats every other candidate in pairwise voting) above a Condorcet loser (a candidate who loses to every other candidate in pairwise voting). All voting systems are gameable, but Saari shows that the Borda Count is, by some reasonable measures, the least gameable system, or among the least gameable.
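The pairwise (Condorcet) comparisons are likewise easy to compute. Here is a small sketch with made-up ballots:

```python
def beats(ballots, x, y):
    """True if a strict majority of ballots rank x above y."""
    wins = sum(1 for b in ballots if b.index(x) < b.index(y))
    return wins > len(ballots) / 2

def condorcet_winner(ballots, candidates):
    """Return the candidate who beats every other head-to-head, or None."""
    for c in candidates:
        if all(beats(ballots, c, other) for other in candidates if other != c):
            return c
    return None

# "a" wins every pairwise contest here, so it is the Condorcet winner.
ballots = [["a", "b", "c"], ["a", "c", "b"], ["b", "a", "c"]]
print(condorcet_winner(ballots, ["a", "b", "c"]))  # a
```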

My own work in voting theory shows, with a somewhat tongue-in-cheek but practical example, that the Borda Count would have avoided the Civil War! I also show that other systems, such as cumulative voting or approval voting, are highly open to chaos, as illustrated by the fact that under approval voting almost anything could have happened in the presidential election of 1992, including Ross Perot as President.

One reason the Borda Count performs well is that it uses more information than other systems. If you use only a voter’s first-place vote, you throw out a lot of information about how that voter ranks the second and third candidates. If you use only pairwise votes, you throw out a lot of information about the entire distribution of voter rankings. When you throw out information, the voting system can’t distinguish rational from irrational voters, which is one reason why the outcomes of a voting system can look irrational.

Eric Maskin has an important new contribution to this literature. Arrow’s Independence of Irrelevant Alternatives (IIA) says that if no voters change their rankings of x and y then the social ranking of x and y shouldn’t change. In other words, if no voter changes their ranking of Bush and Gore, then the outcome of the election shouldn’t change regardless of how Nader is ranked (for the pedantic, I exclude the case where Nader wins). The motivation for IIA seems reasonable: we don’t want spoilers who split a candidate’s vote, allowing a less-preferred candidate, even a Condorcet loser, to win. But IIA also excludes information about preference intensity from the voting system, and throwing out information is rarely a good idea.
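To make the spoiler worry concrete, here is a hypothetical plurality election (the vote counts are invented for illustration, not the actual 2000 tallies):

```python
def plurality_winner(ballots):
    """First-past-the-post: count only each ballot's top choice."""
    tallies = {}
    for ballot in ballots:
        tallies[ballot[0]] = tallies.get(ballot[0], 0) + 1
    return max(tallies, key=tallies.get)

# 49 voters: Bush > Gore > Nader; 48: Gore > Bush > Nader; 3: Nader > Gore > Bush
ballots = ([["Bush", "Gore", "Nader"]] * 49
           + [["Gore", "Bush", "Nader"]] * 48
           + [["Nader", "Gore", "Bush"]] * 3)

with_nader = plurality_winner(ballots)  # 'Bush' wins 49-48-3
without_nader = plurality_winner(
    [[c for c in b if c != "Nader"] for b in ballots])  # 'Gore' wins 51-49
```

Removing Nader flips the outcome from Bush to Gore even though no voter changed their relative ranking of Bush and Gore, which is exactly the spoiler behavior IIA is meant to rule out.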

What Maskin shows is that it’s possible to keep the desirable properties of IIA while still measuring preference intensity with what he calls modified IIA, although in my view a better name would be middle IIA. Modified or middle IIA says that an alternative z should be irrelevant unless it is in the middle of x and y, e.g. x>z>y. More precisely, we allow the voting system to change the ranking of x and y if the ranking of z moves in or out of the middle of x and y, but not otherwise (recall that IIA would forbid the social ranking of x and y to change if no voter changes their ranking of x and y).
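A two-voter example (my own construction, not Maskin's) shows the mechanism: under the Borda Count, moving z into the middle of x and y changes the x-y social ranking even though neither voter changes their ranking of x relative to y, which is exactly the kind of change MIIA permits and classical IIA forbids:

```python
def borda_scores(ballots):
    """On each ballot of n candidates, position i earns n - 1 - i points."""
    scores = {}
    for ballot in ballots:
        n = len(ballot)
        for i, c in enumerate(ballot):
            scores[c] = scores.get(c, 0) + (n - 1 - i)
    return scores

# Profile 1: voter 1 ranks z below both x and y.
before = borda_scores([["x", "y", "z"], ["y", "x", "z"]])
# x and y tie: {'x': 3, 'y': 3, 'z': 0}

# Profile 2: voter 1 moves z into the middle (x > z > y).
# Neither voter's ranking of x versus y changes.
after = borda_scores([["x", "z", "y"], ["y", "x", "z"]])
# Now x outscores y: {'x': 3, 'y': 2, 'z': 1}
```

The flip happens precisely because z moved into the middle of x and y on a ballot, so the Borda Count here is registering intensity of preference rather than being spoiled.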

Maskin shows that the Borda Count is the only voting system that satisfies MIIA along with a handful of other desirable and unobjectionable properties. It follows that the Borda Count is the only voting system to both measure preference intensity and avoid defects such as spoilers.

The debates over which is the best voting system will probably never end. Indeed, voting theory itself tells us that multi-dimensional choice is always subject to some infirmities and people may differ on which infirmities they are willing to accept. Nevertheless, we can conclude that plurality rule is a very undesirable voting system and the case for the Borda Count is strong.

The post The Borda Count is the Best Method of Voting appeared first on Marginal REVOLUTION.

The First Password on the Internet

It was created in 1973 by Peter Kirstein:

So from the beginning I put password protection on my gateway. This had been done in such a way that even if UK users telephoned directly into the communications computer provided by Darpa in UCL, they would require a password.

In fact this was the first password on Arpanet. It proved invaluable in satisfying authorities on both sides of the Atlantic for the 15 years I ran the service, during which no security breach occurred over my link. I also put in place a system of governance that any UK users had to be approved by a committee which I chaired but which also had UK government and British Post Office representation.

I wish he’d told us what that password was.

Congressman watches for potential changes at NASA under Trump administration

Ivey

The congressman whose district includes NASA’s Goddard Space Flight Center says he is taking a wait-and-see approach to potential changes to the center.

The post Congressman watches for potential changes at NASA under Trump administration appeared first on SpaceNews.

Martijn Blanken, Neo Space Group – Commercial Space Transformers

Martijn Blanken

In this episode of Commercial Space Transformers SpaceNews Senior Staff Writer Jason Rainbow speaks with Martijn Blanken, CEO, Neo Space Group (NSG).

The post Martijn Blanken, Neo Space Group – Commercial Space Transformers appeared first on SpaceNews.

NOAA sees new applications for commercial weather data

NEW ORLEANS – In addition to purchasing global datasets, the National Oceanic and Atmospheric Administration (NOAA) plans to pay a premium for observations of oil spills or other events. For […]

The post NOAA sees new applications for commercial weather data appeared first on SpaceNews.

Air Force chief’s parting warning: U.S. must transform Space Force to counter China

Frank Kendall: ‘We're going to need a much bigger, much more capable, much more powerful Space Force’

The post Air Force chief’s parting warning: U.S. must transform Space Force to counter China appeared first on SpaceNews.

★ One Bit of Anecdata That the Web Is Languishing Vis-à-Vis Native Mobile Apps

Just after New Year’s some sort of underground cable screw-up resulted in our home, along with an irregular swath of our neighborhood, losing electricity for 26 hours. We don’t lose power often, and when we do, the outages are usually brief, but 26 hours felt pretty long — especially with the outside temperature below freezing and daylight hours near their calendric nadir. The icing on this particular outage’s frustration cake was that our power company, PECO1, seemingly had no idea what exactly was wrong or how long it might take to fix.

The power went out around 10:30 am on January 2, and soon thereafter PECO was estimating that power would be restored by 2 pm. Then it was 4 pm, then it was briefly 2 pm again (despite the actual time then being after 2 pm — which is when I got the sinking feeling I should get the flashlights out), then they were claiming there were no known outages in our area, until eventually they just stopped providing any estimates at all of when our power might return. I’d have given PECO some credit for honesty if they’d simply replaced the estimated time for power restoration with the shrug emoji.

I was following along with these updates and checking the outage map from my iPhone, on PECO’s website. Which website I wasn’t at all familiar with, because our power really doesn’t go out very often, and my wife takes care of the bill. PECO’s is one of the worst websites I’ve ever had the misfortune to need to use. Among its faults:

  • It is incredibly slow to load. (This slowness couldn’t be explained by overwhelming demand — the power outage was not widespread.)
  • Pages often finished loading incompletely. Just some page header chrome at the top and nothing but white underneath. In fact I just tried right now, today, and got this.
  • Navigation is confusing, and even once I figured it out, it took multiple taps and page loads to get to the pages I wanted to return to. And those page loads were all slow to load.
  • Worst of all, most tasks you might want to do, including just checking on the status of an outage, seemingly require you to be signed in as a customer, but the website signs you out automatically after a few minutes. So each time I returned, I had to start by signing in again. Which, you’ll be surprised to hear, was slow and sometimes wouldn’t take on the first try, despite my credentials being auto-filled.

Basically, PECO’s mobile website feels like it was developed using and exported from Microsoft Excel. You might say, “Well that makes no sense, because you’ve never been able to build or export websites using Excel.” To which I’d respond, “Yes, exactly.”

So, every time I wanted to see if there was an updated estimate on our power being restored, it took at least a minute or two of waiting for pages to load, signing back in (which was always slow), and poking around their inscrutable site navigation. The website did prompt me, occasionally, to install their mobile app, but I was like “Fuck that, it’s probably just their website in a wrapper.”

It was a cold and dark night, but our power was restored the next day just after noon,2 and it stayed restored, so I metaphorically dusted my hands and thought to myself, “I hope I never need to use that fucking website ever again.”

Last night, our power went out again. This time, thankfully, it was only out for about 80 minutes. When the outage hit, before even once trying PECO’s cursed website, I went to the App Store and installed their iPhone app. It was a revelation. PECO’s iOS app is everything their website is not: fast, well-organized, and, blessedly, keeps you signed in.3

I’d go so far as to describe PECO’s website, at least as experienced from a phone, as utterly incompetent. I’d describe their native iOS app as — I can’t believe I’m going to use this word — good. It’s hard to believe the website and app are from the same company.

This makes no sense to me. A utility company is the sort of thing where I’d expect most people would use them via the web, even from their phones. Who’d think to install an app from their electric company on their phone? But it’s a night and day difference. I feel like a chump for having suffered through the previous 26-hour outage obsessively checking their terrible, slow-loading (I just can’t emphasize how fucking slow it is), broken website when this app was available.

There’s absolutely no reason the mobile web experience shouldn’t be fast, reliable, well-designed, and keep you logged in. If one of the two should suck, it should be the app that sucks and the website that works well. You shouldn’t be expected to carry around a bundle of software from your utility company in your pocket. But it’s the other way around. I suspect that my instinctive belief that a service company or utility should focus its customer service efforts on the web first, and native apps second, is every bit as outdated as my stubborn belief that invite ought not be used as a noun. (Invitation is sitting right there.)

I won’t hold up this one experience as a sign that the web is dying, but it sure seems to be languishing, especially for mobile devices.4 And the notion that mobile web apps are closing the gap with native apps is laughable. The gulf between them is widening, not narrowing.


  1. They were The Philadelphia Electric Company for over a century before changing their official name in 1994. They should have kept the old name rather than rebrand, despite the fact that no one in Philly had ever called them anything but “PECO” for decades. Nobody here ever called Veterans Stadium anything other than “The Vet”, but it would’ve been stupid as hell to officially rename the late great concrete masterpiece of early 1970s brutalism to its nickname. ↩︎︎

  2. In what I’d hold up as yet another proof of Murphy’s Law, the power came back on while I was mostly done with a shower that wasn’t cold, per se, but certainly wasn’t warm, let alone properly hot. ↩︎︎

  3. While writing this column, I installed PECO’s Android app on my Pixel 4 and gave it a whirl. It shares a visual design with the iOS app — I strongly suspect they’re made from a shared code base and one of the various cross-platform frameworks. But where the iPhone app is fast (or at least fast enough), the Android app is slow. But I can’t say how much of that is from the app and how much because my Pixel 4 is five years old. But I also tried the iOS app on my iPhone 12 (four years old), and it felt snappy there too. ↩︎︎

  4. It’s kind of weird that there are now zillions of supposedly technically sophisticated people who, when they use the term “desktop app”, are referring to websites. I’ve personally mostly thought about this usage as a sign of the decline of native Mac apps. But it’s also a sign of the decline of building websites meant to be used on mobile phones. I think maybe what we’re seeing is not that the web, overall, is dying, but the mobile web is. ↩︎︎

Meta Is Blocking Links to Decentralized Instagram Competitor Pixelfed

Jason Koebler, 404 Media:

Meta is deleting links to Pixelfed, a decentralized Instagram competitor. On Facebook, the company is labeling links to Pixelfed.social as “spam” and deleting them immediately.

Pixelfed is an open-source, community-funded and decentralized image sharing platform that runs on ActivityPub, which is the same technology that supports Mastodon and other federated services. Pixelfed.social is the largest Pixelfed server, which was launched in 2018 but has gained renewed attention over the last week.

Bluesky user AJ Sadauskas originally posted that links to Pixelfed were being deleted by Meta; 404 Media then also tried to post a link to Pixelfed on Facebook. It was immediately deleted.

True free speech is the freedom to avoid seeing alternatives to Instagram.

 ★ 

Los Angeles Fires: How to Help

LA resident Matthew Butterick, in his MB XS newsletter:

Easy answer — donate money! A good friend of mine works in California disaster relief. He recommends these nonprofits because they have a strong local impact:

Donations of physical items are politely discouraged because they impose extra logistics and handling that relief and shelter organizations can’t support right now.

 ★ 

Rory Sykes, Killed in LA Wildfires, RIP

Josh DuBose, reporting for KTLA:

In an emotional interview, Shelley Sykes, the mother of former child actor Rory Sykes who died in their Malibu home amid the Palisades Fire, shared her harrowing story and grieved the devastating loss of her son. Shelley fought back tears recalling the final moments with 32-year-old Rory, who was born blind and lived with cerebral palsy.

On Jan. 7, when the Palisades Fire broke out, the mother and son stayed behind at their Malibu home believing they were safe. Overnight, though, as the wind-driven fire escalated and sent embers flying onto their property, a massive flare up trapped Rory, who has difficulty walking, inside his cottage.

“I drove up to the top of his cottage, turned on the hose pipe and no water came out of it,” Shelley explained. “I raced back down and dialed 911 but 911 wasn’t working and all the lines were down for emergencies.”

Despite her best efforts, she says Rory locked himself in his cottage and told his mother to save herself instead.

Shelley said that she grabbed her peacocks and drove down to try and get help, but when firefighters returned, the cottage as well as the main house were completely destroyed by fire. Officials have yet to retrieve the former child star’s remains from the charred rubble of the cottage.

Sykes is one of 24 people known to have died so far, but at least 16 others remain missing. His mother, announcing his death on X, emphasized that he was an avid gamer and an Apple enthusiast. Turns out he was also apparently an avid Daring Fireball reader. Rory’s own X feed was full of links to DF posts, right up until the day before he died.

It’s an unusual relationship I have with you, my readers. All of you know me, to the extent that my writing and podcasting reveals who I am. I know relatively few of you. But when a friend pointed me to Sykes’s sad story — and my god, his poor mother, who couldn’t save him — and his X feed, it hit me.

I can’t say I knew Rory. It doesn’t seem like he ever emailed me, nor do we seem to have interacted on Twitter/X. But I’m glad my writing was a part of his life — and I’m glad it’s part of all of yours, too. I don’t know what more to say about it other than that this whole wildfire catastrophe is heartbreaking and awful, and a reminder of how fleeting and delicate everything in life is.

 ★ 

Bananas Bloomberg Report: ‘China Weighs Sale of TikTok’s US Operations to Elon Musk’

No byline, which is really weird, just “Bloomberg News”:

Chinese officials are evaluating a potential option that involves Elon Musk acquiring the US operations of TikTok if the company fails to fend off a controversial ban on the short-video app, according to people familiar with the matter.

But here’s Todd Spangler, reporting (with his name) for Variety:

TikTok denied a report that China is looking at potentially facilitating a sale of the app to tech billionaire Elon Musk to keep TikTok operational in America amid a looming U.S. government ban.

“We can’t be expected to comment on pure fiction,” a TikTok rep said in reply to Variety‘s request for comment.

It could be that no one is wrong here. Maybe ByteDance’s owners, the Chinese government, know what’s going on, and the dupes at TikTok don’t.

 ★ 

[Sponsor] Protect Your App With WorkOS Radar

Does your app get fake signups, throwaway emails, or users abusing your free tier? Or worse, bot attacks and brute force attempts?

WorkOS Radar can block all this and more. A simple API gives you advanced device fingerprinting that can detect bad actors, bots, and suspicious behavior.

Your users trust you. Let’s keep it that way.

 ★ 

Mastodon Is Transferring Its Ownership to a New Non-Profit

The Mastodon Team blog:

Simply, we are going to transfer ownership of key Mastodon ecosystem and platform components (including name and copyrights, among other assets) to a new non-profit organization, affirming the intent that Mastodon should not be owned or controlled by a single individual.

When founder Eugen Rochko started working on Mastodon, his focus was on creating the code and conditions for the kind of social media he envisioned. The legal setup was a means to an end, a quick fix to allow him to continue operations. From the start, he declared that Mastodon would not be for sale and would be free of the control of a single wealthy individual, and he could ensure that because he was the person in control, the only ultimate decision-maker.

Though there’s a lot going on right now in the social media space, with Meta’s policy zig-zag last week still reverberating, this change seems aimed more at avoiding what’s going on with WordPress and Matt Mullenweg, where “WordPress” is open source and the trademarks are owned by a foundation, but that foundation has licensed the WordPress commercial trademarks exclusively to Mullenweg’s for-profit company Automattic, to protect and wield as he sees fit.

The big difference is that WordPress is almost unfathomably popular, and Mastodon is a niche platform for sophisticated social networking users.

 ★ 

‘Free Our Feeds’

Free Our Feeds:

With Zuckerberg going full Musk last week, we can no longer let billionaires control our digital public square.

Bluesky is an opportunity to shake up the status quo. They have built scaffolding for a new kind of social web. One where we all have more say, choice and control.

But it will take independent funding and governance to turn Bluesky’s underlying tech — the AT Protocol — into something more powerful than a single app. We want to create an entire ecosystem of interconnected apps and different companies that have people’s interests at heart.

Free Our Feeds will build a new, independent foundation to help make that happen.

An open consortium built around consensus is exactly what’s needed to move fast and take advantage of the current moment’s opportunity. And with a team of “technical advisors and custodians” that includes both the executive director and the president of the Mozilla Foundation, I suspect this initiative might prove as successful as Firefox.

 ★ 

Don’t Blame Libs or Progs for Driving Silicon Valley to the Right

There’s currently a debate online about whether social media owners were always secretly or latently right wing or whether “progressives” took a business constituency that was a reliably friendly and financially generous ally and turned it into an enemy through relentless attacks. Needless to say, there are a lot of jangling threads to this story, details that are hard to wrestle into an overarching theory. There are Silicon Valley titans like Peter Thiel who have always been not simply right-wingers but advocates of weird, tech-infused neo-monarchism. There have also been various left-aligned campaigns that must have rankled various tech titans. And finally, it’s very important to remember that it’s not at all clear that Silicon Valley as a whole is moving right. Management is. But the real and big story is simpler and more structural. The major technology platforms became mature businesses at vast scales; in so doing they butted up against the regulatory purview of the national government; and with the former leading to the latter they shifted toward a more conventionally anti-regulatory politics. A lot of it is really that simple.

There’s an important additional, related point which is that on becoming mature businesses they began looking toward the federal government more and more to protect their business positions from new entrants or other threats.

Now, these are a lot of big generalities I’m throwing around here. So there’s a small bit of history I wanted to share which, I think, provides a good entry point into understanding the progression. Back in 2012, there was a big DC fight over something called SOPA, the Stop Online Piracy Act. The precise details needn’t concern us here. It was a pretty heavy-handed piece of legislation backed and created by the Motion Picture Association of America and other entertainment industry rights-holder groups. It would have made it pretty simple to go into court and get an overseas pirating website blocked in the United States. I discussed it in more detail both at the time and in this post back in 2017. All the groundwork had been laid for passage through conventional lobbying and the legislation had broad bipartisan support.

And then on a dime everything changed.

The big companies from Silicon Valley weighed in in opposition. They also set up a series of websites and groups to mobilize opposition, arguing that SOPA was a threat to online free speech. That argument kicked off an avalanche of opposition which was more or less genuinely grassroots in nature. And when I say grassroots I mean that once the message got out and the argument was made in opposition to SOPA and the big tech companies made clear they thought it was a bad idea, millions of ordinary people signed petitions and called representatives and the whole SOPA effort collapsed in a smoldering heap. Those on Capitol Hill who hadn’t paid much attention came out against it. And finally even its key backers came out against it.

Groups like the MPAA were trying to get the state to throw random people in jail for watching pirated movies. Meanwhile, Big Tech was the future. It wasn’t even close. I should say here that SOPA was a bad piece of legislation. And while the big tech companies had obvious business interests in opposing it, there’s little question it also simply offended the industry’s basic assumptions and values. It wasn’t entirely self-interested.

What was clear to anyone who was watching was that big tech had finally arrived as a player in Washington, DC. Of course, the tech industry had been around for two or four or five decades by that point. So it wasn’t like they’d never shown their face in policy discussions in the capital. But it’s difficult to overstate how small a presence tech had up until this point, how few lobbyists they employed, how limited their corporate political giving was. The exception was Microsoft. But of course they were a much more mature company and they’d tangled with the federal government over anti-trust in the 90s. What we’re talking about here is the modern tech industry, mostly based in Silicon Valley. The reigning idea there was that the federal government was a sclerotic, pre-tech-boom thing which they mostly didn’t care about. They weren’t worried about getting it off their backs because it couldn’t catch up with them. They were creating the future. They didn’t need subsidies or help. To a significant degree the people who ran Washington, DC agreed. The tech folks were building the future. They worked on things that existed in a virtual world that didn’t have a lot of sharp edges and mostly it was all super cool anyway.

This was part of the attraction for Democrats in raising money from the tech world — which was mainly individual rather than corporate, a key detail. They’re super rich, throw great parties and you don’t have to feel bad because there’s no labor unrest and they’re not dumping any toxic chemicals and they support gay rights and other good things.

You could of course find exceptions to this — legislation tech pushed for, candidates who were supported. But big picture this was undeniably true. In the cases where major legislation impacted tech — stuff like the Communications Decency Act of 1996 — the guiding strategy was to impact the booming tech industry as little as possible. Get out of the way while they’re building the future.

The point of the whole SOPA saga was that it showed big tech’s vast latent power. Big Tech’s credibility with the public was almost limitless. And its scale of wealth, if it chose to use it in the political arena, could overwhelm that of most other industries. And over the last dozen years that’s just what happened. Big Tech firms, including especially Facebook and Google, but many others at smaller scales, became big, big players not only in conventional lobbying but also in reputational advertising. This was because a lot of the secondary effects of the rise of the social platforms were beginning to crystallize in the public mind.

This was a period in which the focus on wealth inequality and the light tax touch applied to Big Tech became a big deal; people became more aware of the role of devices creating a generation of device addicts; then there was online privacy; then there was the 2016 election and scrutiny of the way social platforms were vehicles for foreign and domestic election subversion. And then there was anti-trust, a genuine existential threat to the whole world of Silicon Valley or at least the big companies which dominated it. I guess you could say well, Democrats shouldn’t have been so focused on antitrust, wealth inequality, foreign subversion campaigns targeting their candidates or online hate groups — but somehow that doesn’t seem terribly realistic to me.

I could list off a dozen more lines of scrutiny into the big platforms. But you could summarize all of them by noting that the big companies became super big, holders of an unimaginable amount of wealth, and because of their dominating bigness created what economists call “externalities” which affected society in what many found to be negative ways. That’s why lobbying and influence spending in DC began to grow so rapidly. Public scrutiny and criticism drives government scrutiny and regulation. And those are a problem for any industry.

And there was a key additional factor. Just as these companies were coming under greater scrutiny they were also becoming mature businesses. Which is to say that they were no longer growing at phenomenal pace every year. Growth was increasingly coming not from growing user bases or starting entirely new business lines but by tightening their dominant position in and extracting more profits from existing ones. That brings us back to wealth inequality, anti-trust and a bunch else.

There’s probably some role here played by the post-#MeToo/BLM world of online activism, some of which degenerates into a kind of online language policing — a significant chunk of which actually plays out in Silicon Valley as a battle between management and employees. But big picture it was inevitable that Silicon Valley would get a lot more involved in Washington, DC and just about as inevitable that Silicon Valley’s corporate power would align more and more with Republicans, or at least in ways that were less and less distinct from other industries.

There’s an argument that Democrats, with such a strong position in the world of Big Tech, should have been more strategic in not letting its policy agenda spoil that relationship. On the margins that may be right. And my point here isn’t even to dig into the question of who is at fault, Big Tech or annoying libs? It’s entirely common for political parties to look out for the interests of major constituencies. Perhaps Democrats could have done that better in some cases. My point is more that you can’t even meaningfully ask those questions in any realistic way without first recognizing the evolving political and business context of the last dozen years. That’s the big story, and it’s one of a pretty organic and likely unavoidable progression as Big Tech was increasingly dominated by only a small handful of major players. Once you get that broader progression you can make arguments about this or that event or decision along the way. But not before.

The planetary fix

Heron standing on a small patch of greenery surrounded by calm water with a lush forest in the background.

Despite decades of inaction we can avert the climate Hellocene and restore the atmosphere to keep our world habitable

- by Rob Jackson

Read at Aeon

What should I ask Greg Clark?

Yes, I will be having a Conversation with him.  Gregory Clark the economist.  Here is his Wikipedia page.  So what should I ask?

The post What should I ask Greg Clark? appeared first on Marginal REVOLUTION.

Should economists read Marx?

By Velvet, CC BY-SA 4.0, via Wikimedia Commons

The other day, Northwestern University economics professor Ben Golub tweeted the following:

Mount Holyoke English professor Alex Moskowitz responded to the revelation that most economists don’t read Smith and Marx by calling economics “fake”, declaring that it “hasn't properly historicized it's own methods of knowledge production”:

Is Moskowitz right? Should economists all be required to read, “work through”, and understand Adam Smith and Karl Marx? Is the discipline “fake” because most haven’t done this?

First of all, it’s important to point out that studying the history of thought isn’t always useful. Just as doctors usually don’t study the works of Galen, and physicists usually don’t read Isaac Newton, economists don’t really have to read the original works of Alfred Marshall to understand supply and demand, or read John Nash’s original papers to understand game theory. The most useful concepts in science stand alone, divorced from the thought process of their originators. This is why they’re so powerful — anyone can pick up Newton’s Laws or Nash Equilibrium and use them to solve real-world problems, without knowing where those tools came from.

Imagine you’re an economist working for Amazon, using game theory to design online markets. Now imagine some English professor at a liberal arts college shouts at you that your whole field is “fake” because you haven’t read Marx. I imagine that experience would be a bit surreal.

But for now let’s set aside the question of whether and when economists should study the history of economic thought, and point out that in fact, they do study it — just not in the way that Alex Moskowitz might prefer.

When I was an economics PhD student, I was assigned a whole bunch of old foundational papers that were influential in framing modern economic thinking. I’ll just list four examples to illustrate what the economics canon is actually like:

1. “An Exact Consumption-Loan Model of Interest with or without the Social Contrivance of Money”, by Paul Samuelson (1958)

In economics, there’s an important kind of model called the “overlapping generations” or OLG model, which is basically a model of how old people, young people, and middle-aged people interact in the economy. You can think about a lot of economic phenomena in terms of those generational interactions.

For example, young people generally have to borrow to get started in life — student loans, starter homes, cars, and such. People work and save when they’re young and middle-aged, and then have to spend down their wealth during retirement. This creates some interesting interactions, because the old people can only consume by selling their accumulated assets — houses, stocks, etc. — to the young and middle-aged people.

Paul Samuelson wasn’t the first to think about this, but he was the first to formulate it in a really simple mathematical model that a lot of people could work with — and which is still commonly used today. In this paper, he showed that if you had rapid population growth, you could run into problems — you could have so many young people that they needed to borrow more than the small older generation could lend them. In this case, the best thing to do would be to transfer money from each young generation to each old generation in turn, so they’d have enough money to lend each other.

You may recognize this as the basic idea behind Social Security.
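That pay-as-you-go transfer logic is easy to sketch numerically. Here's a minimal Python illustration, using made-up numbers of my own rather than anything from Samuelson's paper:

```python
# Stylized pay-as-you-go transfer scheme (illustrative numbers of my
# own, not Samuelson's). Each young worker pays a tax into a common
# pot, and the pot is divided evenly among current retirees.

def payg_benefit(tax: float, workers: int, retirees: int) -> float:
    """Benefit per retiree when `workers` workers each pay `tax`."""
    return tax * workers / retirees

# 110 workers support 100 retirees (10% population growth): each
# retiree collects 110 on the 100 they paid in while working, so the
# scheme's implicit return equals the population growth rate.
benefit = payg_benefit(100.0, workers=110, retirees=100)
```

When population shrinks instead of grows, the same arithmetic runs in reverse, which is exactly the demographic worry about real-world pay-as-you-go pension systems.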

2. “The Pure Theory of Public Expenditure”, by Paul Samuelson (1954)

One of the most important concepts in public economics is the idea of a public good — something that the private sector won’t provide enough of on its own, and so which the government ought to provide (if it can). Paul Samuelson was not the first to think about this general concept either, but like with the OLG model in the previous example, he was the first to show a mathematical example of how this might work.

In this paper, he shows that if something is nonrival (if one person using something doesn’t stop someone else from using it) and nonexcludable (if you can’t prevent people from using it) — then the private sector won’t build enough of it. The classic example is a lighthouse — one ship using a lighthouse doesn’t necessarily prevent others from using it, since everyone can see the light, and you can’t really stop any particular ship from using it. So building a lighthouse is a dicey investment for any private company — you’re basically encouraging a whole bunch of free riders.
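The free-rider logic can be sketched with made-up numbers (mine, not Samuelson's): a lighthouse can be worth building for the fleet as a whole even though it is never worth building for any one ship.

```python
# Illustrative lighthouse arithmetic (my numbers, not Samuelson's).
n_ships = 20     # ships that would benefit from the lighthouse
value_each = 10  # what the light is worth to each ship
cost = 50        # cost of building the lighthouse

# Worth building for the fleet as a whole (total value 200 > cost 50)...
socially_worth_building = n_ships * value_each > cost

# ...but never worth building for any single ship (10 < 50), so absent
# coordination (or government), nobody builds it and everyone free-rides.
any_single_ship_builds = value_each > cost
```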

Whether the solution is to have government step in and build the lighthouse, or if there’s some private arrangement that could work just as well, is the subject of perennial debate within the economics profession. And whether you actually need both nonrivalry and nonexcludability in order to get some of the key features of public goods is another open question. But it was this original paper by Samuelson that pretty much created the whole literature on public goods, so its influence is hard to overstate.

3. “The Market for Lemons”, by George Akerlof (1970)

Anyone who has bought a new car knows that when you drive it off the lot, the resale price instantly drops far below the purchase price. Why? It’s the same car that it was an hour ago! The most likely answer is that if you try to turn around and sell a car right after buying it, people will assume something is wrong with it.

This insight was the basis for Akerlof’s paper. It’s about how markets can naturally break down due to asymmetric information — things that sellers know that buyers don’t. In the case of used cars, the process is called “adverse selection” — meaning that sellers want to sell low-quality stuff for more than it’s really worth, by concealing how crappy it is. Using some simple mathematical examples, Akerlof showed how adverse selection could cause trade to break down completely — buyers won’t pay full price for used cars because they might be getting sold a “lemon”, so used car dealers keep their high-quality cars off of the market entirely.
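To make the unraveling concrete, here's a tiny numeric sketch in Python, using my own illustrative setup rather than Akerlof's exact examples: suppose quality q is uniform between 0 and 1, a seller values a car at q, and a buyer values it at 1.5q. At any price p, only cars with q at or below p are offered, so the average car on offer is worth just 0.75p to buyers; iterating, the price they'll pay collapses and trade disappears.

```python
# Illustrative lemons-market unraveling (my numbers, not Akerlof's).
# Quality q ~ Uniform[0, 1]; a seller values a car at q, a buyer at
# 1.5*q. At price p, only sellers with q <= p offer their cars, so the
# average quality on offer is p/2 and buyers will pay at most
# 1.5 * (p/2) = 0.75*p, which is less than p.

def unravel(price: float, rounds: int = 30) -> float:
    """Iterate the buyers' best offer; it shrinks toward zero."""
    for _ in range(rounds):
        avg_quality = price / 2    # only cars with q <= price are offered
        price = 1.5 * avg_quality  # buyers pay their expected value
    return price

final_price = unravel(1.0)  # collapses toward zero: the market unravels
```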

How do you solve the lemon problem? One way is to pay for mechanics to check whether a car is in good condition, but that costs money. Another way might be for the government to pass laws requiring used car dealers to tell prospective buyers important information about a car’s quality.

There’s an obvious application to health insurance, too. Adverse selection can also happen when a buyer conceals information from a seller; insurance customers will naturally try to hide how sick they are from an insurer, in order to get a lower premium. This means that healthy people have to pay premiums that are too high, which keeps them out of the market. Laws like the Affordable Care Act (Obamacare) that penalize people for not buying health insurance are aimed at preventing the exit of healthy people from the market, based on exactly the kind of principle Akerlof talks about in his paper.

4. “Uncertainty and the Welfare Economics of Medical Care”, by Kenneth Arrow (1963)

This one is interesting because although the author was famous for mathematical economics, this paper itself involves almost no math — it’s really just an essay making various logical arguments. Arrow is trying to explain why health care — including health insurance — isn’t like other markets. He basically just lists a bunch of reasons why health care is different. These reasons include:

  • Health insurance is subject to a lot of information asymmetry — the aforementioned adverse selection, plus “moral hazard” (i.e., people with more insurance might be more reckless).

  • Health care involves extreme risks, including the risk of death. (Arrow implies, but doesn’t state, that both patients and providers may not be good at making rational decisions under that kind of extreme uncertainty.)

  • Humans have strong moral norms around health care — we tend to believe that basic medical care is a universal human right, that doctors shouldn’t act like profit-seeking businesspeople, that it’s wrong to force people to pay for health care before they receive it, and so on.

  • Things like communicable diseases cause externalities — if one person gets sick, it puts other people in danger.

  • Increasing returns to scale and restrictions on the entry of new health care suppliers create barriers to entry that inhibit competition.

  • Medical providers regularly practice price discrimination, charging people different prices based on their ability to pay.

All of these factors make the market for health care an absolute mess compared to most markets, which is why the industry tends to be so heavily regulated, and is probably why so many rich countries just go ahead and nationalize their health insurance systems.

Anyway, those are four examples of the foundational economic thought that many (most?) PhD students in modern econ departments will be assigned. These papers produced both ideas and methods that are still central to economics research today — in Moskowitz’s terms, reading papers like these is how economists “historicize their methods of knowledge production”.

This is in contrast to the ideas of Karl Marx, which have mostly fallen by the wayside.

Is that because modern economists are neoliberal priests of the free market, who reject government intervention in favor of the power of markets? No, of course not. Every single one of the papers I just cited is about how markets fail, and about how government intervention is needed as a result of those failures. The papers don’t rely on Marxist concepts like the labor theory of value, alienation, exploitation, commodity fetishism, or the inevitable collapse of capitalism.1 But the world of economic ideas is not defined by a one-dimensional axis between Marxism and neoliberalism — there are plenty of problems with markets that Marx never even thought about.

I don’t know how much economics Alex Moskowitz has studied, but my bet is that he doesn’t actually know a great deal about the foundational economic thought of Paul Samuelson, Kenneth Arrow, or George Akerlof — or about modern econ research in general. So why does he feel qualified to declare that Marx should be viewed as part of the discipline’s foundational canon?

Part of the reason, of course, is that Moskowitz personally likes and values Marx’s ideas. He has done research relating Marx’s ideas to those of other leftist philosophers, and he teaches classes on Marx as well. It’s natural that Moskowitz would want economists to study a thinker he likes.

In a similar vein, I might urge English professors to view Ursula K. LeGuin as one of their foundational intellectuals, because I like Ursula K. LeGuin. In fact, my suggestion might be justified, and some English profs do teach LeGuin. But because I haven’t done an English PhD or existed inside of humanities academia, my suggestion would be that of an amateur outsider. (And I would probably make the suggestion with a little more playfulness and a little less aggression than Moskowitz uses in his comments about economics.)

Another reason Moskowitz might demand that economists include Marx in their canon is that Marx wrote in a literary, discursive, non-mathematical style that Moskowitz, being a humanities scholar, is able to understand (or at least, to more easily convince himself that he understands). Samuelson, Arrow, and Akerlof expressed many of their ideas in the language of mathematics, which Moskowitz, given his educational background, probably doesn’t understand very well. Prioritizing research you’re equipped to understand over that which is opaque to you is a natural human reaction, but it’s a form of the streetlight problem.

So I think the salient question here isn’t “Should Marx be part of the foundational economics canon?”, but rather, “Why should an English professor feel qualified to decide who should be part of the foundational economics canon?”.

And of course the answer here is probably going to be “politics”. I don’t want to put words in Moskowitz’s mouth, of course, but he does seem like sort of a leftist fellow:

In my experience, many leftist academics in the humanities and social sciences see non-STEM academia as a single unified enterprise — not a collection of knowledge-seeking efforts in different domains, but a single activist political struggle against capitalism, settler colonialism, white supremacy, and so on. In this cosmology, economists are acceptable if and only if they revere the econ-adjacent thinkers whose ideas most closely dovetail with the leftist activist struggle — e.g., Karl Marx.

And in fact, I think this is the most important reason economists should read Marx. His vision of history as a grand revolutionary struggle is a cautionary tale of what can happen when pseudo-economic thought is applied too cavalierly to political and historical questions.

One economist who has read Marx, and who has thought deeply about what he wrote, is Brad DeLong. In a 2013 post, DeLong tried to explain what he thought Marx got right and what he got wrong in terms of his economic thought:

Marx the economist had six big things to say, some of which are very valuable even today across more than a century and a half, and some of which are not…

Marx…was among the very first to recognize that the fever-fits of financial crisis and depression that afflict modern market economies were not a passing phase or something that could be easily cured, but rather a deep disability of the system…

Karl Marx was among the very first to see that the industrial revolution…opens the possibility of a society in which we people can be lovers of wisdom without being supported by the labor of a mass of illiterate, brutalized, half-starved, and overworked slaves…

Marx the economist got a lot about the economic history of the development of modern capitalism in England right--not everything, but he is still very much worth grappling with as an economic historian of 1500-1850. Most important, I think, are his observations that the benefits of industrialization do take a long time--generations--to kick in…

[But] Marx believed that capital is not a complement to but a substitute for labor…Hence the market system simply could not deliver a good or half-good society but only a combination of obscene luxury and mass poverty. This is an empirical question. Marx's belief seems to me to be simply wrong…

Marx…thought [that] people should view their jobs as…ways to gain honor or professions that they were born or designed to do or as ways to serve their fellow-human…The demand for a world in which people do things for each other purely out of beneficence rather than out of interest and incentives leads you down a very dangerous road, for societies that try to abolish the cash nexus in favor of public-spirited benevolence do not wind up in their happy place.

Marx believed that the capitalist market economy was incapable of delivering an acceptable distribution of income for anything but the briefest of historical intervals. …But "incapable" is surely too strong…[S]ocial democracy, progressive income taxes, a very large and well-established safety net, public education to a high standard, channels for upward mobility, and all the panoply of the twentieth-century social-democratic mixed-economy democratic state can banish all Marx’s fears that capitalist prosperity must be accompanied by great inequality and great misery.

This summary, which seems eminently fair to me, establishes Marx as a peripheral, mildly interesting economic thinker — a political philosopher who dabbled in economic ideas, perceiving some big trends but getting others badly wrong, and ultimately leaving little mark on the field’s overall methodology or basic concepts. (Also see Brad’s slides, video, and other commentary.)

But it’s not actually for his economic ideas that we remember Marx — it’s for his political philosophy of class warfare and revolution. And here, I think DeLong has appropriately scathing things to say:

Large-scale prophecy of a glorious utopian future is bound to be false…The New Jerusalem does not descend from the clouds…But Marx clearly thought at some level that it would…

[Marx thought that] social democracy would inevitably collapse before an ideologically-based right-wing assault, income inequality would rise, and the system would collapse or be overthrown…But I think this, too, is wrong…

Add to these the fact that Marx's idea of the "dictatorship of the proletariat" was clearly not the brightest light on humanity's tree of ideas, and I see very little in Marx the political activist that is worthwhile today.

Everywhere any variant of Marx’s vision of proletarian revolution has been tried, it has been not just a failure, but an epic human tragedy. Here’s what I wrote for Bloomberg back in 2018:

[I]t’s hard to forget the tens of millions of people who starved to death under Mao Zedong, the tens of millions purged, starved or sent to gulags by Joseph Stalin, or the millions slaughtered in Cambodia’s killing fields. Even if Marx himself never advocated genocide, these stupendous atrocities and catastrophic economic blunders were all done in the name of Marxism…20th-century communism always seems to result in either crimes against humanity, grinding poverty or both. Meanwhile, Venezuela, the most dramatic socialist experiment of the 21st century…is in full economic collapse.

This dramatic record of failure should make us wonder whether there was something inherently and terribly wrong with the German thinker’s core ideas. Defenders of Marx will say that Stalin, Mao and Pol Pot exemplified only a perverted caricature of Marxism, and that the real thing hasn’t yet been tried. Others will cite Western interference or oil price fluctuations…Some will even cite China’s recent growth as a communist success story, conveniently ignoring the fact that the country only recovered from Mao after substantial economic reforms and a huge burst of private-sector activity.

All of these excuses ring hollow. There must be inherent flaws in the ideas that continue to lead countries like Venezuela over economic cliffs…The brutality and insanity of communist leaders might have been a historical fluke, but it also could have been rooted in [Marx’s] preference for revolution over evolution…[O]verthrowing the system has usually been a disaster. Successful revolutions tend to be those like the American Revolution, which [keep] local institutions largely intact. Violent social upheavals like the Russian Revolution or the Chinese Civil War have, more often than not, led both to ongoing social divisions and bitterness, and to the rise of opportunistic, megalomaniac leaders like Stalin and Mao.

As I note in that post, the successful examples of “socialism” that people cite — the Scandinavian societies of today — are actually social democracies. They achieved their mixed economies through a slow evolutionary process that was absolutely nothing like the revolutionary upheavals predicted and advocated by Marx.

Economists should read Marx, and they should read him with all of this history in mind. It’s a vivid reminder of how social science ideas, applied sweepingly and with maximal hubris to real-world politics and institutions, have the potential to do incredible harm. Marxism is perhaps the single greatest example of social-science malpractice that the human race has ever seen.

This should serve as a warning to economists — a reminder that although narrow theories about auctions or randomized controlled trials of anti-poverty policies might seem like small potatoes, they’re not going to end with the skulls of thousands of children smashed against trees. Modern economics, with all of its mathematical formulae and statistical regressions, represents academia appropriately tamed — intelligence yoked to the quotidian search for truth, hemmed in by guardrails of methodological humility. The kind of academia that Alex Moskowitz represents, where the study of Great Books flowers almost instantly into sweeping historical theories and calls for revolution and war, embodies the true legacy of Marx — something still fanged and wild.



1

As it happens, I have read Marx’s Das Kapital. However, it was when I was an undergrad physics major, so I wasn’t really thinking about it in terms of its relation to modern economic theory. Also, I found it, like most German philosophy of the time, both pointlessly dense and frustratingly vague.

A Cartographer’s Tale

Miguel García Álvarez of Mapas Milhaud has been writing a Spanish-language newsletter about maps for nearly two years. Now he’s started a newsletter in English: A Cartographer’s Tale. He writes: “In this newsletter, I will… More

Using Free Let's Encrypt SSL Certificates in 2025

In this article I'm going to review the steps you need to take to obtain an A+ SSL security rating for your website, as mine has.

This tutorial applies to any hosting solution that uses Nginx as a web server or reverse proxy, running on a Debian-based distribution of Linux such as Ubuntu. The SSL certificate provider that I use is Let's Encrypt, which is trusted by all major web browsers and issues certificates for free.
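As a rough preview of where a setup like this typically lands (a sketch of my own, not the article's actual configuration), an Nginx server block wired to Let's Encrypt certificates often looks something like the following. The domain name is a placeholder, and the certificate paths are the defaults written by the certbot client, which may differ from whatever tooling the article goes on to use:

```nginx
# Redirect plain HTTP to HTTPS
server {
    listen 80;
    server_name example.com;
    return 301 https://$host$request_uri;
}

# HTTPS server using the Let's Encrypt certificate
server {
    listen 443 ssl;
    server_name example.com;

    # Default paths written by the certbot client (placeholders)
    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    # Modern protocols only; allowing older TLS versions hurts the rating
    ssl_protocols TLSv1.2 TLSv1.3;

    root /var/www/html;
}
```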

Karen Wynn Fonstad’s Belated NYT Obituary

Karen Wynn Fonstad, the cartographer of fantasy worlds best known for her Atlas of Middle-earth, died in March 2005 aged 59. Nearly twenty years later, she gets a comprehensive obituary in the New York Times,… More

The circulation of elites, sort of

Is the top tail of wealth a set of fixed individuals or is there substantial turnover? We estimate upper-tail wealth dynamics during the Gilded Age and beyond, a time of rapid wealth accumulation and concentration in the late 19th and early 20th centuries. Using various wealth proxies and data tracking tens of millions of individuals, we find that most extremely wealthy individuals drop out of the top tail within their lifetimes. Yet, elite wealth still matters. We find a non-linear association between grandparental wealth and being in the top 1%, such that having a rich grandparent exponentially increases the likelihood of reaching the top 1%. Still, over 90% of the grandchildren of top 1% wealth grandfathers did not achieve that level.

That is from a new NBER working paper by Priti Kalsi and Zachary Ward.

The post The circulation of elites, sort of appeared first on Marginal REVOLUTION.


The Fires in L.A.

Above: two pics of beachside homes shot in the pre-dawn hours on Malibu Road last March, on my way to Baja

I’m heartsick.

Although I’m a tried-and-true Northern Californian, I’ve always loved LA.

My best friend and roommate in college (Stanford) was Richard Zanuck, son of 20th Century Fox chief Darryl Zanuck, and their house was at 546 Palisades Beach Rd., right on the beach.

We got into the habit of taking off for LA in Dick’s (studio-owned) Ford convertible when things got dull in Palo Alto, driving all night and getting into LA before sunrise.

I’ll never forget my first time in LA: as we got into Malibu at dawn, Wolfman Jack was playing Loop de Loop Mambo by the Clovers.

Wow! Things were looser than in NorCal. Music was better (at least the kind I liked). There was also Dick “Huggy Boy” Hugg playing all those great R&B songs. This San Francisco boy was loving LA.

We’d always visit surfboard maker (and later legendary) Dale Velzy at his Malibu shop. In those days, surfboards were made of balsa wood. I remember a 15-year-old gremlin, Mickey Munoz, hanging around the shop.

Surfrider Beach, with its easy-to-make perfect waves, pic I shot in 2012

The weather was warmer. The water was warmer. The girls were friendlier. Things were more relaxed, as they are the farther south you get — anywhere, for that matter — I figure Santa Cruz is about 10% LA, Santa Barbara 75% LA.

Two of my other best friends were also Angelenos: Spike Bullis in the ‘50s, and Bob Easton in the ‘60s and ‘70s. (Bob and I combined our Northern and Southern California design sensibilities in producing the book Shelter in 1973.)

My first serious girlfriend (Sandy) was in LA. The first ride I got on a surfboard was at the Malibu colony (11-foot balsa and redwood that weighed like 60 pounds). I first heard the Black singing groups on LA radio: The Clovers, The Robins, Clyde McPhatter and The Drifters, The Flamingos, The Platters, The Contours…

My M.O. in the dozens of times I’ve visited LA since has been to head to the Malibu colony, then go north on Malibu Road to the end, where there’s an isolated beach. I’d typically get there at dawn and go for a swim and, when I was a competitive runner, run along the beach the 3 miles to the pier at Surfrider Beach and back. A great way to start the day in LA.

Sometimes I’d go up Corral Canyon, just north of Malibu, then to the parking lot in Solstice Canyon, and walk up the creek to a waterfall. Pretty amazing, just a few miles from the coast highway and you were in a lush, green, tropical arroyo.

LA is visual

The signs are bigger, the public art more outrageous. Once I saw a Karmann Ghia on the freeway that had been dented in the front; instead of repairing it, the owner had painted lips around the dent, so it looked like a fish puckering — LA.

In Venice — LA graffiti artists have just GOT it.

Big, Bold Signs in LA:

Some of the Things I’ll Miss:

Above two pics: The Reel Inn Fresh Fish Market in Malibu. Great seafood, draft beer, sit outside, surfer hangout; 2 fish tacos, shrimp cocktail, dark draft, $17 in 2012
The Topanga Ranch Motel (adjacent to The Reel Inn — see above) was built in the ‘20s and afforded low-cost seaside accommodation for decades until it closed in 2004. State Parks owns the property, and word was that they were going to fix up the little cabins. Gone forever…
Beachside Malibu house
There ARE good architects here and there.

I’m posting these few photos of the LA/Malibu I knew. You can see the present-day fiery nightmare headlining in all the media right now. It’ll never be the same, or even close to it — not in my lifetime or ever.

With all the natural disasters going on worldwide, I can’t help but think that the planet, a living, breathing, arguably conscious entity, has taken enough abuse and is striking back at the perpetrators of its ongoing destruction.

Hasta la vista, baby.

In Praise of California

One of the unwritten rules of American politics is that it’s OK to sneer at and smear our big cities and the people who live in them, while it’s an outrageous act of disrespect to suggest that there’s anything wrong with the Heartland. And many people believe the smears; visitors to New York are often shocked to find that one of the safest places in America isn’t the hellscape they were told to expect.

These delusions of dystopia are sometimes funny, but they can have real consequences. As you read this, much of America’s second-largest city is an actual hellscape. But many politicians, from the president-elect on down, are showing zero sympathy, insisting that California — which in its own way gets trash-talked as much as New York — somehow brought this disaster on itself by being too liberal, too woke, or something. And this lack of sympathy may translate into refusal to provide adequate disaster aid.

Somehow I doubt that Florida will get the same treatment when (not if) it has its next big natural disaster. (The Biden administration responded with complete, unconditional support to regions hit by Hurricane Helene and other storms, although that hasn’t stopped Republican politicians, like Governor Bill Lee of Tennessee, from lying and claiming that aid was delayed.)

At a fundamental level the case for helping California get through this is moral: Americans should help Americans in their hour of need. But this also seems like a good time to remind people just how much the Golden State contributes to American greatness.

Before I get there: Yes, California has problems, some of them big. There are pockets of social disorder, although the fact that so many luxury homes are burning tells us that many people who could live anywhere find greater Los Angeles a highly desirable place to be. More important, California suffers terribly from NIMBYism, which has led to grossly inadequate home construction, crippling housing costs and a lot of homelessness.

But California is nonetheless an economic and technological powerhouse; without it America would be a lot poorer and weaker than it is.

Most narrowly, at a time when Donald Trump is making nonsensical claims that America is subsidizing Canada via our bilateral trade deficit, California is literally subsidizing the rest of the United States, red states in particular, through the federal budget.

The Rockefeller Institute regularly calculates states’ balance of payments — the difference between the amount the federal government spends in a state and the amount the state pays in federal taxes. Here’s what per capita balances looked like in 2022, the most recent year available (blue means a state receives more than it gives, orange the reverse):

California paid in a lot more than it got back — $83 billion in total. So did Washington state and much of the Northeast. Most red states were in the reverse position, getting much more from DC than they paid in return. And yes, it’s ironic that states that are so dependent on transfers from other states — if West Virginia were a country, it would in effect be receiving foreign aid equal to more than 20 percent of its GDP — vote overwhelmingly for politicians trying to eviscerate the programs they depend on.

Even some Republicans have noticed how blue states subsidize red states — here’s a New York Republican lashing out at South Carolina.

Now, for the most part this cross-subsidization doesn’t reflect political favoritism. Remember, the federal government is an insurance company with an army, and while military spending has some regional tilt, health and retirement spending per capita across most states is roughly the same. I’m not going to redo the numbers, but here’s a chart I made a few years back, with 2016 data. It shows that the amount of federal spending per capita in a state is almost unrelated to the state’s income, but federal receipts are much higher in richer states, so rich states subsidize poorer states:

California, in particular, pays a lot in federal taxes because it’s so much richer and more productive than most of the rest of America. Here’s real GDP per capita in 2023 for selected states and groupings of states:

I included Ohio because on Friday an Ohio congressman declared that California shouldn’t receive disaster relief until it changes its forestry management (are there forests in Los Angeles?). He probably doesn’t know that Ohio is, in effect, heavily subsidized, year after year, by California.

High productivity in California (and New York, also included) plays a significant role in making America richer; the nation excluding these powerhouses would have about 6 percent lower GDP per capita.

California makes an especially large contribution to U.S. technological dominance. As I noted a month ago, 8 of America’s top 9 technology companies — all of them if you count pre-Cybertruck Tesla — are based either in Silicon Valley or in Seattle.

And while Hollywood doesn’t dominate films and TV the way it once did, Los Angeles still plays a major role in America’s cultural influence (and still generates a lot of income).

So how should we think about the disaster in Los Angeles? As far as I can tell, there’s nothing either the city or the state could have done to prevent it. There’s a good case to be made that we should never have allowed a huge metropolitan area to emerge in a place that was vulnerable to Santa Ana-fed firestorms even before climate change vastly increased the risks. And of course we should have begun acting to limit climate change decades ago.

But this is all hindsight, with no relevance to where we are now — which is that an American city and an American state desperately need all the help we can deliver. It shouldn’t matter whether they’ve earned it. If the United States of America doesn’t take care of its own citizens, wherever they live and whatever their politics, we should drop “United” from our name. As it happens, however, California — a major driver of U.S. prosperity and power — definitely has earned the right to receive help during a crisis.

Unfortunately, it looks all too possible that essential aid will be held up or come with onerous strings attached. If so, shame on everyone responsible.

MUSICAL CODA

So many choices, from the Mamas & the Papas to the Eagles and beyond. But here’s something more contemporary.

Tuesday: PPI

Mortgage Rates From Matthew Graham at Mortgage News Daily: Mortgage Rates Slightly Higher to Start New Week
The more important consideration is the new round of potential volatility on the horizon. Whereas it was the jobs report last week, this week's critical data will be Wednesday's Consumer Price Index (CPI). Tomorrow's inflation data (the Producer Price Index) is not quite as important, but is nonetheless capable of causing a reaction.

If inflation comes in higher than expected, it could easily push rates even higher. [30 year fixed 7.26%]
emphasis added
Tuesday:
• At 6:00 AM ET, NFIB Small Business Optimism Index for December.

• At 8:30 AM, The Producer Price Index for December from the BLS. The consensus is for a 0.4% increase in PPI, and a 0.3% increase in core PPI.

Monday 13 January 1661/62

All the morning at home, and Mr. Berkenshaw (whom I have not seen a great while, came to see me), who staid with me a great while talking of musique, and I am resolved to begin to learn of him to compose, and to begin to-morrow, he giving of me so great hopes that I shall soon do it.

Before twelve o’clock comes, by appointment, Mr. Peter and the Dean, and Collonel Honiwood, brothers, to dine with me; but so soon that I was troubled at it. But, however, I entertained them with talk and oysters till one o’clock, and then we sat down to dinner, not staying for my uncle and aunt Wight, at which I was troubled, but they came by and by, and so we dined very merry, at least I seemed so, but the dinner does not please me, and less the Dean and Collonel, whom I found to be pitiful sorry gentlemen, though good-natured, but Mr. Peter above them both, who after dinner did show us the experiment (which I had heard talk of) of the chymicall glasses, which break all to dust by breaking off a little small end; which is a great mystery to me. They being gone, my aunt Wight and my wife and I to cards, she teaching of us how to play at gleeke, which is a pretty game; but I have not my head so free as to be troubled with it. By and by comes my uncle Wight back, and so to supper and talk, and then again to cards, when my wife and I beat them two games and they us one, and so good night and to bed.

Read the annotations

Empowering Communities Through Advanced Health Education

Health education serves as a powerful tool for shaping resilient communities that can tackle pressing public health challenges effectively. In an era marked by increasing health disparities, emerging diseases, and the demand for accessible healthcare, fostering health literacy is more important than ever.

Advanced health education plays a transformative role by equipping individuals with the skills and knowledge needed to address systemic issues, promote preventative care, and improve health outcomes.

This article explores how health education empowers communities, emphasizing its role in addressing social determinants, building resilience, and advancing preventative healthcare practices.

The Role of Specialized Programs in Public Health Education

Advanced health education programs provide critical pathways for individuals to develop the skills necessary to make meaningful contributions to their communities. For example, Master of Public Health programs online are designed to offer comprehensive training in areas such as epidemiology, health policy, and biostatistics.

These programs not only bridge the gap between academic learning and practical application but also make advanced education accessible to individuals in remote or underserved regions.

Graduates of such programs often emerge as leaders in public health initiatives, addressing local challenges with tailored solutions. For instance, they may use their knowledge to advocate for policies that improve health equity or develop community-based strategies to combat chronic diseases.

By fostering a deeper understanding of public health principles, these programs empower professionals to create long-term, sustainable improvements in health outcomes.

Strengthening Community Engagement Through Education

Effective community engagement is a cornerstone of public health success. Health education empowers individuals to become active participants in their community’s well-being, creating a ripple effect that extends far beyond individual actions. When residents are equipped with accurate health information, they are more likely to advocate for initiatives that promote wellness and prevent disease.

For example, local workshops and seminars can educate residents about topics like nutrition, mental health, and vaccination. These sessions create a sense of shared responsibility and encourage community-wide participation in health campaigns. Educated individuals can also act as liaisons between healthcare providers and their communities, ensuring that public health messages are culturally relevant and accessible.

Additionally, health education fosters collaboration among various sectors, such as schools, workplaces, and local governments. This interconnected approach strengthens the social fabric, enabling communities to address challenges more effectively. By promoting a culture of health and wellness, education empowers individuals to take ownership of their community’s future, creating a foundation for sustainable development.

Addressing Social Determinants of Health

The social determinants of health, such as income, education, and housing, play a significant role in shaping health outcomes. Advanced health education provides communities with the tools and strategies needed to address these underlying factors. By focusing on prevention and systemic change, health education becomes a powerful mechanism for reducing disparities and promoting equity.

For instance, public health professionals trained through advanced programs can implement initiatives to combat food insecurity, such as establishing community gardens or organizing healthy food drives. Similarly, education equips leaders to advocate for policies that improve access to affordable housing and healthcare services. By addressing these root causes, communities can create environments that support health and well-being for all residents.

Advancing Preventative Healthcare Practices

Preventative healthcare is a vital component of community well-being, and education plays a central role in its promotion. By emphasizing the importance of regular screenings, vaccinations, and early intervention, health education helps individuals take proactive steps to safeguard their health.

Educated community members are better equipped to recognize early warning signs of chronic diseases, seek timely medical care, and adopt healthier lifestyles. For example, public health campaigns can highlight the benefits of regular exercise, balanced diets, and stress management. These initiatives not only improve individual health but also reduce the overall burden on healthcare systems.

Preventative practices also extend to addressing environmental factors that contribute to poor health. Education empowers communities to advocate for cleaner air, safer water, and healthier living conditions. By fostering a culture of prevention, health education creates long-lasting benefits that enhance quality of life for current and future generations.

Building Resilience in Public Health Crises

The ability to respond effectively to public health crises is a testament to the strength of a community’s health education system. Advanced education equips individuals with the skills and knowledge needed to navigate emergencies, ensuring that communities remain resilient in the face of adversity.

The COVID-19 pandemic highlighted the importance of having trained public health professionals who can manage resources, coordinate communication efforts, and implement safety measures. Health education programs prepare individuals to take on these roles, enabling communities to respond swiftly and effectively to crises.

Beyond immediate responses, education also plays a role in long-term recovery and preparedness. By analyzing data, evaluating outcomes, and sharing lessons learned, public health professionals can refine strategies for future emergencies. This continuous cycle of learning and adaptation strengthens community resilience, ensuring that communities are better prepared for whatever challenges may arise.

The Future of Community Health Education

As technology continues to evolve, the future of community health education holds exciting possibilities. Digital platforms, virtual simulations, and data analytics tools are transforming the way individuals learn and apply public health concepts. These innovations make education more interactive, accessible, and effective, reaching learners in even the most remote areas.

Culturally responsive teaching methods are also gaining prominence, ensuring that health education resonates with diverse communities. By incorporating local traditions, languages, and values, these approaches foster greater understanding and engagement. This inclusivity strengthens the impact of health education, creating a unified approach to public health challenges.

Looking ahead, the integration of public health education into broader societal frameworks—such as schools, workplaces, and local governments—will further amplify its impact. By prioritizing health education as a community-wide initiative, societies can build a future where every individual has the knowledge and resources needed to lead a healthy life.

All in all, empowering communities through advanced health education is a powerful strategy for addressing today’s public health challenges. From promoting preventative care to addressing social determinants, education fosters resilience, equity, and well-being.

As communities continue to embrace the transformative power of health education, they pave the way for a healthier, more equitable future.


CLICK HERE TO DONATE IN SUPPORT OF DCREPORT’S NONPROFIT MISSION

The post Empowering Communities Through Advanced Health Education appeared first on DCReport.org.

Here’s How to Boost Your Brand’s Online Visibility

In today’s digital-dominated world, having a strong online presence is indispensable for brands aiming to secure a competitive edge. Visuals are a critical element of this presence, as they capture attention and communicate messages more efficiently than text alone. By integrating high-quality images, brands can significantly improve their digital interfaces, making them more engaging and appealing to potential customers. This strategy boosts aesthetics and supports brand messaging and online visibility, helping to create a memorable online experience that can attract and retain customers.

Here’s how you can get started:

Understand Your Target Audience

Understanding your target audience is foundational to any effective marketing strategy. This involves more than just knowing their age or location. It’s about grasping their behaviors, preferences, and motivations. Brands should utilize analytics tools to gather data on their audience’s online activities, preferences, and engagement patterns. This data can guide the creation of persona-driven content strategies, which tailor messages and visuals to meet the specific needs and desires of different customer segments, thereby increasing the relevance and impact of your marketing efforts.

Improve Your Visual Content

The power of visual content in digital marketing cannot be overstated. Engaging images capture attention and facilitate emotional connections with the audience. Many online platforms offer a wide range of high-quality stock photos that can dramatically improve the visual appeal of your digital assets. Utilizing such resources allows brands to maintain a visually appealing online presence without the substantial costs associated with producing original photography. Moreover, these platforms often allow you to purchase stock photos optimized for various digital formats and uses, ensuring your brand maintains a professional appearance across all platforms. Engaging visuals draw attention and encourage sharing, which can significantly extend the reach and visibility of your brand.

Build a Responsive Website

Today, a responsive website is a must for any brand that wants to remain viable in the digital space. A website that adapts fluidly across all devices ensures that all visitors have a positive experience, regardless of how they access your site. This adaptability improves user satisfaction and aids in SEO, as search engines now use mobile-friendliness as a ranking factor. Ensuring that your website is easy to navigate and responsive can increase the time visitors spend on your site and reduce bounce rates, both of which are favorable signals to search engines.

Leverage Social Media Platforms

Social media offers unparalleled opportunities for brands to build visibility and engage directly with consumers. Each platform serves different purposes and reaches distinct audiences. For example, Instagram is highly visual, making it ideal for brands that want to showcase products through images and videos. Twitter, on the other hand, is great for real-time communication and can be particularly effective for customer service and engaging in industry conversations. Regular, strategic posting on chosen platforms keeps your brand top of mind and fosters engagement. Also, utilizing tools for scheduling posts, monitoring engagement, and analyzing performance can help optimize your social media strategy for better results.

Optimize for Search Engines (SEO)

Effective SEO is critical for improving a brand’s online visibility. Beyond basic keyword integration, SEO involves structuring informative content relevant to your audience’s interests. This includes creating comprehensive, quality content that addresses the needs and questions of potential customers. SEO practices also extend to optimizing technical elements such as site architecture and URL structure, which help search engines crawl and index your site more effectively. Furthermore, building a network of backlinks from credible sites boosts your site’s authority and drives traffic, improving SEO and direct audience engagement.

Use Video Marketing to Connect with Your Audience

Video marketing is an essential tool for improving engagement and increasing brand visibility. Videos are highly engaging and can convey your message more dynamically and memorably than text-based content. From explainer videos and product demonstrations to customer testimonials and behind-the-scenes looks, video content can provide a richer, more interactive user experience. Platforms like YouTube also improve SEO and increase the reach of your content through visual searches. Brands should consider live streaming events or Q&A sessions to interact directly with the audience, adding a personal touch that can significantly boost brand loyalty and consumer connection.

Reach Out To Influencers in Your Niche

Influencer marketing can be a powerful strategy for amplifying your brand’s reach and credibility. Collaborating with influencers who align with your brand values and resonate with your target audience can allow you to tap into their established followings and gain trust rapidly. It’s crucial to choose influencers thoughtfully, focusing on those whose audiences overlap with your own target demographics. Collaborations can range from sponsored posts and endorsements to guest appearances on blogs or podcasts. Such partnerships can drive traffic, increase brand awareness, and generate social proof that encourages others to engage with your brand.

Monitor Analytics and Adjust Strategies

Data is a goldmine for optimizing your digital marketing strategies. Regularly monitoring your website and social media analytics provides insights into what works and what doesn’t. Online tools and social media analytics platforms allow you to track visitor behavior, engagement rates, and conversion metrics. Analyzing this data helps you refine your marketing tactics and better allocate your resources to the most effective strategies. Continuously testing and tweaking your approaches based on real-time data can significantly improve the effectiveness of your marketing efforts.

Invest in Paid Advertising

Paid advertising is a viable method to increase brand visibility quickly. Online platforms like Facebook Ads also offer targeted advertising options that can be customized to reach specific audiences based on demographics, behaviors, and interests. Well-crafted ads with compelling calls-to-action can drive substantial traffic to your site, generate leads, and increase conversions. It’s important to start with a clear objective and a set budget, using A/B testing to identify the most effective ad formats and messages. This strategic approach ensures optimal ROI from your advertising spend.

Improving your brand’s online visibility is a multifaceted process that requires a combination of strategies, from optimizing your website for search engines to engaging with your audience through social media and video content. By understanding your audience, leveraging powerful visuals, and utilizing data-driven insights to refine your approach, you can significantly boost your digital presence. Remember, the digital world is ever-evolving, so staying adaptable and informed about the latest trends and technologies is key to maintaining and growing your online visibility. Start implementing these strategies today, and watch as your brand reaches new heights in the digital realm.



The post Here’s How to Boost Your Brand’s Online Visibility appeared first on DCReport.org.

Las Vegas November 2024: Visitor Traffic Up 0.6% YoY; Convention Traffic Down 8.4%

From the Las Vegas Visitor Authority: November 2024 Las Vegas Visitor Statistics
Punctuated by the second annual F1 Las Vegas Grand Prix and the SEMA tradeshow, Las Vegas hosted more than 3.3M visitors in November, slightly up over last year (+0.6% YoY).

With a net decrease in churn of mid-size and smaller meetings vs. last year, convention attendance was 548k for the month, down -8.4% YoY.

November saw higher Weekend occupancy vs. last year (89.1%, up 0.4 pts) but lower Midweek occupancy (76.9%, down -2.0 pts) as overall Hotel occupancy for the month reached 81.4%, down -0.5 pts. While down compared to the record-shattering levels tied to last year's inaugural F1 race, monthly ADR this year saw the second highest on record for the month of November, reaching $199 (-20.3% YoY) while RevPAR came in at approx. $162 (-20.8% YoY).
emphasis added
Las Vegas Visitor Traffic: Click on graph for larger image.

The first graph shows visitor traffic for 2019 (Black), 2020 (dark blue), 2021 (light blue), 2022 (light orange), 2023 (dark orange) and 2024 (red).

Visitor traffic was up 0.6% compared to last November.  Visitor traffic was down 3.2% compared to November 2019.

Year-to-date visitor traffic is down 5.6% compared to 2019.

The second graph shows convention traffic.

Las Vegas Convention Traffic
Convention traffic was down 8.4% compared to November 2023, and down 9.1% compared to November 2019.  

Year-to-date convention traffic is down 9.2% compared to 2019.
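As a quick sanity check on the LVCVA figures quoted above, RevPAR (revenue per available room) is approximately ADR multiplied by the occupancy rate. A minimal sketch of that relationship, using the November numbers from the release (this is the conventional definition, not necessarily LVCVA's exact methodology):

```python
# RevPAR is approximately ADR x occupancy rate.
# November figures from the LVCVA release quoted above.
adr = 199.0        # average daily rate, dollars
occupancy = 0.814  # overall hotel occupancy, 81.4%

revpar = adr * occupancy
print(f"RevPAR ~= ${revpar:.0f}")  # close to the reported ~$162
```

The two reported numbers are consistent: $199 x 0.814 comes out to roughly $162.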

Principles, from Nabeel

14. You don’t do anyone any favors by lurking, put yourself out there!

15. If you don’t “get” a classic book or movie, 90% of the time it’s your fault. (It might just not be the right time for you to appreciate that thing.)

16. If you find yourself dreading Mondays, quit…

23. Doing things is energizing, wasting time is depressing. You don’t need that much ‘rest’.

24. Being able to travel is one of the key ways the modern world is better than the old world. Learn to travel well.

Here is the full list.

The post Principles, from Nabeel appeared first on Marginal REVOLUTION.

       


 

Radon

A good ²³⁸Umbrella policy should cover it.

Unoffice 1: what a joy

I did my first unoffice hours meeting today. An absolute pleasure. What a lovely way to start the week. (As Matt has said).

If you fancy a 30 min online chat, about whatever you like, book yourself a slot.

Iran is vulnerable to a Trumpian all-out economic assault

Oil prices are already at a five-month high

Monday assorted links

1. Cairo norms and multiple equilibria?

2. Straussian, clever language though not as sharp as the Everly Brothers.  I wish them the best.

3. The declining influence of economists in policy circles (NYT).

4. ChinaTalk on DeepSeek.

5. Weird things going on in Romania, poorly understood.

6. One good way to prompt o1.

7. The new AI plan from the UK.

The post Monday assorted links appeared first on Marginal REVOLUTION.

       


 

Microsoft Takes Legal Action Against AI “Hacking as a Service” Scheme

Not sure this will matter in the end, but it’s a positive move:

Microsoft is accusing three individuals of running a “hacking-as-a-service” scheme that was designed to allow the creation of harmful and illicit content using the company’s platform for AI-generated content.

The foreign-based defendants developed tools specifically designed to bypass safety guardrails Microsoft has erected to prevent the creation of harmful content through its generative AI services, said Steven Masada, the assistant general counsel for Microsoft’s Digital Crimes Unit. They then compromised the legitimate accounts of paying customers. They combined those two things to create a fee-based platform people could use.

It was a sophisticated scheme:

The service contained a proxy server that relayed traffic between its customers and the servers providing Microsoft’s AI services, the suit alleged. Among other things, the proxy service used undocumented Microsoft network application programming interfaces (APIs) to communicate with the company’s Azure computers. The resulting requests were designed to mimic legitimate Azure OpenAI Service API requests and used compromised API keys to authenticate them.

Slashdot thread.

2nd Look at Local Housing Markets in December

Today, in the Calculated Risk Real Estate Newsletter: 2nd Look at Local Housing Markets in December

A brief excerpt:
NOTE: The tables for active listings, new listings and closed sales all include a comparison to December 2019 for each local market (some 2019 data is not available).

This is the second look at several early reporting local markets in December. I’m tracking over 40 local housing markets in the US. Some of the 40 markets are states, and some are metropolitan areas. I’ll update these tables throughout the month as additional data is released.

Closed sales in December were mostly for contracts signed in October and November when 30-year mortgage rates averaged 6.43% and 6.81%, respectively (Freddie Mac PMMS). This was an increase from the average rate for homes that closed in November, but down from the average rate of 7.5% in October and November 2023.
...
Closed Existing Home Sales: Here is a look at months-of-supply using NSA sales. Since this is NSA data, it is likely this will be the seasonal low for months-of-supply.

Note the regional differences with more months-of-supply in the South, especially in Florida and Texas.
...
Many more local markets to come!
There is much more in the article.
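The months-of-supply metric mentioned in the excerpt is conventionally defined as active inventory divided by the monthly sales pace. A minimal sketch with made-up numbers (these are not figures from the newsletter):

```python
def months_of_supply(active_inventory: float, monthly_closed_sales: float) -> float:
    """Months needed to sell the current inventory at the current sales pace."""
    return active_inventory / monthly_closed_sales

# Hypothetical market: 9,000 active listings, 3,000 closings per month.
print(months_of_supply(9_000, 3_000))  # 3.0
```

Roughly six months of supply is often cited as a balanced market; lower readings favor sellers, higher readings favor buyers.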

Sonos Canned CEO Patrick Spence, Who Spearheaded Disastrous App Launch

Chris Welch, The Verge:

Sonos CEO Patrick Spence is resigning from the job today, effective immediately, with board member Tom Conrad filling the role of interim CEO. It’s the most dramatic development yet in an eight-month saga that has proven to be the most challenging time in Sonos’ history.

The company’s decision to prematurely release a buggy, completely overhauled new app back in May — with crucial features missing at launch — outraged customers and kicked off a monthslong domino effect that included layoffs, a sharp decline in employee morale, and a public apology tour. The Sonos Ace headphones, rumored to be the whole reason behind the hurried app, were immediately overshadowed by the controversy, and my sources tell me that sales numbers remain dismal. Sonos’ community forums and subreddit have been dominated by complaints and an overwhelmingly negative sentiment since the spring. [...]

But three months later, Sonos’ board of directors and Spence have concluded that those steps weren’t enough: the app debacle has officially cost Spence his job. No other changes are being made today, however. So for now, chief product officer Maxime Bouvat-Merlin, who some employees have privately told me deserves a fair share of the blame for recent missteps, will remain in his role.

If they don’t fire that rube too, Sonos is likely continuing down the path to irrelevance and bankruptcy that Spence started them down. The problem wasn’t a bad or ill-considered app rewrite. The bad app rewrite was a symptom of leadership with no appreciation for product and experience design, when Sonos’s entire raison d’être is to deliver a superior product and acoustic experience. Their customer demographic is people with great taste and high standards. Sonos is basically Apple, but just for audio, in a market where Apple itself is a major player. Yet somehow the company wound up being run by a leadership team with no taste.

Here’s a surprise tidbit that gives me hope Conrad might be the right man for the job:

Conrad’s career includes a 10-year tenure as chief technology officer at Pandora and two years as VP of product at Snapchat. He worked on Apple’s Finder software during the ’90s. Most recently, Conrad served as chief product officer for the ill-fated Quibi streaming service.

(To be clear, I’m talking about the ’90s Finder part, not the Quibi part. The classic Finder was one of the all-time best apps ever made.)

 ★ 

Professional Democrats Do Need to Fix the Things They Can Control

When it comes to something like an election, much of the outcome is determined by events and forces that are beyond a candidate’s or party’s control. That said, professional Democrats do have agency and desperately need to make changes in the things they do control (boldface mine):

The party lacks any sort of cohesive information architecture or strategy and leaders of the party holding elected office are lackadaisical about communicating with the public – and when they do bother to speak out, the language and methods are archaic most of the time. Even if the media and social environment were perfect, the party would have problems communicating in this mess, so it’s even worse since the environment is a toxic garbage dump.

When they’re not communicating competently the party is also governing and legislating incompetently. Democrats may be America’s liberal party, but they are beholden to a conservative approach. The party is uninterested in proposing legislation designed to advance its ideological goals, preferring instead to only back legislation that has a realistic chance to pass, rather than backing bills designed to fail which would elicit friction with the other side – activating support from voters. At the same time, the party legislates as if bipartisanship is the end goal, rather than providing policies and services to help people. And then it stinks at communicating the benefits of those policies to people.

This was the party that Biden then Harris led, and it had these same problems – or some variation of them – under Presidents Barack Obama and Bill Clinton, as well as the eras when it was out of the White House and in the minority.

Acknowledging these problems and working to fix them is not undermining the party. America is (for now at least) a two-party country and it needs a competent Democratic Party to hold off the menace within the Republican Party.

It’s impossible to control everything, but professional Democrats could end the bad behavior Willis identifies–if they so chose. So far, they haven’t.

Then again, I’m old enough to remember when rank-and-file Democrats had to fight our own party to get them to protect Social Security which is the signature safety net program of the Democratic Party. It gets tiresome after a while.

At some point, even diehards are going to start wondering if their money and time (voting is pretty easy in many, not all, places) are better spent on something other than the same old Democrats.

Are private firefighters repugnant (in Los Angeles)?

 The SF Chronicle has the story:

Private firefighters protected a Hollywood talent manager’s home. Why are some people so mad?
By Matthias Gafni, Susie Neilson

"Leber is one of a growing number of Californians who, faced with the growing threat of wildfires in populated areas, have turned to private firefighting teams as an added layer of protection. Supporters of private firefighting teams argue they can augment the work of government-run efforts, stepping in to fill the cracks caused by depleted city and state budgets and an ever-worsening climate crisis.

"But not everyone is a fan of private firefighters, particularly those that contract directly with homeowners outside of insurance, like the company Leber hired. Critics contend that when wealthy individuals hire their own firefighters, they compete with public teams for precious resources such as water, and could potentially interfere with those teams’ efforts by, for example, blocking or crowding narrow access points.

"Moreover, they say, private firefighters widen the already-vast chasm between rich and poor, safeguarding the interests of the former at the expense of the latter.

"“The rich suffer zero consequences of anything, even cataclysmic natural disasters,” one user wrote on X, responding to a video the Chronicle posted showing private firefighters saving Leber’s house. “Private and firefighter should not be in the same sentence,” wrote another.

Joe Torres, CEO of All Risk Shield, thinks some of these criticisms are unfair — especially during major disasters like this one.

...

"He disputed the idea that private teams siphon water away from the public: His teams primarily bring their own to a site, or draw from homeowners’ pools."

Housing Jan 13th Weekly Update: Inventory down 1.7% Week-over-week, Up 23.6% Year-over-year

Altos reports that active single-family inventory was down 1.7% week-over-week.

Inventory will continue to decline seasonally and probably bottom in late January or February.  

The first graph shows the seasonal pattern for active single-family inventory since 2015.

Altos Year-over-year Home Inventory: Click on graph for larger image.

The red line is for 2025.  The black line is for 2019.  

Inventory was up 23.6% compared to the same week in 2024 (last week it was up 27.3%), and down 23.3% compared to the same week in 2019 (last week it was down 22.2%). 

Back in June 2023, inventory was down almost 54% compared to 2019, so the gap to more normal inventory levels has closed significantly!

Altos Home Inventory: This second inventory graph is courtesy of Altos Research.

As of Jan 10th, inventory was at 624 thousand (7-day average), compared to 635 thousand the prior week. 
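The headline week-over-week figure can be reproduced from the two 7-day averages quoted above:

```python
# Week-over-week change = (current - prior) / prior.
current_inventory = 624  # thousands, week of Jan 10th
prior_inventory = 635    # thousands, prior week

wow_change = (current_inventory - prior_inventory) / prior_inventory
print(f"{wow_change:+.1%}")  # about -1.7%, matching the headline
```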

Mike Simonsen discusses this data regularly on YouTube.

Canada’s Comparative Advantages

Jay Martin writes a provocative post, Has Canada Become a Jamaican Bobsled Team?

if Canada were to become a state, it would be the third poorest in the country, right behind Alabama.

Everybody talks about the American debt issue, but Canadian households bear more debt relative to their income than any other G7 country. The average Canadian now spends 15% of their income on debt servicing.

This is a stark shift from 2008 when Canada emerged from the global financial crisis with a healthier balance sheet than any other G7 nation.

One indicator I pay close attention to is corporate investment per worker.

Every year, businesses invest in growth – new technology, new projects, new employees or products. If you take the total number that businesses invest during a calendar year, and divide that by the number of active workers in the country, you get the corporate investment per worker.

In the U.S., businesses invest about $28,000 per worker. In Canada, that number is only $15,000—nearly half.

Corporate investment is what drives future productivity, economic growth, and opportunity. The higher the number, the brighter the future.

What is Canada’s comparative advantage?

…we have vast natural resources, we have easy-to-navigate geography, we have the world’s longest coastline that spans three oceans – allowing direct access to every global market, and the largest shared international land border, on the other side of which is the world’s wealthiest, hungriest customer.

We have product. And we have a direct line to the consumer.

We are not capitalizing on these advantages because we have been sold a narrative discouraging investment in the industries where we outperform the world.

…The narrative that Canada should abandon its resource sector to pursue conceptual industries like hydrogen power or electric vehicle production is both misguided and damaging. These are fields where Canada has little experience or infrastructure, we are not competitive, and the evidence is in our economic data.

I would add one point. The issue shouldn’t be framed as extracting natural resources versus high-tech investment, as if mining, oil and gas, lumber and agriculture were simple brute-labor industries. In fact, there is plenty of room for artificial intelligence to dramatically increase the rate at which profitable mines are discovered. Industrial robotics and automation are the future of mining. Agriculture is a high-tech industry, from genetic engineering to robotic laser weeding to satellite-based crop monitoring.

Indeed, Canada’s best chance to stay at the forefront of technology lies in exploiting its comparative advantages.
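Martin's metric is straightforward to reproduce: total annual business investment divided by the number of active workers. A minimal sketch (the dollar totals below are illustrative placeholders; only the per-worker figures come from the post):

```python
def investment_per_worker(total_investment: float, active_workers: float) -> float:
    """Annual corporate investment divided by the number of active workers."""
    return total_investment / active_workers

# Hypothetical totals: $2.8M invested across 100 workers -> $28,000 per worker.
print(investment_per_worker(2_800_000, 100))  # 28000.0

# Comparing the per-worker figures quoted in the post:
ratio = 15_000 / 28_000
print(f"Canada invests about {ratio:.0%} of the US level per worker")  # roughly 54%
```

The ratio confirms the post's "nearly half" characterization.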

The post Canada’s Comparative Advantages appeared first on Marginal REVOLUTION.

       


 

Although it’s “insane” to try to land New Glenn, Bezos says it’s important to try

MERRITT ISLAND, Fla.—Understandably, the main building of Blue Origin's sprawling campus in Florida buzzed with activity on Sunday evening as the final hours ticked down toward the company's historic first orbital launch. The time had come to celebrate a moment long-awaited.

On one side of the large foyer hung a multi-story print of the New Glenn rocket lit up on its launch pad. The striking image had been taken a day after Christmas and put up in the lobby two days earlier. On the other side, a massive replica of the company's "Mk. 1" lunar lander towered over caterers bustling through.

My escort and I took the elevators to the upper floor, where a walkway overlooks the factory where Blue Origin builds the first and second stages of its New Glenn rocket. There I met the chief executive of the company, Dave Limp, as well as the person responsible for all of this activity.

Read full article


An icy vent line may have caused Blue Origin to scrub debut launch of New Glenn

COCOA BEACH, Fla.—With 45 minutes left in a three-hour launch window, Blue Origin scrubbed its first attempt to launch the massive New Glenn rocket early on Monday morning.

Throughout the window, which opened at 1 am ET (06:00 UTC), the company continued to reset the countdown clock as launch engineers worked out technical issues with the rocket.

Officially, both on its live webcast as well as on social media following the scrub, Blue Origin was vague about the cause of the delayed launch attempt.

Read full article


Moral resilience


Nurses experience deep suffering when they can’t act according to their moral compass. Our research shows a way forward

- by Cynda Hylton Rushton

Read at Aeon

Chinese sea launch sends 10 navigation enhancement satellites into orbit

Sea-based launch of a Jielong-3 (Smart Dragon-3) rocket on January 13, 2025, from the Haiyang Eastern Spaceport. The rocket ascends into a clear sky, leaving a trail of flame and smoke as it lifts off from a converted platform at sea, with the support vessel visible on the right.

China launched a Jielong-3 solid rocket from a mobile sea platform late Sunday, successfully placing 10 Centispace navigation enhancement satellites into orbit.

The post Chinese sea launch sends 10 navigation enhancement satellites into orbit appeared first on SpaceNews.

Blue Origin scrubs first New Glenn launch attempt


Blue Origin called off the first attempt to launch its New Glenn rocket Jan. 13 because of an unspecified technical issue.

The post Blue Origin scrubs first New Glenn launch attempt appeared first on SpaceNews.

New interim leaders for NASA astrophysics and planetary science


Retirements and reassignments have led to a reshuffling of leadership in part of NASA’s Science Mission Directorate.

The post New interim leaders for NASA astrophysics and planetary science appeared first on SpaceNews.

Why vanity could be a good thing


Jean-Jacques Rousseau and Adam Smith agreed that vanity was all too human. But one saw it as a vice; the other, as a necessity

- by Aeon Video

Watch at Aeon

SpaceX launches 21 Starlink satellites on Falcon 9 rocket from Cape Canaveral

A SpaceX Falcon 9 rocket lifts off from Space Launch Complex 40 (SLC-40) at Cape Canaveral Space Force Station to begin the Starlink 12-4 mission on Jan. 13, 2025. Image: SpaceX

Update 2:37 p.m. EST (1937 UTC): SpaceX confirmed deployment of the Starlink satellites.

SpaceX kicked off a busy launch week that features flights from all four of its launch pads between California, Florida and Texas. Assuming no launch slips, it will launch three Falcon 9 rockets and the seventh flight test of its Starship-Super Heavy rocket.

First up was the Starlink 12-4 mission, which launched from Space Launch Complex 40 (SLC-40) from Cape Canaveral Space Force Station. Liftoff happened at 11:47 a.m. EST (1647 UTC).

 

The Falcon 9 first stage booster supporting the mission, tail number B1080 in the SpaceX fleet, launched for a 15th time. It previously supported the launches of the European Space Agency’s Euclid spacecraft, four missions to the International Space Station and eight Starlink flights.

A little more than eight minutes after liftoff, B1080 landed on the droneship, ‘A Shortfall of Gravitas,’ marking the 94th booster landing on ASOG and the 396th booster landing to date.

A day after launching the Starlink 12-4 mission, SpaceX is scheduled to launch the Transporter-12 rideshare mission with dozens of satellites on board. That mission will launch from Vandenberg Space Force Base in California.

Next up, back in Florida, a dual-lunar landing mission is scheduled to launch no earlier than 1:11 a.m. EST (0611 UTC) on Wednesday, Jan. 15, from Launch Complex 39A at the Kennedy Space Center in Florida. Later that day, SpaceX is scheduled to launch the Starship Flight 7 mission, where it will attempt to once again catch the Super Heavy booster at the launch tower.

The Acemoglu arguments against high-skilled immigration

Here is Daron Acemoglu’s Project Syndicate piece, mostly critical on high-skilled immigration.

Here is the first argument from Acemoglu:

…one would expect corporate America’s growing need for skilled STEM workers to translate into advocacy for, and investments in, STEM education. But an overreliance on the H-1B program may have broken this link and made American elites indifferent to the widely recognized failures of the US education system. Put differently, the problem may not be a cultural veneration of mediocrity, as Ramaswamy argued, but rather neglect on the part of business leaders, intellectual elites, and politicians.

o1 responds.  Here is Acemoglu’s second argument:

Even as H-1B workers boost innovation, their presence may affect the direction innovation takes. My own work shows (theoretically and empirically) that when the supply of skilled labor increases, technology choices start favoring such workers. Over the last several decades, businesses have increasingly adopted technologies that favor high-skill workers and automate tasks previously performed by lower-skill workers. While this trend may have been driven by other factors, too, the availability of affordable high-skill workers for the tech industry plausibly contributed to it.

o1 pro responds.

The third argument about brain drain has enough qualifications and admissions that it isn’t really a criticism.  In any case my colleague Michael Clemens, among others, has shown that the brain drain argument applies mainly to very small countries.  But if you wish, run it through AI yourself.

If all I knew were this “exchange,” I would conclude that o1 and o1 pro were better economists — much better — than one of our most recent Nobel Laureates, and also the top cited economist of his generation.  Noah Smith also is critical.

Via Mike Doherty.

The post The Acemoglu arguments against high-skilled immigration appeared first on Marginal REVOLUTION.

       

Comments

Related Stories

 

Jupiter’s clouds are not made of ammonia ice

Astronomers have long thought that the upper clouds of Jupiter, which create the planet’s iconic pale brown belts, are made of frozen ammonia. But a new study that brought together amateur and professional astronomers has shown that these clouds are actually located lower in the atmosphere than we thought and are made of something completely different: most likely ammonium hydrosulphide mixed with smog.

Citizen scientist Steve Hill previously showed that he could map the planet’s atmosphere by using only specially coloured filters and his backyard telescope. These results provided initial clues that the clouds were too deep within Jupiter’s warm atmosphere to be consistent with clouds made of ammonia ice. To check, Hill joined forces with Patrick Irwin at Oxford University, whose team had previously used the sophisticated MUSE instrument on ESO’s Very Large Telescope (VLT) to study the atmosphere of gas giants.

MUSE is capable of scanning the atmosphere of Jupiter at different wavelengths, mapping out the different molecules that make up the planet’s atmosphere. This animated image, based on real MUSE data, shows how the gas giant looks at different wavelengths.

The new study shows that this new approach with backyard telescopes or VLT/MUSE can map the abundance of ammonia in Jupiter’s atmosphere with surprising accuracy. As for clouds, the team concluded that Jupiter’s atmosphere is much like a layered cake. Clouds of ammonium hydrosulphide cover the upper layers, but sometimes there may be a decoration of ammonia ice clouds, brought to the top by strong vertical convection. The entire cake’s structure, though, is not yet fully known, and the work of citizen scientists will be key to uncovering it. So next time you are looking at Jupiter or Saturn from your backyard, you may also be unravelling the secrets still lying within our Solar System.

The Greenland debates

I would say we have not yet figured out what is the best U.S. policy toward Greenland, nor have we figured out the best stances for either Greenland or Denmark. I am struck, however, by the low quality of the debate, and I mean on the anti-U.S. side most of all. This is just one clip, but I am hearing very much the same in a number of other interchanges, most of all from Europeans. There is a lot of EU pearl-clutching, and throwing around of adjectives like “colonialist” or “imperialist.” Or trying to buy Greenland is somehow analogized to Putin not trying to buy Ukraine. Or the word “offensive” is deployed as if that were an argument, or the person tries to switch the discussion into an attack on Trump and his rhetoric.

C’mon, people!

De facto, you are all creating the impression that Greenland really would be better off under some other arrangement.  Why not put forward a constructive plan for improving Greenland?  It would be better yet to cite a current plan under consideration (is there one?).  “We at the EU, by following this plan, will give Greenland a better economic and security future than can the United States.”  If the plan is decent, Greenland will wish to break off the talks with America it desires.  (To be clear, I do not think they desire incorporation.  This FT piece strikes me as the best so far on the debates.)

Or if you must stick to the negative, put forward some concrete arguments for how greater U.S. involvement in Greenland would be bad for global security, bad for economic growth, bad for the U.S., or…something.  “Your EU allies won’t like it,” or “Trump’s behavior is unacceptable” isn’t enough and furthermore the first of those is question-begging.

It is time to rise to the occasion.

p.s. I still am glad we bought the Danish West Indies in 1917.  Nor do I hear many Danes, or island natives, complain about this.

The post The Greenland debates appeared first on Marginal REVOLUTION.

       

Comments

Related Stories

 

Heartbreak in L.A.

As distressing as it is to watch the wildfire devastation across Los Angeles, it has been impossible not to remember that LA was our home for two decades of child-rearing and work.

That acknowledgment makes the video of multiple lost neighborhoods and uprooted lives on streets we know yet more heartbreaking, as does the understanding that it will take years to figure out what comes next for families and the city itself.

Like many, we seek out friends from the possible danger zones and check on their safety. We’ve noted that our own neighborhood in West LA has remained just beyond the fire boundaries but could as easily have been caught up in the windblown flames that have left block-after-block, tornado-proportions of emptiness. The place is so big that you can have both devastation and survival of large neighborhoods.

The current news recalls the feelings that came with driving on the 10 Freeway, seeing the entire western ridge aflame. It brought back heart-in-mouth anxiety in the Los Angeles Times newsroom about sending photographers and reporters into the middle of wildfires to document it all. The concept of multiple, huge wildfires in the city itself, boosted by 100 mile-per-hour winds, was beyond anyone’s planning.

Still, even as fire seasons and Santa Ana winds returned every year, even as climate change has changed weather patterns and made fire season a year-long danger, we always thought the epochal event in Southern California would be the unpredicted earthquake that would trigger fissures all along the multiple faults that cut through the city.

Amid the shared shock, no one has even been able to really assess the damage to individual homes, businesses, schools, churches, synagogues, libraries, stores and the other elements that make up a community. For once, fire has been a devilish unifier in a city always torn by race and class.

Draw the Lens Back

At some point, when the winds abate enough for firefighting to quash all the flames, the images of damage and emotion will be pushed aside for serious, practical plans for rebuilding, fire prevention, support for home- and business-owners, and even for the insurance industry that will be overwhelmed by claims of astronomical proportions.

It will be a test for local, state, and federal officials, who have shown a decidedly mixed set of reactions in these first days of total emergency response. Local and state officials have been overwhelmed with the immediate; President Joe Biden has committed the federal government — and his successor — to providing 100 percent of firefighting costs for the next six months, as well as FEMA support.

But Donald Trump and Republican allies in Congress and right-leaning media have yet to embrace this natural emergency or the ramifications for the tens of thousands left homeless. Shoving climate change aside, along with years of drought and the extension of year-round fire dangers, we’re hearing blame aimed at California Democrats about water pressure of all things, a decades-long complaint about whether to dismiss rules about endangered species, and a broadside attack on diversity and inclusion issues among L.A. firefighters. Indeed, there has been widespread criticism citing a 2% cut in local fire department spending during a labor negotiation as the rationale for there not being enough water pressure to fight five simultaneous urban wildfires spread by hurricane-force winds, which left some hydrants unusable.

Media Matters lays out the Right’s arguments and does some fact-checking on trying to fit the fires into the anti-DEI agenda; Vox tackles Trump’s anti-regulation case over preserving the smelt.

None of the Republicans has named what The Los Angeles Times found – that the artificial Santa Ynez reservoir in Pacific Palisades, one of a number of water sources, had been emptied and not yet refilled before the fires broke out. It has prompted Gov. Gavin Newsom to order an investigation, though experts said the reservoir would not have provided enough water to maintain pressure in such a widespread conflagration. Basically, the systems were overwhelmed by the fires and wind, all worsened by climate change.

Rather than putting the victims of natural disaster first, Trump already has turned the reaction partisan and personal, complaining that he has been boxed into having to provide federal aid to a state that has not voted for him and that will hamper his intentions to cut federal spending massively. One Republican congressman already has said he will oppose aid to California unless the state adopts whatever Trump asks in policy changes.

From a public policy point of view, this kind of immediate response from the incoming administration and Republican congress members is almost as distressing as the fires themselves. What are we to make of leaders who cannot seem to absorb information about a disaster? How are we to trust to policy that may emerge?

What’s Next?

Just as in areas hit by tornadoes and hurricanes, there will be an almost-kneejerk desire to rebuild whole areas, however huge in this case.

Availability of home insurance already had emerged as a serious issue in California; the insurance industry has wailed loudly about being overtaxed by claims arising from wildfires and storms intensifying through climate change. It’s a sure bet that insurers will be looking to the government for the equivalent of a bailout or loan guarantees, whatever form it eventually will take.

There will be investigations about firefighting readiness and the availability of water and water pressure for the worst-hit areas. But none of it will explain away the obvious, that fighting multiple urban wildfires in hurricane-force winds in the country’s most populous and spread-out county is way beyond difficult.

The sheer number of evacuations and the scale of the damage make shelter and food a problem that will go on for months or more, and will postpone intelligent discussion about what a rebuilding effort will even look like, to say nothing of the expense and who will pay. It will be hard enough for families racing from their homes down gridlocked escape routes to persuade insurance companies to pay claims without access to their burned documents.

At the very least, aren’t we Angelenos due some acknowledgment of the problems at hand, without the usual, disdainful, and incorrect politics about DEI and smelt? Are the rest of us not at least owed a real evaluation of where we stand on climate change rather than partisan disdain?



The post Heartbreak in L.A. appeared first on DCReport.org.

Blue Origin scrubs first New Glenn launch attempt due to “vehicle subsystem issue”

Blue Origin’s New Glenn rocket stands at Launch Complex 36 at Cape Canaveral Space Force Station prior to the rocket’s inaugural flight on the NG-1 mission. Image: Blue Origin

Update 3:20 a.m. EST (0820 UTC): Blue Origin scrubbed the launch.

Blue Origin is preparing to step into a new chapter of rocketry by debuting its first orbital-class rocket, New Glenn. It will also attempt to recover the first stage booster on a landing platform in the Atlantic Ocean.

The company owned by Amazon founder Jeff Bezos was targeting the inaugural launch of New Glenn during a three-hour window on Monday, Jan. 13. However, launch teams ran into what they described as a “vehicle subsystem issue” that took longer to potentially resolve than they had time available in the window.

A new launch date was still being determined as of 3:09 a.m. EST (0809 UTC). When it launches, the rocket will lift off from Launch Complex 36 at Cape Canaveral Space Force Station and fly a slightly southeasterly trajectory.

During an interview with Aviation Week prior to the start of fueling Sunday night, Bezos reflected on the enormity of the moment, calling it “a very big night.”

“We’re ready. We don’t know for sure what’s going to happen. I think trying to land the booster on the first mission is a little crazy of us and it may not work. It’ll certainly be icing on the cake,” Bezos said.

“If it does, I do hope, I think we all hope, that we successfully deploy the Blue Ring Pathfinder into the correct orbit. So you know, that would be success, but we’re also prepared for anything to go wrong,” he added. “If there is an anomaly of any kind, at any stage of the mission, we’ll pick ourselves up and keep going.”

Poor weather conditions in the area of the Atlantic Ocean where the booster, named ‘So You’re Telling Me There’s a Chance,’ was slated to land prevented launch attempts previously scheduled for Friday and then Sunday morning. However, conditions were markedly calmer heading into the launch attempt on Monday, according to the 45th Weather Squadron.

“High pressure will build across the area today, then a disturbance approaching the region Monday may increase mid-level clouds across the Spaceport as early as Monday morning,” launch weather officers wrote. “This disturbance will generate showers, breezy winds and widespread clouds across the Spaceport late Monday into early Tuesday.”

If Blue Origin is unable to launch on Monday (and hasn’t already begun loading propellant onto the rocket), a backup window on Tuesday has a much worse outlook for liftoff. The forecast goes from a 90 percent chance of favorable weather on Monday to just 40 percent favorable on Tuesday, impacted by both cloud coverage and stronger winds at the launch pad.

Meteorologists also expressed additional confidence in the booster recovery area for both the primary and 24-hour backup launch windows.

“For recovery, significant sea heights will lower to around 5-6 feet for the primary window, and lower even more to around 4-5 ft for the backup window,” the forecast stated. “Winds should remain light, making a low risk for offshore landing weather on both primary and backup periods.”

“The riskiest part of the mission is the landing”

While not the primary goal for the NG-1 mission, one of the riskiest parts of the mission will undoubtedly be Blue Origin’s attempt to land its first stage booster, named ‘So You’re Telling Me There’s a Chance,’ on the landing platform, named ‘Jacklyn,’ after Bezos’ mother.

The operation is one that will look reminiscent of SpaceX and its Falcon 9 rockets, which land on either droneships or landing platforms at both Cape Canaveral and Vandenberg Space Force Base.

Speaking with Aviation Week, Blue Origin CEO Dave Limp said the challenge of attempting a landing on the first outing is exacerbated by the known unknowns of a first flight that they can’t test on the ground.

“It’s very hard to simulate the environments, the hypersonic environment as it’s coming back, and so there are a number of events that have to happen to make that landing successful that we just have to fly to test,” Limp said. “And that’s why it would be icing on the cake if we landed it, but we will learn so much.”

The roughly 57-meter-tall (188 ft) booster was designed to be usable for a minimum of 25 launches, according to Blue Origin. The booster, also referred to as Glenn Stage 1 (GS1), is powered by seven of the company’s BE-4 engines.

GS1 is fueled by liquefied natural gas and liquid oxygen. Together, the seven engines produce about 3.9 million pounds of thrust at liftoff.

A little more than three minutes into flight, the booster will aim to separate from the upper stage and use a combination of the forward module fins and the reaction control system to reorient the vehicle to aim for the landing vessel.

A little more than seven minutes into the mission, three of the seven BE-4 engines will reignite to conduct a nearly 30-second reentry burn to slow the booster down. A final landing burn will begin just before the nine-minute mark with a touchdown scheduled for about 9.5 minutes after liftoff.

The aft module of the booster contains six hydraulically-actuated legs, which deploy seconds before a planned landing. Following touchdown, a robot called the Recovery Remotely Operated Vehicle (ROV) is deployed to attach to the booster.

Limp said in a post on X that it “provides power, communication and pneumatic links between the booster and the platform.” He added that the ROV is about 4.3-meters-tall (14 ft) and takes up the footprint of a Ford F-150 truck.

The Recovery Remotely Operated Vehicle (ROV) pictured on Blue Origin’s landing vessel, named ‘Jacklyn,’ after founder Jeff Bezos’ mother. ROV will deploy following a booster landing and provide power, communication and pneumatic links between the booster and Jacklyn, according to Blue Origin CEO Dave Limp. Image: Blue Origin

The landing timeline will only come to pass if everything is nominal with the flight. The booster will divert from the landing vessel if it senses an anomaly.

Bezos told Aviation Week on Sunday that while he considered the booster landing to be “the riskiest part of the mission,” even if the booster is lost, Blue Origin is already in a good work flow at their manufacturing campus on Merritt Island, just outside of the gates of the Kennedy Space Center.

“We have two boosters right here in workflow, two more boosters. We’ve got, I don’t know, seven or eight second stages right here in workflow,” Bezos explained. “So, we’ll be ready to fly again in the spring, regardless of what happens.”

Setting the table

Besides the landing attempt, the primary goal for Blue Origin is to get the New Glenn rocket safely off the pad at LC-36 and have a nominal flight of its second stage, GS2, which is fueled by liquid hydrogen and liquid oxygen.

Tucked inside the 7-meter-diameter (23 ft) payload fairings is the company’s Blue Ring Pathfinder. During the NG-1 mission, it will remain fixed to the upper stage and work to “validate space to ground communications capabilities by sending commands, receiving telemetry, receiving store and compute mission data, and performing radiometric tracking (for navigation).”

The GS2 with the Blue Ring Pathfinder will launch into a highly elliptical orbit in the range of the medium Earth orbit, with an apogee of 19,300 km and a perigee of 2,400 km at a 30 degree inclination.
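As a rough sanity check on the quoted orbit, the apogee, perigee, and Kepler's third law pin down the orbit's shape and period. This is a back-of-envelope sketch, not from the article; the Earth-radius and gravitational-parameter constants below are assumed standard values.

```python
import math

# Orbit altitudes quoted above (km above Earth's surface)
apogee_alt, perigee_alt = 19300.0, 2400.0

# Assumed constants (not from the article)
R_EARTH = 6378.0      # Earth's equatorial radius, km
MU = 398600.4418      # Earth's gravitational parameter, km^3/s^2

r_a = R_EARTH + apogee_alt    # apogee radius from Earth's center
r_p = R_EARTH + perigee_alt   # perigee radius from Earth's center

a = (r_a + r_p) / 2.0                    # semi-major axis
e = (r_a - r_p) / (r_a + r_p)            # eccentricity
T = 2 * math.pi * math.sqrt(a**3 / MU)   # orbital period (Kepler's third law), s

print(f"semi-major axis ~{a:.0f} km, eccentricity ~{e:.2f}, period ~{T/3600:.1f} h")
```

The high eccentricity (about 0.49) confirms the "highly elliptical" description, with a period on the order of six hours.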

The NG-1 mission serves as a way for Blue Origin to learn much more about its upper stage. Bezos described second stage ignition as just one of the big hurdles during this inaugural flight.

“Because you’re in vacuum, it’s not easy for an engine the size of BE-3U to do vacuum testing at full power, so ignition is a real issue,” Bezos said. “Even fairing separation has caught people up. Even stage separation has caught people up. Stage separation is another thing that you can’t really test on Earth. You can do certain subsystem tests and so on, but of all the things we’re doing today, relighting the BE-4s in that reentry environment, that’s probably the hardest thing to test.”

Bezos said the path to profitability will depend partly on the flight tonight and partly on how quickly they’re able to get back to the launch pad.

“I think we can fly six to eight times this year and hopefully ramp up very quickly in 2026 after that,” Bezos said. “But I don’t want to speculate on when that would actually become profitable.”

Sunday Night Futures

Weekend:
Schedule for Week of January 12, 2025

Monday:
• No major economic releases scheduled.

From CNBC: Pre-Market Data and Bloomberg futures S&P 500 are unchanged and DOW futures are up 68 (fair value).

Oil prices were up over the last week with WTI futures at $76.57 per barrel and Brent at $79.76 per barrel. A year ago, WTI was at $73, and Brent was at $80 - so WTI oil prices are up about 5% year-over-year.

Here is a graph from Gasbuddy.com for nationwide gasoline prices. Nationally prices are at $3.06 per gallon. A year ago, prices were at $3.06 per gallon, so gasoline prices are unchanged year-over-year.
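The year-over-year figures above are plain percentage changes against the year-ago price. A minimal sketch, using the numbers quoted in this post:

```python
def yoy_change(current: float, year_ago: float) -> float:
    """Percentage change from the year-ago value."""
    return (current - year_ago) / year_ago * 100.0

# Prices quoted above
wti = yoy_change(76.57, 73.00)   # WTI crude, $/bbl
gas = yoy_change(3.06, 3.06)     # gasoline, $/gal

print(f"WTI YoY: {wti:+.1f}%")        # roughly +4.9%, i.e. "up about 5%"
print(f"Gasoline YoY: {gas:+.1f}%")   # unchanged year-over-year
```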

Why is Polaris called the North Star?


Moonraker revisited

Moonraker is not remembered as one of the great James Bond films, but its space theme is still warmly recalled by some fans. Dwayne Day describes how new products about the film have highlighted its strengths.

Review: Star Bound

Summarizing the history of American spaceflight in one book requires hard choices on what to emphasize. Jeff Foust reviews a book that tackles that effort at an introductory level, going from Goddard to the present day.

The (not quite) definitive guide to the legal construct of "space resources"

The ability to own space resources has been a long-running debate in space law. Michael Listner examines the legal concept of space resources at the national and international level.

Planning for space rescue

NASA has bristled at the suggestion that astronauts Suni Williams and Butch Wilmore are "stranded" on the ISS even as their stay there is extended from a few weeks to more than eight months. Jeff Foust reports that the situation nonetheless highlights the importance some see in developing technologies and approaches when a real space rescue is needed.