How did the CEO of an online payments firm become the nominee to lead NASA?

President-elect Donald Trump announced Wednesday his intent to nominate entrepreneur and commercial astronaut Jared Isaacman as the next administrator of NASA.

For those unfamiliar with Isaacman, who at just 16 years old founded a payment processing company in his parents' basement that ultimately became a major player in online payments, the pick may seem an odd one. However, those inside the space community welcomed the news, with figures across the political spectrum hailing Isaacman's nomination variously as "terrific," "ideal," and "inspiring."

This statement from Isaac Arthur, president of the National Space Society, is characteristic of the response: "Jared is a remarkable individual and a perfect pick for NASA Administrator. He brings a wealth of experience in entrepreneurial enterprise as well as unique knowledge in working with both NASA and SpaceX, a perfect combination as we enter a new era of increased cooperation between NASA and commercial spaceflight."


Friday: Employment Report


Friday:
• At 8:30 AM ET, Employment Report for November. The consensus is for 183,000 jobs added, and for the unemployment rate to be unchanged at 4.1%.

• At 10:00 AM, University of Michigan's Consumer sentiment index (Preliminary for December).

Thursday 5 December 1661

This morning I went early to the Paynter’s and there sat for my picture the fourth time, but it do not yet please me, which do much trouble me. Thence to the Treasury Office, where I found Sir W. Batten come before me, and there we sat to pay off the St. George. By and by came Sir W. Pen, and he and I staid while Sir W. Batten went home to dinner, and then he came again, and Sir W. Pen and I went and dined at my house, and had two mince pies sent thither by our order from the messenger Slater, that had dressed some victuals for us, and so we were very merry, and after dinner rode out in his coach, he to Whitehall, and my wife and I to the Opera, and saw “Hamlett” well performed. Thence to the Temple and Mrs. Turner’s (who continues still very ill), and so home and to bed.


Measuring Lines/Function

Measuring the lines per function for the recent chapter on Pareto distributions in software turned out to be trickier than expected. I’ve done similar measurements enough times that I figured I’d document the process for you all as you look for your own distributions. (And I hope you do go look for your own distributions.)
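If you want a starting point for your own codebase, here is a minimal sketch of one way to count lines per function in Python source, using the standard library's ast module. (This is my illustration, not the chapter's actual methodology, which may differ.)

```python
import ast

def function_lengths(source):
    """Return {function_name: line_count} for every def in a module's source."""
    tree = ast.parse(source)
    lengths = {}
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            # end_lineno is available on AST nodes in Python 3.8+
            lengths[node.name] = node.end_lineno - node.lineno + 1
    return lengths

sample = '''
def short():
    return 1

def longer(x):
    y = x + 1
    y *= 2
    return y
'''
print(function_lengths(sample))  # {'short': 2, 'longer': 4}
```

From there, a histogram of the resulting counts is enough to see whether the tail looks Pareto-ish.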

If I’ve made gross mistakes in what follows or there …


At least five interesting things: Nerdy economics edition (#54)

Ostensibly this is an economics blog. I know I’ve spent a lot of time recently ranting about land acknowledgements, or war, or the problems with the Democratic party, but really the point of all that is to get us back to a world that’s calm and rational enough that we can afford to spend our time thinking about wonky nerdy econ stuff. Right?

In the meantime, though, there’s always plenty of research to nerd out about. So that’s the focus of this week’s roundup.

But first, podcasts! Here’s a debate I did with Vitalik Buterin, creator of Ethereum, on the Bankless podcast a few months ago, which we just republished on Econ 102. We’re debating whether new technologies tilt the playing field toward authoritarian governments:

Vitalik never fails to have original and incisive thoughts on this and other topics.

Also, here’s this week’s Econ 102 episode, where Erik and I follow up on my post about land acknowledgements with a discussion about immigration and national identity:

Anyway, on to this week’s list of interesting things!

1. The coolest tax idea you’ve never heard of (or maybe you have?)

When Trump did his tax reform in his first term, his team initially had a big idea that I really liked. It’s called the Destination-Based Cash Flow Tax, or DBCFT. It’s a way of reforming corporate taxation to promote investment and exports. Alan Auerbach had a good simple explainer of the DBCFT in 2017. Here’s his summary of what the tax entails:

[T]he DBCFT would replace the [corporate] income tax with a cash-flow tax, substituting depreciation allowances with immediate investment expensing and eliminating interest deduction for nonfinancial companies. On the international side, the DBCFT would replace the current “worldwide” tax system, under which US activities of US and foreign businesses and foreign activities of US businesses are subject to US taxation, with a territorial system that taxes only US activities plus border adjustment that effectively denies a tax deduction for imported inputs and relieves export receipts from tax.

Auerbach also explains that a DBCFT is mathematically equivalent to two other policies: a value-added tax (VAT) plus a wage subsidy.
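That equivalence is easy to check with toy numbers. The sketch below (all figures invented) verifies the accounting identity: exports are exempt and imported inputs are non-deductible under both the DBCFT and a destination-based VAT, so both drop out of the comparison, and the only remaining difference is that the DBCFT deducts wages, which is the same as a VAT plus a wage subsidy at the tax rate.

```python
# Toy firm, all figures in $ millions. Numbers are invented for illustration.
domestic_sales  = 100
exports         = 40   # exempt under border adjustment (drops out of both bases)
domestic_inputs = 30   # deductible; includes investment, expensed immediately
imported_inputs = 20   # deduction denied under border adjustment (drops out too)
wages           = 25

rate = 0.20

# DBCFT: destination-based cash-flow tax (wages deductible).
dbcft_base = domestic_sales - domestic_inputs - wages
dbcft_tax  = rate * dbcft_base

# Equivalent decomposition: a destination-based VAT (wages NOT deductible)...
vat_base = domestic_sales - domestic_inputs
vat_tax  = rate * vat_base
# ...plus a wage subsidy at the same rate.
wage_subsidy = rate * wages

print(dbcft_tax)               # 9.0
print(vat_tax - wage_subsidy)  # 9.0, identical by construction
```

Whatever the firm's numbers, the two totals differ only by the wage term, which is exactly Auerbach's point.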

I really liked the DBCFT, but sadly Trump chose to back off of most of its provisions. But in a recent post over at Cremieux’s blog, Jason Harrison argues that Trump should resurrect the idea:

Cremieux Recueil
Trump Should Finish What He Started

Harrison explains several advantages of the DBCFT relative to our current method of corporate taxation:

  • It encourages companies to invest more, by allowing them to expense their investments immediately. (Note that immediate full expensing for R&D spending is especially important for growth.)

  • It stops encouraging companies to borrow too much money, as our current system does.

  • It makes it harder for companies to evade taxes by shifting profits overseas.

Harrison points out that the DBCFT is a form of consumption tax. Usually we think consumption taxes are more economically efficient than income taxes, because they don’t discourage savings and investment. The reason we often shy away from them is that consumption taxes are regressive — poor people consume most of their income, while rich people save most of theirs, so shifting from income to consumption taxes will hit the poor harder while exempting the rich.

But because DBCFT is equal to a consumption tax (i.e. a VAT) plus a wage subsidy, it doesn’t have this problem. It only taxes the part of consumption that comes from capital income, leaving consumption from labor income alone. Harrison explains:

[W]hile a VAT taxes all consumption, the DBCFT only taxes consumption financed from non-wage sources—mainly existing wealth (wealth accumulated before the reform that has already faced the income tax) and above-normal returns to investment.

One additional benefit of the DBCFT that Harrison doesn’t mention is that it promotes exports. If, like me, you’re a believer that export promotion is a key piece of industrial policy, then you should like the DBCFT.

So I strongly agree that Trump should bring back the DBCFT in his second term.

2. Construction productivity and regulation

Over the past few years, Americans have noticed that their country seems incapable of building much of anything. There are lots of facts that fit this general story. Two very important ones are:

  1. Productivity in the construction sector has flatlined or even decreased since the mid-1960s.

  2. Land-use restrictions are an important barrier to getting things built.

Here’s a picture of decreasing productivity in construction, via Goolsbee and Syverson (2023):

Intuition says that this is somehow connected to the land-use restrictions that we know are blocking development. Some authors, like Brooks and Liscow (2019), have pointed out that the timing for this explanation lines up very well. But there’s the question of how, exactly, land-use restrictions make the construction industry less productive. After all, once a project gets greenlit, don’t the restrictions no longer matter? What’s making construction less productive even after projects get approved?

One possibility is that NIMBY legal challenges cause delays, which cause cost overruns. But in a new paper, D’Amico et al. (2024) argue that there’s a deeper force at work here. They hypothesize that land-use restrictions force construction companies to be too small and fragmented:

Homes built per construction worker remained stagnant between 1900 and 1940, boomed after World War II, and then plummeted after 1970. The productivity boom from 1940 to 1970 shows that nothing makes technological progress inherently impossible in construction. What stopped it? We present a model in which local land-use controls limit the size of building projects. This constraint reduces the equilibrium size of construction companies, reducing both scale economies and incentives to invest in innovation. Our model shows that, in a competitive industry, such inefficient reductions in firm size and technology investment are a distinctive consequence of restrictive project regulation, while classic regulatory barriers to entry increase firm size. The model is consistent with an extensive series of key facts about the nature of the construction sector. The post-1970 productivity decline coincides with increases in our best proxies for land-use regulation. The size of development projects is small today and has declined over time. The size of construction firms is also quite small, especially relative to other goods-producing firms, and smaller builders are less productive. Areas with stricter land use regulation have particularly small and unproductive construction establishments. Patenting activity in construction stagnated and diverged from other sectors. A back-of-the-envelope calculation indicates that, if half of the observed link between establishment size and productivity is causal, America’s residential construction firms would be approximately 60 percent more productive if their size distribution matched that of manufacturing.

This theory is cool because it connects both low construction productivity and land-use restriction with another important and well-known fact — the fragmentation of the construction industry. Construction companies are tiny compared to companies in other industries:

I love this model, and I think that the basic story is probably true. But I think there’s at least one element that might be unrealistic. D’Amico et al. model land-use regulations as being fundamentally about construction project size — it’s harder to get larger projects approved, so construction companies have to focus on smaller ones. While that’s certainly true, I strongly suspect that there’s a much more important mechanism by which land-use restrictions limit economies of scale: regulatory heterogeneity.

Land-use restrictions are very different in different cities, and navigating the approval process in Baltimore doesn’t really make you better at navigating the approval process in San Diego. This means that it’s very hard to have a few giant Wal-Mart style construction companies that handle projects all over the country. I think if we’re looking for ways to increase construction productivity, harmonizing regulations across regions is probably more promising than simply approving more megaprojects.

Also, D’Amico et al. limit their empirical analysis to America. Why has construction productivity stagnated in other countries around the world, including countries like Japan that have very permissive land-use regulations? In general, I’m suspicious of single-country explanations for global phenomena. I think D’Amico et al. have identified an important factor, but I have a feeling there’s a lot more to this puzzle.

3. Are humans irrational or just confused?

When economists started making theories of human behavior, they generally assumed that everyone was “rational” — that they made decisions based on assessing all the available information in an optimal way, and maximizing their utility accordingly. Then in the 70s and 80s, some psychologists and behavioral economists did a bunch of experiments that showed people behaving in seemingly irrational ways. Often these experiments involved having people choose between “lotteries” — i.e., presenting them with two possible gambles, and letting them choose which one they’d rather take.

Behavioral economists found that people systematically made certain kinds of choices among these “lotteries” that didn’t seem to fit the textbook definition of rationality. One of the most popular explanations for these “anomalies” was something called Prospect Theory, created by Daniel Kahneman and Amos Tversky. It’s basically two theories in one. Part of Prospect Theory says that people exaggerate small probabilities in their decision-making. The other part says that people are loss-averse — losses loom larger than equivalent gains, with utility measured relative to a seemingly arbitrary reference point.
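To make those two pieces concrete, here is a small sketch using the functional forms and parameter estimates from Tversky and Kahneman's 1992 paper (the parameters are their published estimates; the example itself is mine):

```python
# Cumulative prospect theory's two pieces, with Tversky & Kahneman's
# (1992) estimated parameters.
ALPHA  = 0.88   # diminishing sensitivity to gains and losses
LAMBDA = 2.25   # loss aversion: losses loom ~2.25x larger than gains
GAMMA  = 0.61   # curvature of the probability-weighting function (gains)

def value(x):
    """Reference-dependent value: outcomes measured from a reference point of zero."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

def weight(p):
    """Decision weight: overweights small probabilities, underweights large ones."""
    return p ** GAMMA / ((p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA))

# Small probabilities are exaggerated...
print(round(weight(0.01), 3))   # ~0.055, not 0.01
# ...and a loss hurts more than an equal gain pleases.
print(round(value(-100), 1), round(value(100), 1))  # roughly -129.5 vs 57.5
```

A 1% chance gets treated like a 5-6% chance, which is why people simultaneously buy lottery tickets and insurance in this framework.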

But a new paper by Ryan Oprea challenges the idea that we even need something like Prospect Theory at all. Oprea hypothesizes that a lot of the seemingly “irrational” experimental behaviors are really just due to the excessive complexity of the tasks subjects are asked to perform. He does an experiment where he takes away all the risk in the decision — there are no probabilities and no losses involved. One option just gives you more money than the other. And yet experimental subjects still make mistakes that look a lot like the “irrational” choices they make in Kahneman-type experiments. Eric Crampton has a good blog post summarizing the details of Oprea’s experiment.

So it’s possible that a lot of what looks like “irrationality” is just human beings being unable to deal with complex calculations. That doesn’t kill the idea of behavioral economics — it just means we need different theories about why people don’t act like homo economicus.

As an example, Ben Moll has a new paper in which he challenges the use of rational-expectations heterogeneous-agent models in macroeconomics. He argues that these theories would require consumers to take way too much information into account in their calculations. Instead, he suggests that we need theories in which agents solve simpler problems, even if this results in a little bit of what seems like irrationality:

The thesis of this essay is that, in heterogeneous agent macroeconomics, the assumption of rational expectations about equilibrium prices is unrealistic, unnecessarily complicates computations, and should be replaced. This is because rational expectations imply that decision makers (unrealistically) forecast equilibrium prices like interest rates by forecasting cross-sectional distributions. The result is an extreme version of the curse of dimensionality: dynamic programming problems in which the entire cross-sectional distribution is a state variable (“Master equation” a.k.a. “Monster equation”). This problem severely limits the applicability of the heterogeneous-agent approach to some of the biggest questions in macroeconomics, namely those in which aggregate risk and non-linearities are key, like financial crises…I then discuss some potentially promising directions, including temporary equilibrium approaches, incorporating survey expectations, least-squares learning, and reinforcement learning.

That sounds right to me. I never understood why macroeconomics should assume that human agents are infinitely powerful calculating machines. It’s good to see people reexamining that approach.

4. America’s economic fundamentals are strong

The U.S. economy has been growing strongly in recent years. Part of that was just a function of putting people back to work after the shock of the pandemic. But a lot of it was due to productivity growth. American workers are producing a lot more in terms of output per hour than they did in 2016 or 2022. Joey Politano has the story:

Apricitas Economics
America's Productivity Boom

In fact, the U.S. is almost unique among rich countries in terms of seeing its productivity grow since the pandemic! Everyone else is stagnating:

Productivity has grown in service industries while stagnating or shrinking in manufacturing — a change from the traditional pattern, and a big challenge to the conservative idea that government red tape is holding service industries back.

Politano attributes America’s productivity boom to three factors:

  • Increased capital intensity (companies investing more)

  • Increased reallocation of workers to new and better jobs

  • Increased creation of new companies since the pandemic

Note that the first of these challenges the progressive notion that companies have just been buying back their stock instead of investing.

In any case, this is great news. Productivity is the bedrock of economic performance — it drives long-term increases in wages and living standards, and it allows the Fed to keep interest rates lower without risking inflation. Something is going very right in the American economy that isn’t going right in other rich countries (or, at least since 2022, in China either).

5. Yes, some of the post-pandemic inflation was probably demand-driven

Everyone pretty much agrees that anger over inflation was one of the reasons Trump won the election this year. But some of the folks I call “macroprogressives” — people who tend to favor more fiscal stimulus in nearly any situation — continue to insist that inflation was mostly just a function of supply disruptions rather than an effect of Biden’s American Rescue Plan or Trump’s CARES Act. For example, here are Zachary D. Carter, Joe Weisenthal, and Kasey Klimes, citing analyses from the Wall Street Journal, Jan Hatzius, and Peter Orszag:

Source: Peter Orszag

But with all due respect to Orszag, Hatzius, and the rest, I don’t think this case holds up.

First of all, I’ve looked at a number of analyses of the post-pandemic inflation, and most of them ascribe a significant fraction — usually around half — to demand-side factors, especially in 2021. For example, this is from Adam Shapiro at the San Francisco Fed:
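For those curious how Shapiro's decomposition works, the core idea is a sign test: within each spending category, if unexpected price and quantity changes move in the same direction, demand likely shifted; if they move in opposite directions, supply did. The toy sketch below shows that classification logic (the surprise numbers are invented for illustration; Shapiro estimates the actual residuals with sector-level regressions):

```python
# Sketch of the sign logic behind Shapiro's (SF Fed) demand/supply
# inflation decomposition. A demand shift moves price and quantity
# together; a supply shift moves them in opposite directions.

def classify(price_surprise, quantity_surprise):
    if price_surprise * quantity_surprise > 0:
        return "demand"
    if price_surprise * quantity_surprise < 0:
        return "supply"
    return "ambiguous"

# Made-up unexpected (price, quantity) changes per PCE category:
categories = {
    "restaurants": (0.8, 1.2),    # both up -> demand-driven
    "used cars":   (2.5, -0.9),   # price up, quantity down -> supply-driven
    "air travel":  (1.1, 0.4),
}
for name, (dp, dq) in categories.items():
    print(name, "->", classify(dp, dq))
```

Aggregating the price contributions of the "demand" categories is what yields the roughly-half demand share these analyses report.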

And this is from Òscar Jordà, Celeste Liu, Fernanda Nechio, and Fabián Rivera-Reyes, also at the San Francisco Fed:

And this is from Matthew Gordon and Todd Clark of the Cleveland Fed:

And here’s Giovanni et al. (2024), using a pretty standard model and cross-country evidence, and concluding that both demand and supply shocks contributed:

We employ a multi-country multi-sector New Keynesian model to analyze the factors driving pandemic-era inflation. The model incorporates both sector-specific and aggregate shocks, which propagate through the global trade and production network and generate demand and supply imbalances, leading to inflation and spillovers. The baseline quantitative exercise matches changes in aggregate and sectoral prices and wages for a sample of countries including the United States, Euro Area, China, and Russia. Our findings indicate that supply-chain bottlenecks ignited inflation in 2020, followed by a surge in prices driven by aggregate demand shocks from 2021 through 2022, exacerbated by rising energy prices.

Even more damningly for “team supply-side factors”, Olivier Blanchard, using a super-simple New Keynesian type model, was able to correctly predict inflation in advance, just by looking at the size of Biden’s Covid relief bill in February 2021!

Meanwhile, Orszag’s chart of Covid stimulus vs. inflation includes a lot of developing countries, where inflation can often be very high due to a variety of factors that don’t typically crop up in rich nations. When Barro and Bianchi (2023) look only at OECD countries, they get a positive correlation:

And when Barro and Bianchi use an alternative measure of government spending consistent with the Fiscal Theory of the Price Level — the theory that says that government deficits cause inflation — they find an even tighter correlation:

I’m not sure whether limiting the analysis to OECD countries is the right thing to do, or how much we should trust in the Fiscal Theory of the Price Level. But this certainly shows that Orszag’s analysis is no easy slam dunk. Meanwhile, the plethora of studies finding a substantial role for stimulus in the inflation of 2021-22, and Blanchard’s successful ex ante prediction of inflation, are hard to ignore here.

I think it makes sense to conclude that Covid relief spending was one significant cause of inflation. Whether that was worth it or not is an entirely different question.

6. Am I wrong about national health insurance?

Via Marginal Revolution, I have found a paper that somewhat challenges my priors on the topic of national health insurance.

I’ve long been a proponent of national health insurance — not the “pay for everything and outlaw private insurance” thing that Bernie Sanders wants, but the more sensible Japan/Korea system, where the government pays 70% of everything and leaves the rest to the private sector. My reasoning is basically that there’s a lot of monopoly power pushing up prices in the health industry, and a government insurer that covered part of every procedure would have countervailing monopsony power that would be able to squeeze these excess costs out of the system. Basically, we already do a lot of this with Medicare, and it does result in lower prices:

Source: Noah Smith

Conservatives have often countered that this type of system would squelch medical innovation. I typically downplay such concerns, arguing that Japan and Korea exhibit pretty good levels of innovation. But a new paper by Yunan Ji and Parker Rogers is making me wonder if the conservatives’ concerns are more justified than I had realized. The authors look at what happens in various medical product categories when Medicare forces providers to cut prices. They find some pretty negative effects on innovation:

We investigate the effects of substantial Medicare price reductions in the medical device industry, which amounted to a 61% decrease over 10 years for certain device types. Analyzing over 20 years of administrative and proprietary data, we find these price cuts led to a 29% decline in new product introductions and an 80% decrease in patent filings, indicating significant reductions in innovation activity. Manufacturers reduced market entry and relied more heavily on outsourcing to other producers, which was associated with higher rates of product defects. Our calculations suggest the value of lost innovation may fully offset the direct cost savings from the price cuts. We propose that better-targeted pricing reforms could mitigate these negative effects. These findings underscore the need to balance cost containment with incentives for innovation and quality in policy design.

This is certainly a sobering result for proponents of national health insurance. But I’m not sure Ji and Rogers actually contradict my priors very strongly here. They argue that if Medicare only used its price-cutting power on products with very high profit margins, it could cut costs without hurting innovation:

In a well-targeted price reform, we would expect Medicare to prioritize product categories with the highest profit margins…[T]his analysis highlights that a more targeted approach could potentially reduce Medicare spending while minimizing negative impacts on medical innovation.

And this makes perfect sense with the theory of monopoly and monopsony. The monopsony power of a national health insurance system is a powerful and dangerous weapon — it should be used only in specific instances where there’s significant monopoly power that needs to be canceled out. You can’t just go clobbering every single product with price controls.

As always with activist government policies, the devil is in the implementation.



What’s Good for the Goose, AI Training Edition

Stephanie Palazzolo, writing for The Information (paywalled, alas):

Researchers at OpenAI believe that some rival AI developers are training their reasoning models by using OpenAI’s o1 reasoning models to generate training data, according to a person who has spoken to the company’s researchers about it. In short, the rivals can ask the o1 models to solve various problems and then use the models’ chain of thought — the “thought process” the models use to solve those problems — as training data, the person said.

You might be wondering how rival developers can do that. OpenAI has explicitly said it hides its reasoning models’ raw chains of thought due in part to competitive concerns.

But in answering questions, o1 models include a summarized version of the chain of thought to help the customer understand how the models arrived at the answer. Rivals can simply ask another LLM to take that summarized chain of thought and predict what the raw chain of thought might have been, the person who spoke with the researchers said.

And I’m sure these OpenAI researchers are happy to provide this training data to competitors, without having granted permission, in the same way they trained (and continue to train) their own models on publicly available web pages, without having been granted permission. Right?

 ★ 

From the Department of Bringing Receipts to the Interview

From The Stanford Review editor-in-chief Julia Steinberg’s interview with university president Jonathan Levin:

Stanford Review: What is the most important problem in the world right now?

President Levin: There’s no answer to that question. There are too many important problems to give you a single answer.

Stanford Review: That is an application question that we have to answer to apply here.

(Via CJ Ciaramella on Bluesky.)

 ★ 

Jeff Bezos on Trump’s Second Term: ‘I’m Actually Very Optimistic This Time Around’

Alex Heath, writing at The Verge:

“I’m actually very optimistic this time around,” Bezos said of Trump during a rare public appearance at The New York Times DealBook Summit on Wednesday. “He seems to have a lot of energy around reducing regulation. If I can help him do that, I’m going to help him.”

Trump railed against Bezos and his companies — Amazon, Blue Origin, and The Washington Post — during his first term. Bezos defended himself but it did little to help his reputation with Trump. Now, his companies have a lot at stake in the coming administration, from the FTC’s antitrust lawsuit against Amazon to Blue Origin’s efforts to compete with SpaceX for government contracts.

Onstage at the DealBook Summit on Wednesday, Bezos called Trump “calmer this time” and “more settled.” He said he will try to “talk him out of” the idea that the press, which includes The Washington Post, is an enemy of the people.

“You’ve probably grown in the last eight years,” he said to DealBook’s Andrew Ross Sorkin. “He has, too.”

Next up after Bezos at DealBook Summit was Charlie Brown, who professed optimism regarding his next attempt at kicking a football held by Lucy Van Pelt. What the fuck did they put in the water at this conference?

Or, perhaps, these very smart guys are also craven, and these nonsensical remarks, which are quite obviously contrary to reality, are simply additional exhibits of shameful cowardly compliance.

 ★ 

Shame on Google for Their Description of Google Messages’s Encryption Support

While writing the previous item regarding the FBI encouraging the use of E2EE text and call protocols, I wound up at the Play Store page for Google Messages. It’s shamefully misleading regarding Google Messages’s support for end-to-end encryption. As I wrote in the previous post, Google Messages does support E2EE, but only over RCS and only if all participants in the chat are using a recent version of Google Messages. But the second screenshot in the Play Store listing flatly declares “Conversations are end-to-end encrypted”, full stop. That is some serious bullshit.

I realize that “Some conversations are end-to-end encrypted” will naturally spur curiosity regarding which conversations are encrypted and which aren’t, but that’s the truth. And users of the app should be aware of that. “RCS conversations with other Google Messages users are encrypted” would work.

Then, in the “report card” section of the listing, it states the following:

Data is encrypted in transit
Your data is transferred over a secure connection

Which, again, is only true sometimes. It’s downright fraudulent to describe Google Messages’s transit security this way. Imagine a typical Android user without technical expertise who takes the advice (now coming from the FBI) to use end-to-end encryption for their messaging. A reasonable person who trusts Google would look at Google’s own description of Google Messages and conclude that if you use Google Messages, all your messages will be secure. That’s false. And depending who you communicate with — iPhone users, Android users with old devices, Android users who use other text messaging apps — it’s quite likely most of your messages won’t be secure.

Just be honest! The E2EE between Google Messages users using Android phones that support RCS is completely seamless and automatic (I just tried it myself using my Android burner), but E2EE is never available for SMS, and never available if a participant in the chat is using any RCS client (on Android or Apple Messages) other than Google Messages. That’s an essential distinction that should be made clear, not obfuscated.

While I’m at it, it’s also embarrassing that Google Voice has no support for RCS at all. It’s Google’s own app and service, and Google has been the world’s most vocal proponent of RCS messaging.

Lastly, I also think it’s a bad idea that Google Messages colors all RCS message bubbles with the exact same colors (dark blue bubbles with white text, natch). SMS messages, at least on my Pixel 4, are pale blue with black text. Google Messages does put a tiny lock in the timeline to indicate when an RCS chat is secure, and they also put a lock badge on the Send button’s paper airplane icon, so there are visual indications whether an RCS chat is encrypted. But because the message bubble colors are the same for all RCS chats, it’s subtle, not instantly obvious like it is with Apple Messages, where green means “SMS or RCS, never encrypted” and blue means “iMessage, always encrypted”.

 ★ 

What’s Happening at MSNBC?


David Frum just shared a disturbing anecdote from an appearance this morning on MSNBC’s Morning Joe. According to his short article at The Atlantic, he made a flippant reference to reporting that Pentagon nominee Pete Hegseth was known for drinking on the job at Fox News. The specific line was: “If you’re too drunk for Fox News, you’re very, very drunk indeed.”

He went on to compare the case to that of John Tower in 1989, a long-serving senator whose Secretary of Defense nomination (Dick Cheney got the nod after Tower bowed out) was torpedoed over claims of drinking and womanizing. According to David, after he said this, an MSNBC producer piped up in his ear objecting to his comments and warning him not to repeat them. Not long after, David was ushered off the set, apparently sooner than expected. Then Mika Brzezinski read out an apology for what he’d said.

Bizarre.

Let me share a few thoughts about this.

Everybody is already bummed at Mika and Joe over their seemingly contrite pilgrimage to Mar-a-Lago, and very understandably so. But this sounds like something in a different category. It sounds like something coming down from corporate at MSNBC.

Let’s focus specifically on what David said. He was talking about drinking.

Defamation law is a curious thing. The ins and outs of it aren’t as predictable as you might think. There is pretty clear case law, for instance, which holds that calling someone a “Nazi” cannot be defamatory. It’s like calling someone a “big dummy head.” It’s just an opinion. By definition it can’t be defamatory.

Needless to say, I’m not a lawyer. And I’m definitely not YOUR lawyer. But I’m not pulling this stuff out of my hat. In my job I’ve had to work closely with very experienced First Amendment lawyers for many years. Accusing someone of being a drunk isn’t just different in the sense that it is a factual issue — it’s true or it’s not. It’s also something that can be professionally damaging. That elevates it in terms of reputational damage, which is what defamation and libel law are about. Someone with a reputation as an alcoholic might easily not get hired for jobs because they’re viewed as unreliable. It’s not just a matter of hurt feelings. The potential damage is tangible, even quantifiable.

The point is that defamation law isn’t always linear and commonsensical. Some things you’d think would be no-nos are fine and others that seem like locker room banter can be big no-nos.

Needless to say, under Sullivan this shouldn’t matter. Hegseth is a textbook public figure. The speech is in a clearly political context in which the First Amendment protections are strongest. And there’s lots of reporting on which David could base that remark.

The point of going into all of this is that Trump specifically and the MAGA world generally have been putting everyone on notice for years that they’re going to flood the zone with lawsuits. So watch out, basically. And now with Trump coming back in, the assumption is that the threat jumps up dramatically.

So at first blush, this seems like hyper-caution over potential lawsuits. But there are a couple of problems with that theory. The first is that MSNBC — or its now spun-off parent company — isn’t some tiny operation that could be sunk by a lawsuit. Perhaps Sullivan isn’t long for this world. But for now it’s the law. And it should make any potential suit manageable for a company of that size. But then there’s also the specific apology from Brzezinski. It seemed to be directed not at Hegseth but rather at Fox News. Here’s the relevant part.

The comment was a little too flippant for this moment that we’re in. We just want to make that comment as well. We want to make that clear. We have differences in coverage with Fox News, and that’s a good debate that we should have often, but right now I just want to say there’s a lot of good people who work at Fox News who care about Pete Hegseth, and we will want to leave it at that.

Is Fox going to sue the show? Are they going to get into a morning ratings war with them? It’s weird, isn’t it? You would have expected some comment like “we don’t know whether these allegations are true,” etc. But unless I’m missing something, this seems like wanting to keep the peace with Fox News — “a lot of good people who work at Fox News who care about Pete Hegseth.”

I can’t really decode that at all. Not legally, or politically or journalistically.

The final thought I had is this. I don’t watch a lot of political TV. In fact, I’ve been laid up for the last three days with a bad cold. (Almost all better, thanks for asking.) So I haven’t seen any other MSNBC shows. But it’s hard for me to imagine that the allegations about Hegseth’s drinking haven’t come up in the 24/7 talk at MSNBC and often in at least somewhat flippant ways.

Have all the guests there gotten the same message but not said anything?

Is this really just something going on at Morning Joe?

Festivitas — Holiday Lights for Your Mac Menu Bar and Dock

Purely fun, pay-whatever-you-think-fair app for the Mac from Simon Støvring (developer of numerous fine apps such as Runestone and Scriptable):

Festivitas automatically adds festive lights to your menu bar and dock upon launch and you can tweak their appearance to match your preferences.

There is something very core to the Mac’s origins about not just making a software toy like this, but putting effort into making everything about it really nice. Harks back to Steven Halls’s The Talking Moose and, of course, the undisputed king of the genre, Eric Shapiro’s The Grouch. Oh, and as Stephen Hackett reminded me, Holiday Lights.


With Pete Hegseth Among the Post-Nominated

It seems all but certain that Pete Hegseth’s nomination to lead the Pentagon is doomed. Yesterday he was reduced to promising not to drink on the job if he’s confirmed as Defense Secretary. You may not like him, but don’t deny him this: he’s going to have the best story ever when he introduces himself at his first meeting and explains what brought him to AA. It’s probably best to refer to Hegseth on Thursday afternoon as one of the “post-nominated.” Trump is already sounding out Ron DeSantis for the job. But he’s happy to let Hegseth twist in the wind a bit longer. And in a paradoxical kind of way I appreciate his doing that. This of course will be Trump’s second top-tier nominee to go down in flames, and the third overall.

Has this gone well for Hegseth? I don’t mean in terms of getting the job. I mean in the general sense of reputation, dignity, etc. I’d say it’s gone … well, pretty badly? Kind of the fate of everyone and everything that locks up with Trump.

DeSantis is much like Marco Rubio, a generally clownish figure, if somewhat more malevolent, but in the overall ballpark of the kinds of people who get these jobs. He’s served in Congress. He’s been governor of one of the country’s most populous states. Given the type of people Trump often hires for these jobs, the country could do so much worse.

So does it matter that Hegseth goes down the tubes?

It does.

All political power is unitary. A president isn’t weak domestically but powerful on foreign policy — powerful on health care policy but hanging by a thread on interest rates. It’s all of a piece. The damage a president takes anywhere affects him or her everywhere. So having these absurd nominations go down in flames actually does matter. It’s not just the same as if Trump had nominated DeSantis or Pam Bondi in the first place.

That brings us to a broader point. If the political opposition is most worried about what a President will do on issue X, that doesn’t mean the opposition should necessarily focus its attacks on issue X. They may ignore issue X entirely. Maybe issue X is actually popular. Maybe nobody cares about issue X. So no one will pay attention. An opposition will focus its attacks on the President’s most vulnerable points because that is where his or her power can be reduced most effectively. And all political power is unitary.

It’s mostly a fool’s game trying to figure out just what Trump was trying to achieve nominating this group of clowns for most of the top Cabinet positions. Simple loyalty was a big factor, people who won’t flinch from doing whatever Trump says. They’re also all good on TV, or, at least, what Trump thinks is good on TV. But really it was a power play. It’s Caligula appointing his horse to the Senate. The absurdity is the point. I can do anything. Make the Republican Senate line up and approve a roster of manifestly unqualified nominees. But they’re going down one after another.

They’re doing it in a particular GOP senator way — all through winks and shadows, pregnant sighs. As far as I know, no Republican senator said they wouldn’t vote for Matt Gaetz, just as none has said so about Hegseth. On the pod Kate and I recorded this afternoon, we noted that if this were Biden’s or Harris’ transition, watching the top nominees go down in flames would be treated like the presidency itself was DOA. But not having a fancy Times or Politico columnist say it doesn’t make it any less so. Trump’s ability to just dictate isn’t quite panning out. And that matters.

Links 12/5/24

Links for you. Science:

Bumblebee population increases 116 times over in ‘remarkable’ Scotland rewilding project
150 million people born before 1996 at risk of a mental breakdown caused by metal exposure
Why ‘open’ AI systems are actually closed, and why this matters
Trump’s former FDA commissioner warns RFK Jr. could ‘cost lives’ if confirmed
How steam from a Wisconsin factory fueled a 100-mile band of snow
Orcas have learned how to hunt and kill huge whale sharks, video shows

Other:

If Anyone Can Save the Democrats, It’s Ben Wikler
Breaking the norms of death: A healthcare CEO was shot dead. Many are glad. What now?
What happened to Intel? Intel has ejected its CEO. There are many possible reasons why.
The Sound of Fear on Air: It is an ominous sign that Morning Joe felt it had to apologize for something I said. (just as we need to know much more about editors at newspapers–who they are to start with–we need to know much more about news TV producers)
People’s Reactions to a Health Insurance CEO Getting Assassinated Are Incredibly Dark
Democrats flip House seat, leaving GOP no votes to spare
European Federation of Journalists to stop posting content on X
With pardon of son, Biden shows a way forward for Democrats
Nadeau’s Push to Regulate Delivery Drivers Draws Fierce Pushback from Gig Work Companies and Their Lobbyists
Some in the U.S. farm industry are alarmed by Trump’s embrace of RFK Jr. and tariffs (leopards eating faces etc.)
Trump Picks Cryptocurrency Advocate Paul Atkins For SEC Chair
Donald Trump is ready to make Republicans touch the third rail. Without a voting public to face again, Trump is gearing up to cut Social Security and Medicare
The Pardon
Tesla suspends Cybertruck production. Who could have predicted this?
House Republican Wants Party To Boldly Own Plans To Gut The Social Safety Net
Number of Indictments and Convictions of Biden White House Appointees: Zero
FBI Warns Americans to Start Using Encrypted Messaging Apps. It’s all about protecting against China, but there’s the added benefit of protecting against Trump.
Trump aide Monica Crowley plagiarized thousands of words in Ph.D. dissertation (she’s being considered for a new job…)
The American People Deserve DOGE. America has an efficiency problem, but Elon Musk is not the man to fix it.
Trump’s nominee for secretary of defense should alarm all of us
Nancy Mace’s never-ending chase for 15 more minutes of fame
HIV prevention pills should be free, but insurers are still charging
Cryptic three words carved into bullets used to kill UnitedHealthcare CEO Brian Thompson revealed
Biden Owes The Country More Than One YOLO Power Play
Democrats Are Leaving X. But X Left Them First. Elon Musk fundamentally changed the terms of the platform.
Get Off the Floor and Keep It Simple

November Employment Preview

On Friday at 8:30 AM ET, the BLS will release the employment report for November. The consensus is for 183,000 jobs added, and for the unemployment rate to be unchanged at 4.1%.

There were 12,000 jobs added in October, and the unemployment rate was at 4.1%.

From Goldman Sachs:
We estimate nonfarm payrolls rose by 235k in November, above consensus of +215k ... the end of strikes and the recent hurricanes that weighed on October job growth will likely boost November job growth. We estimate that the unemployment rate was unchanged at 4.1%, in line with consensus.
emphasis added
ADP Report: The ADP employment report showed 146,000 private sector jobs were added in November.  This was below consensus forecasts and suggests job gains below consensus expectations; however, in general, ADP hasn't been very useful in forecasting the BLS report (it also doesn't include the boost from the end of the Boeing strike and the bounce back from the hurricane impact in October).

ISM Surveys: Note that the ISM indexes are diffusion indexes based on the number of firms hiring (not the number of hires).  The ISM® manufacturing employment index increased to 48.1% from 44.4%.   This would suggest about 30,000 jobs lost in manufacturing. The ADP report indicated 26,000 manufacturing jobs lost in November.

The ISM® services employment index decreased to 51.5% from 53.0%. This would suggest 115,000 jobs added in the service sector. Combined this suggests 85,000 jobs added, far below consensus expectations.  (Note: The ISM surveys have been way off recently)

Unemployment Claims: The weekly claims report showed fewer initial unemployment claims during the reference week, at 215,000 in November compared to 242,000 in October.  This suggests fewer layoffs in November compared to October.

Strikes: The CES strike report shows almost 40,000 employees returned from strikes during the reference period in November. This will boost the headline jobs number.

Conclusion: Employment was impacted by strikes and hurricanes in October.  There should be a bounce back in November.  In the four months prior to October, employment gains averaged 140 thousand.  Adding close to 40 thousand for the strikes, and maybe 50 thousand workers returning following the hurricane impact in October, would suggest employment gains will be above consensus expectations.
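The arithmetic behind that conclusion can be checked in a few lines. This is a back-of-the-envelope sketch using the post's own round numbers; the hurricane figure in particular is a rough guess, not an official statistic:

```python
# Back-of-the-envelope November payrolls estimate from the pieces above.
# All inputs are the post's round numbers; "hurricane_return" especially
# is an assumption, not a BLS figure.
trend = 140_000            # average monthly gain, four months before October
strike_return = 40_000     # workers returning from strikes (CES strike report)
hurricane_return = 50_000  # assumed bounce-back from October's hurricane drag
consensus = 183_000

estimate = trend + strike_return + hurricane_return
print(f"estimate: {estimate:,} vs consensus: {consensus:,}")
# prints: estimate: 230,000 vs consensus: 183,000
```

On these assumptions the estimate lands well above the 183,000 consensus, which is the post's point.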

Realtor.com Reports Active Inventory Up 25.9% YoY

What this means: On a weekly basis, Realtor.com reports the year-over-year change in active inventory and new listings. On a monthly basis, they report total inventory. For November, Realtor.com reported inventory was up 26.2% YoY, but still down 21.5% compared to the 2017 to 2019 same month levels. 

 Now - on a weekly basis - inventory is up 25.9% YoY.

Realtor.com has monthly and weekly data on the existing home market. Here is their weekly report: Weekly Housing Trends View—Data for Week Ending Nov. 30, 2024
Active inventory increased, with for-sale homes 25.9% above year-ago levels

For the 56th consecutive week, the number of homes for sale has increased compared with the same time last year. However, this week’s growth was smaller than last week’s, marking the ninth consecutive week of deceleration and tied for the smallest annual increase since late March. Sluggish listing activity, combined with subdued buyer demand, has contributed to this slowdown in inventory growth.

New listings—a measure of sellers putting homes up for sale—plummeted 29% during an idle Thanksgiving week

The number of newly listed homes plummeted 29% last week. While some of the drop may be due to a mortgage rate environment that remains persistently high, most of the large decrease is likely due to the Thanksgiving holiday as sellers are likely deciding to hold off listing their home until buyers are less occupied with their holiday festivities.
Here is a graph of the year-over-year change in inventory according to Realtor.com.

Inventory was up year-over-year for the 56th consecutive week.  

However, inventory is still historically low.

New listings remain below typical pre-pandemic levels.

Transitions, NASA, and next steps

It’s easy to imagine a long list of reasons transitions cause distress and terminal distraction. Evolution passed along in us only so much appetite for risk, novelty, and shifts away from the familiar. Is that mysterious cave calling? Become lunch. Alternately, too much of a taste for stability could prove fatal, too. Leave those well-known …

Why are no trillion dollar companies being created in Europe?

That is the theme of a new Substack by Pieter Garicano, here is one excerpt:

These answers, according to a recent paper by Olivier Coste and Yann Coatanlem, two French investors, miss the point: the reason more capital doesn’t flow towards high-leverage ideas in Europe is because the price of failure is too high.

Coste estimates that, for a large enterprise, doing a significant restructuring in the US costs a company roughly two to four months of pay per worker. In France, that cost averages around 24 months of pay. In Germany, 30 months. In total, Coste and Coatanlem estimate restructuring costs are approximately ten times greater in Western Europe than in the United States…

Consider a simple example. Two large companies are considering whether to pursue a high risk innovation. The probability of success is estimated at one in five. Upon success they obtain profits of $100 million, and the investment costs $15 million.

One of the companies is in California, where if the innovation fails the restructuring costs $1 million. The other company is in Germany, where restructuring is 10x more expensive, it costs $10 million (a conservative estimate).

The expected value of this investment in California is a profit of $5 million. In Germany the expected value is a loss of $3 million.
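As a check on the arithmetic, here is a minimal sketch of that expected-value calculation, assuming the restructuring cost is paid only on failure. On that reading the California case works out to about $4.2 million (in the ballpark of the roughly $5 million cited) and the German case to a $3 million loss, matching the text:

```python
# Expected value of the one-in-five bet described above, in $ millions.
# Assumption: the restructuring cost is incurred only when the project fails.
def expected_value(p_success, profit, investment, restructuring):
    return p_success * profit - investment - (1 - p_success) * restructuring

ev_california = expected_value(0.2, 100, 15, 1)   # about 4.2
ev_germany = expected_value(0.2, 100, 15, 10)     # -3.0
```

The sign flip between the two cases, not the exact figures, is what drives the argument: the same bet is worth taking in California and not in Germany.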

Recommended.

The post Why are no trillion dollar companies being created in Europe? appeared first on Marginal REVOLUTION.

       


Thursday assorted links

1. A corporate comms joke.

2. These South Korean Catholic priests should be bloggers.

3. Zero-based regulation by James Broughel.  And from Alex Adams.

4. RLHF propaganda posters, funny stuff.

5. Scientific breakthroughs of 2024.

6. Weather forecasting breakthroughs with AI.

7. Scott Sumner on Cowen and Tabarrok on money.  The key I think is to have a theory that explains why nominal variables sometimes forecast very well, and other times not well at all — this is very hard!  I read Scott as significantly overrating the forecasting power of the nominal in the data.

The post Thursday assorted links appeared first on Marginal REVOLUTION.

       


 

Asking Rents Mostly Unchanged Year-over-year

Today, in the Real Estate Newsletter: Asking Rents Mostly Unchanged Year-over-year

Brief excerpt:
Another monthly update on rents.

Tracking rents is important for understanding the dynamics of the housing market. Slower household formation and increased supply (more multi-family completions) have kept asking rents under pressure. ...

Welcome to the December 2024 Apartment List National Rent Report. The national median rent dipped by 0.8% in November, as we get further into the slow season for the rental market. Nationwide rent fell $12 to $1,382, and we’re likely to see that number dip one more time before the year ends. ...

Realtor.com: 15th Consecutive Month with Year-over-year Decline in Rents

In October 2024, the U.S. median rent continued to decline year-over-year for the fifteenth month in a row, down $14 or -0.8% year-over-year for 0-2 bedroom properties across the top 50 metros, faster than the rate of -0.5% seen in September 2024.

The hidden cost of Chinese loans

Governments that borrow from China must pay more to borrow from others

Trade Deficit decreased to $73.8 Billion in October

The Census Bureau and the Bureau of Economic Analysis reported:
The U.S. Census Bureau and the U.S. Bureau of Economic Analysis announced today that the goods and services deficit was $73.8 billion in October, down $10.0 billion from $83.8 billion in September, revised.

October exports were $265.7 billion, $4.3 billion less than September exports. October imports were $339.6 billion, $14.3 billion less than September imports.
emphasis added
Click on graph for larger image.

Exports and imports decreased in October.

Exports are up 1.9% year-over-year; imports are up 4.4% year-over-year.

Both imports and exports decreased sharply due to COVID-19 and then bounced back - imports and exports have generally increased recently.

The second graph shows the U.S. trade deficit, with and without petroleum.

The blue line is the total deficit, the black line is the petroleum deficit, and the red line is the trade deficit ex-petroleum products.

Note that net, exports of petroleum products are positive and have been increasing.

The trade deficit with China increased to $28.0 billion from $25.7 billion a year ago.

It is possible some importers are trying to beat potential tariffs.

Weekly Initial Unemployment Claims Increase to 224,000

The DOL reported:
In the week ending November 30, the advance figure for seasonally adjusted initial claims was 224,000, an increase of 9,000 from the previous week's revised level. The previous week's level was revised up by 2,000 from 213,000 to 215,000. The 4-week moving average was 218,250, an increase of 750 from the previous week's revised average. The previous week's average was revised up by 500 from 217,000 to 217,500.
emphasis added
The following graph shows the 4-week moving average of weekly claims since 1971.

Click on graph for larger image.

The dashed line on the graph is the current 4-week average. The four-week average of weekly unemployment claims increased to 218,250.

The previous week was revised up.

Weekly claims were close to the consensus forecast.

SpaceX launches latest broadcast satellite for SiriusXM on Falcon 9 rocket from the Kennedy Space Center

This rendering shows what the Maxar-built SXM-9 will look like once on orbit. Image: Maxar

With a late Thursday morning liftoff, SpaceX sent the latest satellite in SiriusXM’s radio broadcast fleet on a geostationary transfer orbit trajectory.

The SXM-9 satellite lifted off on a Falcon 9 rocket from NASA’s Kennedy Space Center at the opening of the 90-minute launch window at 11:10 a.m. EST (1610 UTC).

The Falcon 9 first stage booster for this mission, tail number B1076 in the SpaceX fleet, launched for a 19th time. It previously supported the launches of CRS-26, Intelsat IS-40e, Ovzon 3 and 10 previous Starlink missions.

Nearly 8.5 minutes after liftoff, B1076 touched down on the SpaceX droneship, ‘Just Read the Instructions.’ This marked the 101st booster landing on JRTI and the 379th booster landing to date.

A rendering of the SXM-9 mission patch. Graphic: SpaceX

Maxar Technologies (NYSE: MAXR) is once again the satellite manufacturer behind the SXM-9 satellite. The company has built satellites for SiriusXM (NASDAQ: SIRI) going back to 2000, when it manufactured the first-generation Sirius satellites.

SXM-9 is built on Maxar’s 1300-series bus as part of a deal announced by Maxar back in August 2021.

“Maxar’s 1300-class platform has served as a reliable spacecraft platform for decades, and we’re glad to see SiriusXM will rely on its performance once again,” said Robert Curbeam, Maxar Senior Vice President of Space Capture, in a 2021 statement. “We’re looking forward to continuing our decades-long relationship with SiriusXM.”

This was the second satellite launched for SiriusXM since the on-orbit failure of SXM-7, which launched back in December 2020. The roughly 8.2-meter-tall (27 ft), 7,000 kg (15,432 lb) SXM-8 launched in June 2021 and completed on-orbit testing a month later.

SXM-9 has the same dimensions and features a 9 m (29.5 ft) diameter unfurlable reflector antenna, which was manufactured by L3Harris Technologies. L3Harris was also tapped to supply the antenna for the forthcoming SXM-10 satellite.

The SXM-9 satellite stands in its launch configuration at a Maxar Space Systems facility prior to being shipped to Florida ahead of its launch. Image: Maxar

Medical Treatments for Transgender Minors--Oral argument in Supreme Court

 Yesterday the Supreme Court heard oral arguments about the Tennessee ban on transgender treatment for minors.

Supreme Ct. Hears Case on Medical Treatments for Transgender Minors
"The Supreme Court heard oral argument in United States v. Skrmetti, a case on whether Tennessee’s ban on transgender medical treatments for minors violated the Equal Protection Clause of the 14th Amendment. Tennessee enacted its law in March of 2023, which stated that there was a “compelling interest” to protect minors from physical and emotional harm by banning health care providers from administering hormone/puberty blockers and surgery to minors for transgender purposes. Transgender minors and their families sued the state, and the Justice Department intervened on their behalf, arguing the law discriminated on the basis of sex. A district court then stopped the ban on hormone and puberty blockers, but the Sixth Circuit Court of Appeals reversed that decision. The Justice Department then appealed to the Supreme Court. Chase Strangio, who argued on behalf of trans minors and their parents, was the first openly transgender lawyer to argue before the Court."

Opening statement (text compiled from uncorrected Closed Captioning):

"MR. CHIEF JUSTICE, AND MAY IT PLEASE THE COURT, THIS CASE IS ABOUT ACCESS TO MEDICATIONS THAT HAVE BEEN SAFELY PRESCRIBED FOR DECADES TO TREAT MANY CONDITIONS INCLUDING GENDER DYSPHORIA. BUT SB-1 SINGLES OUT AND BANS ONE PARTICULAR USE. IN TENNESSEE THESE MEDICATIONS CAN'T BE PRESCRIBED TO ALLOW A MINOR TO IDENTIFY WITH OR LIVE AS A GENDER INCONSISTENT WITH THE MINOR SEX. IT DOESN'T MATTER WHAT PARENTS DECIDE IS BEST FOR THEIR CHILDREN. IT DOESN'T MATTER WHAT PATIENTS WOULD CHOOSE FOR THEMSELVES, AND IT DOESN'T MATTER IF DOCTORS BELIEVE THIS TREATMENT IS ESSENTIAL FOR INDIVIDUAL PATIENTS. SB 1 CATEGORICALLY BANS TREATMENT WHEN AND ONLY WHEN IT'S INCONSISTENT WITH THE PATIENT'S BIRTH SEX. TENNESSEE SAYS THAT SWEEPING BAN IS JUSTIFIED TO PROTECT ADOLESCENT HEALTH, BUT THE STATE MAINLY ARGUES THAT IT HAD NO OBLIGATION TO JUSTIFY THE LAW AND THAT SB 1 SHOULD BE UPHELD SO LONG AS IT'S NOT WHOLLY IRRATIONAL. THAT'S WRONG. SB 1 REGULATES BY DRAWING SEX-BASED LINES AND DECLARES THAT THOSE LINES ARE DESIGNED TO ENCOURAGE MINORS TO APPRECIATE THEIR SEX. THE LAW RESTRICTS MEDICAL CARE ONLY WHEN PROVIDED TO INDUCE PHYSICAL EFFECTS INCONSISTENT WITH BIRTH SEX. SOMEONE ASSIGNED FEMALE AT BIRTH CAN'T RECEIVE MEDICATION TO LIVE AS A MALE, BUT SOMEONE ASSIGNED MALE CAN. IF YOU CHANGE THE INDIVIDUAL SEX, IT CHANGES THE RESULT. THAT'S A SEX CLASSIFICATION FULL STOP, AND A LAW LIKE THAT CAN'T STAND ON BARE RATIONALITY. HERE TENNESSEE MADE NO ATTEMPT TO TAILOR ITS LAW TO ITS STATED HEALTH CONCERNS. RATHER THAN IMPOSE MEASURED GUARDRAILS SB 1 BANS THE CARE OUTRIGHT NO MATTER HOW CRITICAL IT IS FOR AN INDIVIDUAL PATIENT. THAT IS A STARK DEPARTURE OF PEDIATRIC CARE IN ALL OTHER CONTEXT. SB 1 LEAVES THE SAME MEDICATIONS AND MANY OTHERS ENTIRELY UNRESTRICTED WHEN USED FOR ANY OTHER PURPOSE EVEN WHEN THOSE USES PREVENT SIMILAR RISKS. 
THE SIXTH CIRCUIT NEVER CONSIDERED WHETHER TENNESSEE COULD JUSTIFY THAT SEX-BASED LINE BECAUSE THE EQUAL PROTECTION CLAUSE REQUIRES MORE, THIS COURT SHOULD REMAND SO THAT SB 1 CAN BE UNDER THE CORRECT STANDARD. I WELCOME THE COURT'S QUESTIONS. 

########

HT: Kim Krawiec

 

Medpage Today summarized the hearings under this headline:

Supreme Court Appears Likely to Uphold Bans on Transgender Care for Minors
— Justices' decision is not expected for several months

How sports gambling became ubiquitous

Europe is at the centre of the industry’s growth

Cronyism is a problem. But not always an economic one

Research on the topic is surprisingly nuanced

France is not alone in its fiscal woes

Deficits look worryingly wide across Europe

MAGA types have a point on debanking

A booming compliance industry is causing problems

Strange beasts

A man with a long beard wearing glasses holding a net against a backdrop of a grassy field and cloudy sky.

‘Tiny problems become big ones when tigers are involved’: the day a young (and reckless) animal keeper danced with death

- by Aeon Video

Watch at Aeon

Xi Jinping’s campaign against gambling is a failure

Chinese citizens go to great lengths to bet

The order of anarchy

Black and white photo of cars in traffic on a multi-lane motorway seen from above.

How San Francisco’s free rides system can help us understand anarchist theory and the work of the late, great James C Scott

- by Reyko Huang

Read at Aeon

Does the Sun return to the same spot on the sky every day?


Before AI replaces you, it will improve you, Philippines edition

Bahala says each of his calls at Concentrix is monitored by an artificial intelligence (AI) program that checks his performance. He says his volume of calls has increased under the AI’s watch. At his previous call center job, without an AI program, he answered at most 30 calls per eight-hour shift. Now, he gets through that many before lunchtime. He gets help from an AI “co-pilot,” an assistant that pulls up caller information and makes suggestions in real time.

“The co-pilot is helpful,” he says. “But I have to please the AI. The average handling time for each call is 5 to 7 minutes. I can’t go beyond that.”

Here is more from Michael Beltran, via Fred Smalkin.

The post Before AI replaces you, it will improve you, Philippines edition appeared first on Marginal REVOLUTION.

       


 

Trump nominates Jared Isaacman to become the next NASA administrator

President-elect Donald Trump announced Wednesday he has selected Jared Isaacman, a billionaire businessman and space enthusiast who twice flew to orbit with SpaceX, to become the next NASA administrator.

"I am delighted to nominate Jared Isaacman, an accomplished business leader, philanthropist, pilot, and astronaut, as Administrator of the National Aeronautics and Space Administration (NASA)," Trump posted on his social media platform, Truth Social. "Jared will drive NASA’s mission of discovery and inspiration, paving the way for groundbreaking achievements in space science, technology, and exploration."

In a post on X, Isaacman said he was "honored" to receive Trump's nomination.


Why money is important

In a previous post, I referred to an Alex Tabarrok and Tyler Cowen discussion of the 1970s inflation, an area where my views are fairly similar to theirs. Today I’ll look at their new podcast on the “New Monetary Economics” (which is now pretty old), an area where I have some major disagreements. Indeed, I’ll argue that their discussion mostly misses the point, largely due to confusion over money’s role as a medium of account.

The three economists most associated with the New Monetary Economics were Fischer Black, Eugene Fama, and Robert Hall. I would argue that Earl Thompson was equally important, although perhaps less well known. I also did some work in the NME tradition during the 1980s, including a paper calling for the central bank to peg the price of NGDP futures contracts. Tyler Cowen co-authored a book on the subject with Randall Kroszner.

Unlike Fama and Hall, Fischer Black failed to understand the special role of the medium of account. Under our current system, the monetary base (cash plus bank reserves) is the medium of account. Under the old gold standard, the medium of account was gold, and the US dollar was defined as 1/20.67 ounces of gold. Under that regime, “US dollar” was the unit of account in the United States, pound sterling was the unit of account in Britain, and gold was the medium of account in both countries. However, you could also argue that the US had dual media of account—gold and currency notes—linked by a fixed exchange rate. I wrote an entire book on the subject.

The vast majority of claims that monetary policy is (nominally) unimportant are based on confusion over the role of the medium of account. Let’s review the many ways that people use the term “money”:

  1. Wealth: “Bill Gates has a lot of money”

  2. The money market: Safe, highly liquid short-term debt.

  3. M2: Bank deposits plus cash

  4. Monetary base: Cash plus bank reserves

Monetary skeptics will say, “You can’t even define money.” But you can precisely define the medium of account, which is the monetary base. Then they’ll say “But the monetary base is only a tiny percentage of our vast financial system.” That’s true, but that fact doesn’t have any implications for the effectiveness of monetary policy.

Those who believe that the monetary base must be big in order to have a big effect on the economy may be implicitly taking a Keynesian view of macro. They see monetary policy as a real factor. You stimulate a lot of real activity, and that creates inflation.

Monetarists see monetary policy as a nominal factor. Changes in the supply (and demand) for the medium of account impact the relative price of the medium of account (cash). Inflation is nothing more than a fall in the relative price of the medium of account, a fall in its purchasing power over goods. When prices rise 1%, the purchasing power of a dollar bill falls by 1%. When you see things this way, it immediately becomes apparent that the share of GDP held as base money has no bearing on the effectiveness of monetary policy.
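That inversion is worth making concrete. A quick check, with purely illustrative numbers (nothing here comes from the post itself):

```python
# Purchasing power is the inverse of the price level: when prices rise 1%,
# a dollar buys 1/1.01 of what it did before -- a fall of about 0.99%,
# i.e. roughly 1%, which is the approximation used in the text.
price_rise = 0.01
purchasing_power_change = 1 / (1 + price_rise) - 1
print(f"{purchasing_power_change:.4%}")  # prints: -0.9901%
```

For small inflation rates the two numbers are nearly identical, which is why "prices up 1%, purchasing power down 1%" is a harmless shorthand.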

It might be helpful to think of a non-monetary analogy. Consider a commodity that is a sizable share of GDP, say crude oil. Now consider another commodity that is a small share of GDP, say kiwi fruit. Obviously, oil shocks have a bigger impact on GDP than kiwi market shocks, in a real sense. This is probably why people (wrongly) assume that the monetary base’s small share of our financial system is important. In a real sense, big commodities really are far more important than small commodities.

But in a nominal sense, the two commodities are quite similar. A big increase in oil supply will often depress oil prices (depending on demand). But it’s equally true that a big increase in kiwi fruit supply will generally depress kiwi fruit prices. In both cases, the purchasing power of the commodity in question will decline, relative to other goods and services. When the relative price of oil falls sharply, a barrel of oil buys fewer goods and services. The same is true when the relative price of kiwi fruit falls sharply. The laws of supply and demand don’t stop working because a commodity is small.

It doesn’t matter whether the monetary base is 10% of GDP, 1% of GDP, or 0.001% of GDP. The central bank has a monopoly on base money, and by affecting the supply of base money (through open market operations) and the demand for base money (through interest on reserves), it can dramatically impact the relative price of base money, aka the price level of goods and services. What makes money special is that all other goods are priced in terms of money, not in terms of oil or kiwi fruit. So it matters a lot (in a nominal sense) when the value of money falls.

BTW, under the old gold standard, gold stocks were an even smaller share of GDP than currency. So if the monetary base doesn’t matter because it is small, then all the standard models of the gold standard are also wrong.

The Fed now tries to adjust the supply and demand for base money in such a way as to depreciate the dollar bill at roughly 2% a year, on average. Under the 19th century gold standard, the average inflation rate was roughly zero, as the global gold supply rose at approximately the same rate as gold demand. During the 1970s (before inflation targeting), we had high inflation because the supply of base money rose much faster than demand. Since 1992, inflation has averaged about 2%, with a brief but notable overshoot in 2021-22. Coincidence? I don’t think so. The Fed made that happen.

It’s clear that the financial markets agree with me (and Fama and Hall and Thompson) and disagree with Black. Even hints of a modest shift in Fed policy often cause large swings in asset prices.

Even worse, if monetary policy really were ineffective, then it’s not even clear the Fed could control interest rates. Consider two possibilities:

  1. The Fed has no ability to control interest rates.

  2. The Fed can control interest rates, but this doesn’t matter.

Is either one of those the argument that you wish to make?

Some might argue that they can control nominal interest rates, but not real interest rates. But (as we’ll see in a moment) proponents of that view don’t seem to understand its implication. That sort of claim implies that the Fed can control inflation; indeed it implies that inflation moves one for one with nominal interest rate movements engineered by the Fed. That’s basically what people mean by “NeoFisherism”. There’s an even more bizarre form of monetary skepticism, which suggests that the Fed can control both real and nominal interest rates, and yet monetary policy is still ineffective. This view is associated with MMTers.
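The NeoFisherian implication follows from the Fisher identity: nominal rate = real rate + expected inflation. A tiny sketch with assumed numbers (my arithmetic, not from the transcript) of why controlling nominal rates without being able to move real rates implies controlling expected inflation:

```python
# Fisher identity: i = r + pi_e. If the Fed pins the nominal rate i but cannot
# move the real rate r, any engineered change in i must show up one-for-one in
# expected inflation pi_e -- the "NeoFisherian" implication.

r = 0.01                 # real rate, assumed fixed by non-monetary forces
i_old, i_new = 0.04, 0.06

pi_old = i_old - r       # implied expected inflation: 3%
pi_new = i_new - r       # implied expected inflation: 5%

print(f"{pi_new - pi_old:.2f}")  # inflation expectations move one-for-one with i
```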

[As an aside, there is a completely separate sense in which monetary policy might be ineffective. It is possible that in a world of flexible wages and prices, the Fed could control nominal variables, but have no impact on real variables such as output and employment. This extreme “real business cycle” perspective is probably wrong, but even if it were true it has no bearing on Fischer Black’s claim that monetary policy doesn’t even affect nominal variables.]

Part 2: Reply to Tabarrok/Cowen

Monetary skeptics often skip around from one issue to another, and it can be difficult to pin down the exact source of disagreement. For instance, Tabarrok discusses an economy with no money:

The first is I think there’s no role for monetary policy. Well, why not? Well, there’s no money. There’s only financial assets. The Fed in this world, they could sell T-Bills or buy T-Bills, but so what? Selling T-Bills in this world is just swapping one financial asset for another asset. Swapping assets doesn’t change the real structure of the economy. It’s just a financial change, not a real change.

It’s not clear exactly what sort of system is being envisioned. Is there literally no medium of account? Is it a barter economy? There have been a number of papers discussing the possibility of a “moneyless economy”, but on closer inspection those generally do still feature a medium of account, often settlement balances at the central bank. An excellent paper by Bennett McCallum explores some of these issues. From my perspective, that’s not actually a moneyless economy, and monetary policy remains highly effective. In pure barter, there is no money and no monetary policy.

Alex continues:

Now the second important thing about this world is that it actually seems very close to our world. Most of my transactions accounts, the accounts which I can easily spend, they’re actually invested in bonds. Now, it is true when I buy something I don’t literally transfer $100 worth of bonds, instead I sell $100 worth of bonds, I transfer the $100, and then the person on the other end probably takes the $100 and invests it in bonds.

It’s only slightly different from the world that I described. There’s a few transactions into dollars and then dollars into bonds, but mostly we’re just trading bonds.

That “slight difference” is actually all-important. Even if base money is only a tiny percentage of the financial system, changes in the value of base money impact all nominal variables, including nominal GDP. No matter how small the ratio of MB/NGDP, the Fed has almost unlimited control over the value of the dollar and thus NGDP.

Do you think that in our sophisticated financial system, a system in which trillions of dollars of assets are exchanged every day and people on Wall Street care about reducing trading times by microseconds—this is the world in which we routinely swap interest rates, exchange rates, put options, call options, take out options on all kinds of eventualities—in this world, you’re telling me that the power of the Fed, which many people see as tremendous, huge, you’re telling me that the power of the Fed ultimately rests on the fact that people pay for candy bars with cash, or that we trade bonds in three steps instead of one? Put that way, Fischer Black and Fama seemed to be onto something.

Yes, I really think that.

Cowen then corrects Tabarrok, who mischaracterized Fama’s views, but then he mischaracterizes my view:

COWEN: I agree with what you’re saying. I would draw somewhat of a distinction between Black and Fama. Black just thought money didn’t matter. Fama, like actually Scott Sumner still believes today, he thinks or thought in 1980—Sumner still thinks it—that currency was a unique lever that could force the whole system of prices up or down in a simple quantity theory relationship. Like you, I just don’t believe that.

Note that Tabarrok and Cowen aren’t really discussing the New Monetary Economics, which mostly accepts the crucial role of the medium of account; they are discussing the heretical views of Fischer Black. (Fama and Hall shared my belief that the monetary base is a “unique lever”.)

As for the “simple quantity theory relationship”, that’s the view that V is constant and NGDP moves in proportion to M. I don’t know of a single economist who holds that view. I favor adjusting M to offset V, in order to keep NGDP growing along a stable path. Many mainstream economists have recently joined this club, and I wouldn’t describe people like Summers, Romer and Woodford as simple quantity theorists.

[Of course they would say they favored a policy of moving interest rates in such a way that M moved in such a way as to offset movements in V. But if you favor NGDP targeting, you must implicitly favor a stable path of M*V. Right??]
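The bracketed point can be sketched numerically. This is my own toy illustration, not a model from the post: an NGDP targeter sets M each period to offset whatever velocity does, keeping M*V on the target path.

```python
# Toy NGDP targeting: adjust M to offset movements in V, so that M*V stays
# on a stable growth path. All numbers are invented for illustration.

target_ngdp_path = [100.0, 104.0, 108.16]   # assumed 4% stable NGDP growth path
velocity = [4.0, 3.6, 4.4]                  # assumed velocity shocks

for ngdp, v in zip(target_ngdp_path, velocity):
    m = ngdp / v                            # set M so that M*V hits the target
    print(f"V={v:.1f} -> set M={m:.2f}, so M*V={m * v:.2f}")
```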

Perhaps Tyler is confused because when I introduce people to monetary economics, I often begin with the simple quantity theory as a sort of intuition pump, before moving on to more realistic models.

Tyler continues:

It would be like saying, “Well, if we doubled the supply of nickels and we used nickels with the price level doubled, maybe some prices would go up where you use spare change,” but it’s not worth arbitraging nickels enough for that to affect the whole economy.

Again, this may reflect Tyler misinterpreting something I said in The Money Illusion. It is true that just dumping lots of nickels on the economy would have little effect. You’d have a surplus (or shortage if you suddenly removed them.) But if you adjust monetary policy in such a way that the equilibrium quantity of coins changes, then the effects would be very large. It’s a subtle distinction, and probably one that many readers missed.

Now my view isn’t the Fed doesn’t matter at all. In my view, liquidity is jointly produced by the private sector and by the Fed, but the Fed matters much less over time. It has a much smaller role in producing the total liquidity of our stocks of wealth and thus total lines of credit.

I see absolutely no evidence for that claim. The base is a far larger share of GDP than in the past. Perhaps you wish to restrict the Fed’s footprint to the currency stock (as reserves are bloated due to IOR). But even currency in circulation is now a larger percentage of GDP (roughly 8%) than it was back when I was born (7% in 1955.) And of course the Fed still has an enormous impact on the broader financial markets.

The Fed can matter at extreme margins if they raised interest rates high enough. Clearly that would matter. There’s different ways they could let the financial system blow itself up, that would matter. At most relevant margins, the private financial sector can offset changes in what the Fed does if it wants to, it may or may not want to, but there’s no simple layer of control.

Interest rates are not monetary policy. It’s true that IOR is one tool of monetary policy, but what actually matters is the supply and demand for base money. To suggest that higher interest rates represent a change in monetary policy is to engage in “reasoning from a price change.” What matters is not what happens to interest rates; rather, what matters is the thing that caused interest rates to change. Rates may rise because of tight money (liquidity effect), or easy money (income and Fisher effects.)

No firm or individual tries to “offset” monetary policy, they try to maximize profits, or maximize utility. Go back to the pre-IOR era, before 2008 (a time when the base was 98% currency), to make things simple. When the Fed did a large exogenous open market purchase that boosted the monetary base by 10%, the public did not try to “offset” that policy. If the public preferred to hold 8% of GDP as currency, they would not engage in the following sort of reasoning:

“Hmmm, I see the Fed increased the base by 10%. They seem to be trying to boost NGDP by 10%. We can all collude to thwart this policy by boosting our preferred currency holdings to 8.8% of GDP, instead of the previous 8% of GDP.”

Why would people do that? Rather, the public would be annoyed at the temporary surge in cash balances, beyond what they normally hold. They would attempt to get rid of the excess cash balances by spending them on goods, services, and financial assets, eventually driving up AD and NGDP.

I suppose that proponents of this offset hypothesis usually have in mind a situation where the public thwarts a tight money policy by issuing close substitutes for Fed money. But if those substitutes were profitable, why weren’t they issued before the tight money policy? In any case, the Fed could take the likely private sector response into account, and figure that a 10% reduction in base money might only reduce spending by say 5% in the short run, as money substitutes filled some of the gap (i.e. reduced the demand for base money.) No one is suggesting that the simple quantity theory applies to the short run dynamics. But this fact doesn’t prevent monetary policy from being effective.

If you look at the data, like Milton Friedman’s monetarism, which is coming of age in the 1960s, well, for the decades before the mid-1960s, the relationship between money and nominal income it was pretty stable. Friedman was right when he wrote it. There’s a reason why he persuaded so many people, but there’s then a series of ongoing, ever more serious blows to the monetarist regularities.

It starts in the 1980s where the relationship between prices and money supply breaks down. You have the 2008 financial crisis, where the Fed increases bank reserves quite a bit, some people expect hyperinflation, hyperinflation doesn’t come. Well, there’s interest paid on reserves. It’s a very complicated story, but most methods of unpacking that story, I think, are going to support the Black view. The money supply is not a simple thing.

I know that’s the standard view. But Friedman usually focused on M2 (which is not my preferred index.) Note that M2 velocity actually was not unusually unstable during the 1980s, apart from a significant (and normal) decline during the severe 1982 recession.

Is the 1980s velocity collapse another one of those apocryphal stories, like the myth that supply shocks created the 1970s inflation? Or the myth that we ran big budget deficits during the 1960s? Or the myth that the subprime fiasco caused the Great Recession? Or the many “bubble” myths? I wonder.

And as far as hyperinflation predictions, if I actually were a simple-minded quantity theorist, I probably would have predicted high inflation after 2008. But of course I’m not, and I didn’t. I predicted below target inflation, which we had.

TABARROK: Right. Yes, it seems very difficult to believe that currency still rules the roost when currency is such a small amount of the money supply and most of the currency is not even in the United States.

COWEN: That’s right, or it’s in the drug trade.

As we’ve seen, currency is actually an increasing share of GDP. A better argument (which Tyler alludes to) is that there is less currency involved in transactions, and more of it is being hoarded.

But for monetary policy, it doesn’t matter where the currency is located. All that matters is the supply and demand for the medium of account. That’s what determines the value of the medium of account, in our case the US dollar. As an analogy, when we were on the gold standard you could envision a system where 90% of gold was used for jewelry and 10% for money, or a system where it was 90% money and 10% jewelry. In either case, changes in the supply and demand for gold determine its value in much the same way.

Again, it’s easier to see this concept using the pre-2008 system, where the monetary base was 98% currency. The price level equals the nominal quantity of base money (which is determined by the Fed) divided by the real demand for base money (determined by the public). That gives the Fed almost unlimited control over the price level, unless they run out of eligible assets to buy.
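That relationship can be shown with a toy calculation (all figures invented for illustration): the price level is the nominal base supplied by the Fed divided by the real base demand chosen by the public.

```python
# Pre-2008 sketch: P = (nominal base, set by the Fed) / (real base demand,
# chosen by the public). Numbers are made up for illustration.

real_base_demand = 800.0   # public wants $800 of real purchasing power in cash
nominal_base = 800.0       # Fed supplies $800 of base money

print(f"{nominal_base / real_base_demand:.2f}")   # price level = 1.00

nominal_base *= 1.10       # Fed expands the base 10%; real demand unchanged
print(f"{nominal_base / real_base_demand:.2f}")   # price level rises to 1.10
```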

In the post-2008 system, the correlation between the supply of base money and the price level is far weaker, but the Fed has an additional tool that can also impact the real demand for base money, the rate of interest paid on bank reserves. Between changes in the supply of base money (via QE) and the demand for base money (via IOR) they continue to have unlimited ability to affect the value of money—its purchasing power—and hence all nominal variables.

TABARROK: That totally changes how you have to teach monetary policy, because there’s no money multiplier the way that people used to talk about it. Instead what the Fed did in 2009, they started paying interest on reserves and now, banks had always tried to keep the reserves as low as possible, because you’re not making any interest on your reserves. Now the Fed is paying interest on reserves, so reserves go from billions to literally trillions overnight. The banks are now holding trillions in reserves, which are no different than T-bills, right?

COWEN: Well, monetary policy is fiscal policy. It’s like the Fed has the right to issue something—they would never want to hear it called this—like a little Fed mini T-bill. They create reserves, they pay interest on it. In essence, it’s like a government security, which is backed by the Fed and Treasury as a combined entity. You might think it’s a good thing for the Fed to do, but it literally is fiscal policy.

No, T-bills are not like reserves, because they are not the medium of account. And I don’t think it’s useful to call this “fiscal policy”.

Obviously, any monetary policy has fiscal consequences. That’s always been true. Printing money is profitable; it creates inflation tax revenue. But it is still very useful to discriminate between stimulus actions that are expected to create future tax liabilities (deficit spending), and those that (on average but not always) reduce future tax burdens (money printing.)

Cowen continues:

Fiscal policy can matter, no one denies that, but again, it’s just a very different, very strange world. The other change is it used to be that so much lending was done through the banking system in the United States. Now, by some estimates, banks in the formal sense account for about 20 percent of the lending. If the Fed is operating through banks and that’s 20 percent of the system, why should that be so important? Indeed, probably it isn’t, it still has a role, but banks are likely to continue shrinking in importance.

The Fed doesn’t operate “through the banking system”; it operates by changing nominal aggregates such as GDP using tools like OMOs and IOR. Even if the banking system completely disappeared, the Fed could easily keep prices growing along a 2% path through suitable adjustments in the growth rate of the currency stock (by assumption there would be no bank reserves). It would continue to use open market operations to adjust the currency stock.

COWEN: That’s right, which may be a good thing to do, but it causes you to rethink what’s the Fed, actually? The global money supply, not even a well-defined concept, but there’s something global that does matter, right?

Again, you need to think in terms of the medium of account. Is the global money supply the total of all types of money, or the global supply of one specific asset—US dollars? The latter does matter, the former does not.

COWEN: Timothy Fuerst had a good paper on this I think in 1992. There’s a well-known paper by Kevin Grier, coauthors. They show there’s some effect that the Fed can influence real interest rates, but how hard they had to work to show any effect at all. Now, you have this period, the ZIRP period right after 2008, 2009 where the Fed is by one measure, at least—with apologies to Scott Sumner—extremely expansionary. Even real interest rates seem to be below zero for an extended period of time.

Now, there’s the question of how should we interpret that era? I think the huge mistake is to generalize from that era. People just think the Fed can put real interest rates wherever they want, because the Fed, it seems, gave us negative real rates for a decade, but that, too, was a temporary thing.

Tyler’s right that the Fed has relatively little impact on real interest rates, but of course real interest rates have little or nothing to do with monetary policy. Tyler seems to be suggesting that monetary policy was “extremely expansionary” because of negative real interest rates. (Unless he was referring to QE.)

Here it will be helpful to review why interest rates (both real and nominal) are a horrible indicator of the stance of monetary policy, arguably the very worst. Indeed, the use of interest rates is almost a textbook example of reasoning from a price change.

Let’s start with nominal interest rates. Unless your name is Joan Robinson, you probably don’t wish to argue that hyperinflation in economies like Argentina, Venezuela and Zimbabwe cannot possibly have been caused by monetary policy, as interest rates were not low. In case you don’t know what I’m referring to, here’s AI Overview:

English economist Joan Robinson believed that easy money could not have caused German hyperinflation because interest rates were not particularly low. Robinson was a follower of John Maynard Keynes, who argued that monetary policy could only impact demand by changing interest rates.

When I make this argument, people often shift to the claim that, “Yes, nominal interest rates are misleading due to inflation, but surely real rates are a good indicator of the stance of monetary policy.” Nope. Real interest rates are flawed for essentially the same reason. Just as swings in expected inflation cause changes in the equilibrium nominal interest rate, swings in GDP growth, investment booms and busts, and financial distress cause big swings in the equilibrium real interest rate.

Ben Bernanke understood that neither nominal nor real rates are a good indicator of the stance of policy:

The imperfect reliability of money growth as an indicator of monetary policy is unfortunate, because we don’t really have anything satisfactory to replace it. As emphasized by Friedman (in his eleventh proposition) and by Allan Meltzer, nominal interest rates are not good indicators of the stance of policy, as a high nominal interest rate can indicate either monetary tightness or ease, depending on the state of inflation expectations. Indeed, confusing low nominal interest rates with monetary ease was the source of major problems in the 1930s, and it has perhaps been a problem in Japan in recent years as well. The real short-term interest rate, another candidate measure of policy stance, is also imperfect, because it mixes monetary and real influences, such as the rate of productivity growth. . . .

Ultimately, it appears, one can check to see if an economy has a stable monetary background only by looking at macroeconomic indicators such as nominal GDP growth and inflation. On this criterion it appears that modern central bankers have taken Milton Friedman’s advice to heart.

I’m pretty sure that Bernanke mentioned NGDP because he immediately recognized that people would object that supply shocks can impact inflation for non-monetary reasons. That’s why NGDP is best, and this indicator suggests that in 2008-09 the Fed gave us the tightest monetary policy since Herbert Hoover was in office. NGDP growth plunged by 8 percentage points.

But let’s say I’m wrong, and that real rates are the correct indicator of the stance of policy. Then why weren’t economists freaking out during the fall of 2008, when real rates on risk free government bonds rose sharply higher? Indeed, why did almost everyone ignore that indicator?

Because they were all focused on sharply falling nominal interest rates.

This was followed by a discussion of cryptocurrencies. These assets are certainly quite interesting, and perhaps increasingly important. But they are not important for monetary policy because cryptocurrencies are not a significant medium of account, at least in the macroeconomically important goods and labor markets. Perhaps at some point in the future they (especially stablecoins) will eat into demand for base money, and the Fed will be forced to respond with open market sales or higher IOR, in order to prevent high inflation. But we are not there yet.

TABARROK: The Modigliani-Miller theorem says—I’m going to give you a philosophical approach first and then we’ll look in more detail—but philosophically it says, “Look beyond the veil of debt and equity to the productive capital underneath.” Ultimately, both the firm’s debt and equity are owned by households, but clearly aggregate household’s consumption is determined not by the labels which we assign to the productive capital, to the earnings of the productive capital, but to the productive capital itself.

Modigliani-Miller was saying that for a firm, it doesn’t matter. The value of the firm does not depend upon whether it has got a lot of debt and a lot of equity, or a lot of equity and a lot of debt. The debt-equity ratio doesn’t matter. They also say that is true for the economy as a whole, debt and equity they’re just names which we put on these different streams which are all coming from the underneath, the capital, the tractors, the buildings, the roads, the human capital. That’s what’s important.

Of course there are real world complications, but I’m willing to accept all of that for the sake of argument. What are the implications for monetary policy? I’d say this model perfectly aligns with the simple quantity theory of money. Under M-M, if you do a 2 for 1 stock split, then each $80 share is replaced with two $40 shares. The number of shares doubles. The aggregate underlying real value of the firm is unchanged, but each share of stock has only half the real value it had before the stock split.

In the simple QTM, if you double the money supply and real GDP is unchanged, then each dollar bill has only half the purchasing power it had before the money supply doubled. It’s no different from replacing a 36-inch yardstick with an 18-inch measuring stick. Each object you measure looks twice as long.
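The parallel between the stock split and the simple QTM is just arithmetic. A sketch with invented numbers:

```python
# Stock split: doubling the number of claims halves the value of each claim,
# leaving the underlying real total unchanged.
firm_value = 80_000_000.0
shares = 1_000_000
print(firm_value / shares)        # $80 per share

shares *= 2                       # 2-for-1 split
print(firm_value / shares)        # $40 per share; total firm value unchanged

# Simple QTM with V held constant: doubling M doubles the price level,
# halving the purchasing power of each dollar.
real_gdp = 1000.0
money = 100.0
velocity = 10.0
print(money * velocity / real_gdp)   # price level = 1.0

money *= 2                        # double the money supply
print(money * velocity / real_gdp)   # price level = 2.0; each dollar buys half
```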

So the M-M approach to finance perfectly aligns with the conventional models of monetary policy, and yet Tabarrok (and Cowen?) draws exactly the opposite conclusion:

TABARROK: Yes. Again, this is another way of coming at the new monetary economics because it says if you apply Modigliani-Miller to the economy as a whole, then the only thing that the Fed is doing is selling you 2 percent milk or whole milk, right?

COWEN: That’s right, and you can remix.

TABARROK: Yes, you can remix.

COWEN: Borrow more, or borrow less.

TABARROK: Right. Again, the real asset is what counts, not how those payment streams are divvied up. What role is there for monetary policy in a Modigliani-Miller world?

Where do they go wrong? Perhaps they are implicitly assuming that the monetary skepticism of people like Fischer Black is about whether monetary policy can affect real variables. Not so. Black-style monetary skepticism is about whether central banks can control nominal variables, i.e., the purchasing power of a dollar bill. And just as a corporation can affect the purchasing power of a share of stock by changing the quantity of shares in circulation, a central bank can change the purchasing power of a dollar bill by changing the quantity of dollars in circulation. Of course there may be cases where the simple quantity theory doesn’t apply, notably at the zero lower bound, but that doesn’t have anything to do with Modigliani-Miller, which should also apply to monetary policy during “normal times”, if Alex is correct.

There are two different senses in which monetary policy might not matter. It might not matter in nominal terms, especially if the Fed were swapping base money for another asset that was a perfect substitute. But in that case, open market operations wouldn’t even impact nominal interest rates and the pre-2008 Fed would have had no control over market interest rates. AFAIK, even MMTers don’t believe that!

The other sense in which money might not matter is that it might be neutral, even in the short run. An exogenous X% increase in the money supply might cause all nominal values to immediately rise by X%, leaving all real values unchanged. But that’s clearly not true, due to sticky wages and prices. And that’s not the issue Tabarrok and Cowen examined in their discussion.

If you closely follow the way that financial markets react to policy news, it’s pretty clear that they don’t buy into Black-style monetary skepticism. This was especially true during the interwar period, when policy was far more erratic and the monetary shocks were much bigger and easier to identify. Markets also reject MMTism, and I’d say they reject extreme NeoFisherian claims that higher nominal interest rates represent monetary easing. Markets also reject fiscal theory of the price level claims that fiscal policy determines the path of inflation and monetary policy is ineffective.

Some market reactions are consistent with Keynesian theory, but not all. In my view, market responses to monetary policy shocks are most consistent with market monetarism, but then I would say that, wouldn’t I? :)

PS. Why are our coins and currency notes becoming increasingly ugly?

Using AI to analyze changes in pedestrian traffic

That is the topic of my latest Bloomberg column, here is one bit:

Fortunately, there is new research. We have entered the age where innovative methods of measurement, such as computer vision and deep learning, can reveal how American life has changed.

Researchers at the National Bureau of Economic Research compiled footage of four urban public spaces, two in New York and one each in Philadelphia and Boston, from 1979-1980 and again in 2008-2010. These snapshots of American life, roughly 30 years apart, reveal how changes in work and culture might have shaped the way people move and interact on the street.

The videos capture people circulating in two busy Manhattan locations, in Bryant Park in midtown and outside the Metropolitan Museum of Art on the Upper East Side; around Boston’s Downtown Crossing shopping district; and on Chestnut Street in downtown Philadelphia. One piece of good news is that at least when it comes to our street behavior, we don’t seem to have become more solitary. From 1980 to 2010 there was hardly any change in the share of pedestrians walking alone, rising from 67% to 68%.

A bigger change is that average walking speed rose by 15%. So the pace of American life has accelerated, at least in public spaces in the Northeast. Most economists would predict such a result, since the growth in wages has increased the opportunity cost of just walking around. Better to have a quick stroll and get back to your work desk.

The biggest change in behavior was that lingering fell dramatically. The amount of time spent just hanging out dropped by about half across the measured locations. Note that this was seen in places where crime rates have fallen, so this trend was unlikely to have resulted from fear of being mugged. Instead, Americans just don’t use public spaces as they used to. These places now tend to be for moving through, to get somewhere, rather than for enjoying life or hoping to meet other people. There was especially a shift at Boston’s Downtown Crossing. In 1980, 54% of the people there were lingering, whereas by 2010 that had fallen to 14%.

Consistent with this observation, the number of public encounters also fell. You might be no less likely to set off with another person in tow, but you won’t meet up with others as often while you are underway. The notion of downtown as a “public square,” rife with spontaneous or planned encounters, is not what it used to be.

I prefer the new arrangements, but of course not everybody does. The researchers are Arianna Salazar-Miranda, Zhuangyuan Fan, Michael B. Baick, Keith N. Hampton, Fabio Duarte, Becky P.Y. Loo, Edward L. Glaeser, and Carlo Ratti.

The post Using AI to analyze changes in pedestrian traffic appeared first on Marginal REVOLUTION.


DSQL Vignette: Reads and Compute

The easy half of a database system?

In today’s post, I’m going to look at half of what’s under the covers of Aurora DSQL, our new scalable, active-active, SQL database. If you’d like to learn more about the product first, check out the official documentation, which is always a great place to go for the latest information on Aurora DSQL, and how to fit it into your architecture. Today, we’re going to focus on running SQL and doing transactional reads.

But first, let’s talk scalability. One of the most interesting things in DSQL’s architecture is that we can scale compute (SQL execution), read throughput, write throughput, and storage space independently. At a fundamental level, scaling compute in a database system requires disaggregation of storage and compute. If you stick storage and compute together, you end up needing to scale one to scale the other, which is either impossible or uneconomical.

That’s why, when we launched Aurora 10 years ago (nearly to the day!) we chose an architecture which separated compute and storage (from Amazon Aurora: Design Considerations for High Throughput Cloud-Native Relational Databases, SIGMOD’17):

As the paper says:

We use a novel service-oriented architecture (see Figure 1) with a multi-tenant scale-out storage service that abstracts a virtualized segmented redo log and is loosely coupled to a fleet of database instances.

In DSQL, we took this pattern one step further: we changed the interface between the SQL executor and storage to remove the need for a large local cache1 right next to the SQL engine. With that out of the way, we could build a new scalable SQL execution layer which can dynamically scale to meet the needs of nearly any workload.

Compute Scale: Lessons from Lambda

Aurora wasn’t the only big launch at re:Invent 2014. Another big one was AWS Lambda2. AWS Lambda brought a new compute scalability model: the ability to scale up efficiently in small units, each with a single well-defined task to do. Since launching Lambda, we’ve learned a lot about how to do fast, efficient, and dynamic compute scalability, and built some really cool technology to make it happen, like the Firecracker VMM3. Firecracker, and all those lessons from building and operating Lambda, allowed us to build a new kind of compute data plane for Aurora DSQL.

Each transaction inside DSQL runs in a customized Postgres engine inside a Firecracker MicroVM, dedicated to your database. When you connect to DSQL, we make sure there are enough of these MicroVMs to serve your load, and scale up dynamically if needed. We add MicroVMs in the AZs and regions your connections are coming from, keeping your SQL query processor engine as close to your client as possible to optimize for latency6.

We opted to use PostgreSQL here because of its pedigree7, modularity, extensibility, and performance. We’re not using any of the storage or transaction processing parts of PostgreSQL, but we are using the SQL engine, an adapted version of the planner and optimizer, and the client protocol implementation.

Doing Reads

Each DSQL query processor (QP) is an independent unit that never communicates with other DSQL QPs. On the other hand, DSQL offers strongly consistent, strongly isolated, ACID transactions, which typically requires maintaining lock or latch state across all the compute nodes in the database. In a future post, we’ll get into the details of how this works, but for now let’s talk about reads.

START TRANSACTION;
SELECT name, id FROM dogs ORDER BY goodness DESC LIMIT 1;
SELECT stock, id FROM treats WHERE stock > 0 ORDER BY deliciousness DESC LIMIT 1;
COMMIT;

This is a read-only transaction. In DSQL, transactions like these are strongly consistent and snapshot isolated4. That means that this transaction needs to get a point-in-time consistent view of the data in both the dogs and treats tables.

To do that, we start every transaction by picking a transaction start time, $\tau_{start}$. We use EC2’s precision time infrastructure, which provides an accurate clock with strong error bounds. Then, for each read the QP does against storage, it asks storage to perform that read as of $\tau_{start}$. New writes (with $\tau > \tau_{start}$) may be flowing into the system, and reads may go to different storage shards or different replicas, but with this interface we’ll always get a consistent view of the state of the database as of $\tau_{start}$. This ensures that we see all transactions committed before $\tau_{start}$, no transactions committed after $\tau_{start}$, no in-flight transactions, and always experience repeatable reads.

At the storage level, these as of reads are implemented using a classic database technique called multiversion concurrency control (MVCC)5, or multiversioning. The storage engine keeps multiple versions of each row, allowing access to old versions (such as the versions most recent as of $\tau_{start}$) without blocking the creation of new versions. In DSQL’s disaggregated distributed architecture this allows us to entirely avoid coordination between replicas on the read path, have as many replicas as we like, and never block other readers or writers on ongoing reads, or readers on ongoing writes.
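The mechanics can be sketched in a few lines of Python. This is a toy model, not DSQL's actual storage engine (the keys, values, and timestamps below are invented for illustration): a versioned map where writers append new `(commit_ts, value)` pairs and readers pick the newest version committed at or before their $\tau_{start}$, so a fixed read timestamp yields the same answer no matter when, or on which replica, the read executes:

```python
class MVCCStore:
    """Toy multiversion store: each key maps to a list of
    (commit_ts, value) pairs, kept in commit-timestamp order."""

    def __init__(self):
        self.versions = {}  # key -> [(commit_ts, value), ...]

    def write(self, key, value, commit_ts):
        # Writers append a new version; old versions stay readable.
        self.versions.setdefault(key, []).append((commit_ts, value))
        self.versions[key].sort(key=lambda v: v[0])

    def read_as_of(self, key, ts):
        # Return the newest version committed at or before ts,
        # ignoring anything committed later -- no locks needed.
        for commit_ts, value in reversed(self.versions.get(key, [])):
            if commit_ts <= ts:
                return value
        return None  # key didn't exist yet as of ts


store = MVCCStore()
store.write("dogs/1", {"name": "Rex", "goodness": 98}, commit_ts=5)
store.write("treats/7", {"stock": 3}, commit_ts=8)

tau_start = 10  # the transaction's chosen start time
store.write("treats/7", {"stock": 2}, commit_ts=12)  # lands after tau_start

print(store.read_as_of("treats/7", tau_start))  # -> {'stock': 3}
```

Because a read as of $\tau_{start}$ simply skips every version committed after it, readers never block writers and never take locks, which is the property that lets reads fan out to any replica.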

Another key benefit of this coordination-free approach is that we can send reads to the nearest read replica (in the same region, and generally AZ) to reduce cost and latency. Reads never have to go to a leader or a primary to be sequenced or have their lock state maintained, simply because they don’t have any lock state. This is true in read-only transactions, read-write transactions, and even for the reads triggered by writes (e.g. UPDATE is a read-modify-write).

Avoiding Caching and Coherence

Aurora DSQL uses a logical interface to storage. The QP doesn’t ask for pages, it asks for rows. Knowing the logical structure of the data it holds allows DSQL’s storage to offer quite a high-level interface to the QP: the QP can ask storage to do work like filtering, aggregation, projection, and other common tasks on its behalf. Unlike SQL designs that build on K/V stores, this allows DSQL to do much of the heavy lifting of filtering and finding data right next to the data itself, on the storage replicas, without sacrificing scalability of storage or compute.

This, in turn, allows us to avoid the scalability bottleneck of having to have a large, coherent cache8 on-box with SQL execution. In-AZ (or closer) networking, combined with carefully-designed protocols and the ability to push chatty work down, keeps storage fast without the need to cache. We still cache some low-write-rate information (like the list of tables and their definitions).

You can see this in action with EXPLAIN:

explain select key, field0 from usertable where key = 'bob';
                                    QUERY PLAN                                        
------------------------------------------------------------------------------------------
Index Only Scan using usertable_pkey on usertable  (cost=100.17..104.18 rows=1 width=64)
  Index Cond: (key = 'bob'::text)
  Projected via pushdown compute engine: key, field0

Here, the index-only scan on the primary key index on this table (Aurora DSQL tables are index organized) is pushed down to storage, along with the projection of the selected columns. This significantly reduces the number of round-trips between the QP and storage system, with a great impact on performance.
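As a rough illustration of the idea (a hypothetical sketch, not DSQL's actual storage protocol; the table and helper names are invented), imagine the storage replica exposing a scan that accepts a predicate and a column list, so that only matching, projected rows ever cross the network back to the QP:

```python
def scan_with_pushdown(rows, predicate, columns):
    """Toy model of pushdown: the storage replica filters and
    projects locally, returning only the trimmed matching rows."""
    return [{c: row[c] for c in columns} for row in rows if predicate(row)]


# A miniature stand-in for the usertable from the EXPLAIN example.
usertable = [
    {"key": "alice", "field0": "a0", "field1": "a1"},
    {"key": "bob",   "field0": "b0", "field1": "b1"},
]

# SELECT key, field0 FROM usertable WHERE key = 'bob';
result = scan_with_pushdown(
    usertable,
    predicate=lambda r: r["key"] == "bob",
    columns=("key", "field0"),
)
print(result)  # -> [{'key': 'bob', 'field0': 'b0'}]
```

Without pushdown, the QP would pull whole rows over the network and filter them locally; with it, the filter and projection run beside the data, which is what the `Projected via pushdown compute engine` line in the plan is reporting.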

Pushing operations down to storage is a good bet for another reason: Latency Lags Bandwidth. Networks have gotten a lot faster over the last couple decades, but the rate of change of latency has been much slower than the rate of change of bandwidth (partially, this just has to do with speed-of-light limitations). This has been true over multiple decades, and looks set to continue for decades more. That trend means that pushdown, which moves operations close to the storage devices themselves and removes a lot of round-trips, is a good bet for the long-term.

The Big Picture

The overall approach here is disaggregation: we’ve taken each of the critical components of an OLTP database and made it a dedicated service. Each of those services is independently horizontally scalable, most of them are shared-nothing, and each can make the design choices that are most optimal in its domain. This approach is enabled by the extremely fast and reliable networking available in modern data centers, and by designing each component as part of the overall architecture. Tomorrow we’ll go into the write path, which will reveal how the whole picture comes together.

Footnotes

  1. In most database systems, having a large and fast cache next to the SQL execution engine is critical to performance. Managing and sizing this cache dynamically was one of our biggest innovations in Aurora Serverless, which we talk about in our VLDB’24 paper Resource management in Aurora Serverless.
  2. I joined the AWS Lambda team in early 2015, a couple months after being completely blown away by this launch.
  3. You can learn more about Firecracker by checking out our paper Firecracker: Lightweight Virtualization for Serverless Applications from NSDI’20, or checking out the Firecracker source code on Github.
  4. DSQL’s snapshot isolation level is equivalent to PostgreSQL’s REPEATABLE READ isolation level, because PostgreSQL’s REPEATABLE READ level is implemented as snapshot isolation (a good choice by the PostgreSQL folks).
  5. MVCC has been around since the late 1970s, described in David Reed’s 1979 PhD thesis Naming and synchronization in a decentralized computer system and Bernstein and Goodman’s 1981 survey of concurrency control techniques Concurrency Control in Distributed Database Systems. Snapshot isolation has also existed since the 1980s, but the most famous formalization is Berenson et al’s 1995 paper A Critique of ANSI SQL Isolation Levels.
  6. Client interactions with SQL databases are quite chatty, and so tend to be latency sensitive. Most of that is because of the interactive nature of SQL: do some work in the app, do some work in the database, back to the app, back to the database, etc.
  7. Don’t miss Joe Hellerstein’s Looking back at Postgres if you’d like to understand more about the history and pedigree of Postgres.
  8. Exercise for the reader who’s interested in cloud database architecture: why do you think we came to the conclusion that we wanted to avoid a coherent shared cache? Compare that to the conclusions in Ziegler et al’s Is Scalable OLTP in the Cloud a Solved Problem? from CIDR’23 which proposes a very different approach to ours.

Infinite Armada Chess

Stockfish 16 suggests the unconventional opening 1. RuntimeError: Out of bounds memory access

Thursday: Unemployment Claims, Trade Deficit

Mortgage Rates Note: Mortgage rates are from MortgageNewsDaily.com and are for top tier scenarios.

Thursday:
• At 8:30 AM ET, The initial weekly unemployment claims report will be released. The consensus is for 220 thousand initial claims, up from 213 thousand last week.

• Also at 8:30 AM: Trade Balance report for October from the Census Bureau. The consensus is for the trade deficit to be $78.8 billion. The U.S. trade deficit was at $84.4 billion in September.

Wednesday 4 December 1661

To Whitehall with both Sir Williams, thence by water, where I saw a man lie dead upon Westminster Stairs that had been drowned yesterday. To the Temple, and thence to Mr. Phillips and got my copy of Sturtlow lands. So back to the 3 Tuns at Charing Cross, and there met the two Sir Williams and Col. Treswell and Mr. Falconer, and dined there at Sir W. Pen’s cost, and after dinner by water to Cheapside to the painter’s, and there found my wife, and having sat a little she and I by coach to the Opera and Theatre, but coming too late to both, and myself being a little out of tune we returned, and I settled to read in “Mare Clausum” till bedtime, and so to bed.

Read the annotations

The 100 Best Recordings of 2024 (Part 2 of 2)

Over the last year, I’ve devoted hours each day to my obsessive search for outstanding new music. The list below is the result of this ongoing commitment to deep listening.

This is always my biggest project of the year. During the last decade, I’ve listened to more than ten thousand new releases.

Does that sound crazy? Maybe it is.

But this daily immersion into new music lies at the core of my vocation—which is to promote creativity and vitality in the face of the institutional stagnation that’s so damaging to the music business right now (as to so many sectors of the culture).


If you want to support my work, consider taking out a premium subscription—for just $6 per month (even less if you sign up for a year).

Subscribe now


This project is inevitably filled with equal doses of excitement and disappointment. But, once again in 2024, I came away from it with tremendous optimism about the state of contemporary music.

Even if record labels and streaming platforms falter, the musicians won’t let us down.

I found so many creative recordings along the way—many in hidden and unexpected places—and I’ve been looking forward to sharing this body of work with you.

This is the second (and final) installment of my top 100 recordings of 2024—all genres, all styles, all geographies. (For part one of this survey, click here.)

I’ve included a link to audio or video for each album. I encourage you to ramble and sample.

Happy listening!

P.S. My best-of-year lists for previous years (going back to 2011) are available to premium subscribers in The Vault, which also includes more than 400 essays from my archive.


The 100 Best Recordings of 2024 (Part 2 of 2)

In alphabetical order.

Slobodan Mandić
U Prozoru
Dark Acoustic Guitar Blues instrumentals from Serbia

Matteo Mela & Lorenzo Micheli
Scarlatti: 12 Sonatas
Scarlatti Sonatas Performed on Two Guitars

Amina Mezaache and Maracuja
Vortex
Brazilian Flute for Dancing

Mildlife
Chorus
Australian Psychedelic Funk (with a Tinge of Steely Dan)

Read more

Manufacturing is a war now

Elon Musk recently reposted a video showing a montage of drone swarms in China, declaring that the age of manned fighter jets was over:

I’m not sure if Musk is right about the F-35 and other manned fighters — drones and fighters play different roles on the battlefield, and may coexist in the future (for an argument that the F-35 itself has been overly maligned, watch this fun video). But in any case, Musk’s larger point that drones will dominate the battlefield of the future should now be utterly uncontroversial.

Drones have already become the essential infantry weapon, capable of taking out soldiers and tanks alike, as well as the key spotter for artillery fire and the standard method of battlefield reconnaissance. Electronic warfare — using EM signals to jam drones’ communication with their pilots and GPS satellites — is providing some protection against drones for now, but once AI improves to the point where drones are able to navigate on their own, even that defense will be mostly ineffectual. This doesn’t mean drones will be the only weapon of war, but it will be impossible to fight and win a modern war without huge numbers of drones.

And who makes FPV drones, of the type depicted in Musk’s video? China. Although the U.S. still leads in the production of military drones, China’s DJI and other manufacturers dominate the much larger market for commercial drones:

Source: DroneDJ

And one absolutely essential component of an FPV drone is a battery. In fact, improvements in batteries — along with better magnets for motors and various kinds of computer chips for sensing and control — are what enabled the drone revolution in the first place. And who makes the batteries? That would also be China:

Source: BNEF

So now I want you to imagine what happens if the U.S. and its allies get into a major war with China — as analysts say is increasingly possible. In the first few weeks, much of the two countries’ stores of munitions — including drones and the batteries that power drones — will be used up. After that, as in Ukraine, it will come down to who can produce more munitions and get them to the battlefield in time.1

At that point, what will the U.S. do if neither we nor our allies can make munitions in large numbers? We will have to choose to either 1) escalate to nuclear war, or 2) lose the war to China. Those will be our only options. Either way, the U.S. and its allies will lose.

Now realize that the U.S. and its allies aren’t just falling behind China in drone and battery manufacturing — we’re falling behind in all kinds of manufacturing. The chart at the top of this post comes from a 2024 report by UNIDO, the United Nations Industrial Development Organization. Here, let me repost it so we can take a look:

Source: UNIDO

In the year 2000, the United States and its allies in Asia, Europe, and Latin America accounted for the overwhelming majority of global industrial production, with China at just 6% even after two decades of rapid growth. Just thirty years later, UNIDO projects that China will account for 45% of all global manufacturing, singlehandedly matching or outmatching the U.S. and all of its allies. This is a level of manufacturing dominance by a single country seen only twice before in world history — by the UK at the start of the Industrial Revolution, and by the U.S. just after World War 2. It means that in an extended war of production, there is no guarantee that the entire world united could defeat China alone.

That is a very dangerous and unstable situation. If it comes to pass, it will mean that China is basically free to start any conventional conflict it wants, without worrying that it will be ganged up on — because there will be no possible gang big enough to beat it. The only thing they’ll have to fear is nuclear weapons.

And of course other nations will know this in advance, so in any conflict that’s not absolutely existential, most of them will probably make the rational choice to give China whatever it wants without fighting.2 China wants to conquer Taiwan and claim the entire South China Sea? Fine, go ahead. China wants to take Arunachal Pradesh from India and Okinawa from Japan? All yours, sir. China wants to make Japan and Europe sign “unequal treaties” as revenge for the ones China was made to sign in the 19th century? Absolutely. China wants preferential access to the world’s minerals, fossil fuels, and food supplies? Go ahead. And so on.

China’s leaders know this very well, of course, which is why they are unleashing a massive and unprecedented amount of industrial policy spending — in the form of cheap bank loans, tax credits, and direct subsidies — to raise production in militarily useful manufacturing industries like autos, batteries, electronics, chemicals, ships, aircraft, drones, and foundational semiconductors. This doesn’t just raise Chinese production — it also creates a flood of overcapacity that spills out into global markets and forces American, European, Japanese, Korean, and Taiwanese companies out of the market.

By creating overcapacity, China is forcibly deindustrializing every single one of its geopolitical rivals. Yes, this reduces profit for Chinese companies, but profit is not the goal of war.

America’s most economically important allies — Germany and Japan — are bearing the brunt of China’s most recent industrial assault. In the 2000s and 2010s, Germany’s manufacturing exports boomed, as it sold China high-tech machinery and components. China has now copied, stolen, or reinvented much of Germany’s technology, and is now squeezing out the German suppliers:

Source: Brad Setser

This is one reason — though not the only reason — why German industrial production has been collapsing since 2017:

Meanwhile, China has already taken away much of the electronics industry from Japan, and now a flood of cheap Chinese car exports is demolishing the vaunted Japanese auto industry in world markets:

Source: Bloomberg

The democratic countries have all struggled to respond to China’s industrial assault, because as capitalist countries, they naturally think about manufacturing mainly in terms of economic efficiency and profits unless a major war is actively in progress.

Democratic countries’ economies are mainly set up as free market economies with redistribution, because this is what maximizes living standards in peacetime. In a free market economy, if a foreign country wants to sell you cheap cars, you let them do it, and you allocate your own productive resources to something more profitable instead. If China is willing to sell you brand-new electric vehicles for $10,000, why should you turn them down? Just make B2B SaaS and advertising platforms and chat apps, sell them for a high profit margin, and drive a Chinese car.

Except then a war comes, and suddenly you find that B2B SaaS and advertising platforms and chat apps aren’t very useful for defending your freedoms. Oops! The right time to worry about manufacturing would have been years before the war, except you weren’t able to anticipate and prepare for the future. Manufacturing doesn’t just support war — in a very real way, it’s a war in and of itself.

Democratic countries seem to still mostly be in “peace mode” with respect to their economic models. They don’t yet see manufacturing as something that needs to be preserved and expanded in peacetime in order to be ready for the increasing likelihood of a major war. Fortunately, both Republicans and Democrats in America have inched away from this deadly complacency in recent years. But both the tariffs embraced by the GOP and the industrial policies pioneered by the Dems are only partial solutions, lacking key pieces of a military-industrial strategy.

Neither Republicans nor Democrats have a complete strategy for winning the manufacturing war

A military-industrial strategy for the U.S. and its allies to match China will need to involve three elements:

  1. Tariffs and other trade barriers against China, in order to prevent sudden floods of Chinese exports from forcibly deindustrializing other countries.

  2. Industrial policy, to maintain and extend manufacturing capacity in democratic nations.

  3. A large common market outside of China, so that non-Chinese manufacturers can gain economies of scale.

The GOP’s tariffs-first approach achieves the first of these, but actively sabotages the third by putting tariffs on allies. The Democrats’ industrial-policy-focused approach achieves the second, but hamstrings much of its own effort with regulation and contracting requirements.

First, let’s talk about the GOP, since Trump is about to come back into office. In his first term, Trump moved the U.S. away from the free trade consensus and from the model of “engagement” with China. He pioneered the use of both tariffs and export controls as economic weapons. In his second term, he’s almost certain to double down on tariffs.

This will help protect the remaining pieces of U.S. industry from being suddenly annihilated by a wave of subsidized Chinese imports — as happened to the U.S. solar panel industry in the 2010s. But Trump is making a number of mistakes that will severely limit the effectiveness of his tariffs.

First, he’s threatening broad tariffs on most or all Chinese goods, instead of tariffs targeted at specific, militarily useful goods. In a post two weeks ago, I explained why broad tariffs are of limited effectiveness:

Broad tariffs cause bigger exchange rate movements, which cancel out more of the effect of the tariffs. Putting tariffs on Chinese-made TVs, clothing, furniture, and laptops weakens the effect of tariffs on Chinese-made cars, chips, machinery, and batteries.

Second, Trump is threatening to put tariffs on U.S. allies like Canada and Mexico. This will deprive American manufacturers of the cheap parts and components they need to build things cheaply, thus making them less competitive against their Chinese rivals. It will also provoke retaliation from allies, limiting the markets available to American manufacturers.

As for industrial policy, Trump doesn’t seem to see the value in it. He has threatened to cancel the CHIPS Act, as well as the Inflation Reduction Act that subsidizes battery manufacturing. But tariffs cannot simply make chip and battery factories sprout from American soil like mushrooms after the rain. Tariffs protect the domestic market but do absolutely nothing to help American manufacturers in the far larger global market; only industrial policy can do that.

Democrats do support industrial policy. And in fact, Biden’s industrial policies have been one of the few small successes that any democratic nation has had in the struggle to keep up with China’s manufacturing juggernaut. A bonanza of factory construction is now taking place in the U.S.:

The construction is heavily concentrated in the industries Biden subsidized, even though almost all of the actual money being spent is private.

This is great, but the effort has been slowed by progressive policy priorities. Stubborn progressive defenses of NEPA and the American permitting regime have prevented major reform of that formidable stumbling block, while various onerous contracting requirements — the dreaded “everything bagel” — have held up construction timelines.

Even more fundamentally, progressives tend to see the point of industrial policy as providing jobs for factory workers, rather than in terms of national defense. This tends to make them complacent about delays and cost overruns, since these end up providing more jobs even as they prevent anything physical from actually getting built:

This is also why some progressives oppose automation in the manufacturing sector, on the grounds that it kills jobs. China, meanwhile, is racing ahead with automation, having recently zoomed ahead of both Japan and Germany in terms of the number of robots per worker, and leaving America in the dust:

Source: IFR

Meanwhile, although Democrats may become negatively polarized into opposing all tariffs (throwing the baby out with the bathwater), they still oppose measures like the TPP aimed at creating a common market capable of balancing China’s internal market.

In other words, neither political party in America has yet grasped the nature or the magnitude of the challenge posed by China’s manufacturing might, or the nature of the steps needed to respond. Trump is still dreaming the same simple protectionist dreams he thought of back in the 1990s, while his progressive opponents think of reindustrialization as a giant make-work program. Meanwhile, America’s allies overseas seem even less capable of averting their decline.

The manufacturing war is being lost, and we urgently need to turn things around.


Subscribe now

Share

1

Of course those munitions will have to be roughly equal in quality, but it’s pretty obvious that China is now technologically on par with other major nations in almost every area.

SpaceX launches 350th mission using a flight-proven Falcon 9 rocket booster during Starlink mission from California

File: A Falcon 9 rocket stands ready to launch a Starlink mission. Image: SpaceX

SpaceX notched another milestone in spaceflight reusability Wednesday night when it not only launched a flight-proven Falcon 9 rocket booster for the 350th time in program history, but also performed its 300th successful booster landing.

The Starlink 9-14 mission lifted off from Space Launch Complex 4 East (SLC-4E) at Vandenberg Space Force Base at 7:05 p.m. PST (10:05 p.m. EST, 0305 UTC). However, in announcing the mission on its website, SpaceX said only, “The four-hour launch window opens at 4:06 p.m. PT.”

For the third time in as many launches from California, SpaceX left the public in the dark as to whether the launch would be viewable via livestream. When it published details of the launch on its website Wednesday afternoon, it didn’t include a link to a webcast, nor did it mention the mission on social media.

By contrast, SpaceX simultaneously published a launch page for the planned launch of SiriusXM’s SXM-9 satellite, which will lift off on a Falcon 9 rocket from NASA’s Kennedy Space Center late Thursday morning. Not only did SpaceX include a link to the livestream for the SXM-9 mission, but it also posted to its X account announcing the launch. The Starlink 6-70 mission which launched from Cape Canaveral earlier Wednesday also had a webcast of liftoff that was announced in advance.

SpaceX did end up live streaming the two previous ascents from Vandenberg Space Force Base, NROL-126 and Starlink 9-13. However, in both cases, the live stream popped up well after the rockets had left the launch pad.

That ended up being the case as well with the Starlink 9-14 mission. SpaceX popped up its livestream about 44 seconds after the rocket left the launch pad. No explanation was given as to why SpaceX started its broadcast midstream for a third time.

SpaceX launches its Falcon 9 first stage booster, tail number B1081, on the Starlink 9-10 mission from Vandenberg Space Force Base on Nov. 9, 2024. This mission marked the 37th launch from the West Coast for SpaceX in 2024. Image: SpaceX

The Falcon 9 first stage booster for the Starlink 9-14 mission, with the tail number B1081, launched for a 12th time. It previously supported the launches of two missions to the International Space Station (Crew-7 and CRS-29), two climate-monitoring spacecraft (PACE and EarthCARE) and five previous Starlink missions.

A little more than eight minutes after liftoff, B1081 completed the 300th successful droneship landing when it touched down on the SpaceX droneship ‘Of Course I Still Love You,’ positioned in the Pacific Ocean. This was the 379th overall booster landing for SpaceX.

Onboard the mission are 20 Starlink V2 Mini satellites, including 13 that feature Direct to Cell capabilities. With this mission, SpaceX will have launched 349 DTC Starlink satellites since the first such launch on January 2.

In late November, SpaceX received approval from the U.S. Federal Communications Commission to begin rolling out cellular service alongside its domestic telecom partner, T-Mobile.

The FCC allowed SpaceX to operate up to 7,500 of its previously authorized second-generation Starlink satellites, using the V-band frequency, at altitudes from 340 km to 360 km.

“SpaceX is authorized to communicate with these satellites in the previously authorized Ku-, Ka-, E-, and V-band frequencies, in conformance with the technical specifications SpaceX has provided to the Commission, the conditions previously placed on its authorizations, and the conditions we adopt today,” the FCC wrote in a Nov. 26 filing.

“Authorization to permit SpaceX to operate up to 7,500 Gen2 satellites in lower altitude shells will enable SpaceX to begin providing lower-latency satellite service to support growing demand in rural and remote areas that lack terrestrial wireless service options. This partial grant also strikes the right balance between allowing SpaceX’s operations at lower altitudes to provide low-latency satellite service and permitting the Commission to continue to monitor SpaceX’s constellation and evaluate issues previously raised on the record.”

Links 12/4/24

Links for you. Science:

Teeny tardigrades can survive space and lethal radiation. Scientists may finally know how
Salamanders are surprisingly abundant in US northeastern forests, research finds
Untreated sewage and fertilizer runoff threaten the Florida manatee’s main food source, contributing to malnutrition
Long COVID facts and findings: a large-scale online survey in 74,075 Chinese participants
United States’ first known case of more severe strain of mpox confirmed in California
CDC data show sharp rise in rates of meningococcal disease

Other:

Harris’s advisers blame everything but themselves for their loss
History will judge Biden harshly on Gaza
A Political Technocrat Makes His Pitch for Saving the Democratic Party
A dangerous and unqualified choice for the FBI
Imagine actually opposing fascism
Act 10 Overturned
Chinese entrepreneur invested $30M in Trump’s crypto project after election
Godot Isn’t Making it
Report from inside the ‘deep state’: We’re not going anywhere. Career Justice Department lawyers say they intend to outlast second Trump administration.
We’re Entering a World Where the Rule of Law Is Turned Inside Out
How a D.C. ‘Slumlord’ Scammed Tenants and Lenders to Build a Portfolio of Neglected Properties, According to Lawsuits and Tenant Accounts
Pete Hegseth’s mother begged him to “get some help” — he joined a misogynist church instead
6 hours under martial law in Seoul
RFK Jr. was paid six figures by his vaccine-challenging group before presidential run
South Koreans just squashed a fascist coup in 6 hours. We could learn a thing or two about how to resist fascism from the people who did it efficiently in the middle of the night.
UnitedHealthcare CEO fatally shot in Midtown: sources (“The shooting is being investigated as a possibly targeted hit, sources said.”)
UnitedHealthcare CEO Brian Thompson killed by masked gunman outside Midtown Hilton hotel (seems the shooter was masked and might have been using a silencer)
It was always rough to be a woman on Twitter. It’s even worse on X. Just ask the one who posted about her PhD.
Detroit Mayor Duggan, a longtime Democrat, will run for Michigan governor in 2026 as independent
From W. To Donald
Jaguar Type 00 Concept Is A Pink Panther Acid Trip Of An Electric Coupe With A Stone And Brass Interior. Two examples of the Type 00 shooting brake were unveiled at Miami Art Week, one pink and one blue, and I freakin’ love them
Why it’s about to get more expensive to park along DC’s popular U Street corridor (no more single price for on-street parking)
Andy Grove Was Right
Republicans are going to wind up regretting Trump’s deportation scheme
Trump’s DEA pick Chad Chronister withdraws from consideration: Florida sheriff cites ‘gravity’ of responsibility days after Trump taps him to lead Drug Enforcement Administration (lol. It’s that he would be crucified by Republicans for enforcing mask wearing during the COVID pandemic and also that he doesn’t think it’s DEA’s job to enforce immigration laws)

On Social Security Cuts, the Democratic Answer Must Be “Nope”

Maybe even hell nope. Fascists wanting to cut the limited safety net? Inconceivable!

Rep. Richard McCormick (R-GA) told Fox Business that Republicans will desperately need help from Democrats if they intend to cut Social Security and Medicare — both of which are programs Democrats have historically supported.

President-elect Donald Trump has said he intends to extend, and possibly expand, his tax cuts to the wealthy and corporations. The cost of the cuts has been in the trillions, and expanding them would cost the country a little under $5 trillion, experts have calculated.

And, with expensive programs being pursued, such as Trump’s promised mass deportations, more money will need to be found from somewhere…

“We’re gonna have to have some hard decisions,” claimed McCormick during his Fox appearance Tuesday. “We’re gonna have to bring in the Democrats to talk about Social Security, Medicaid, Medicare.

“There’s hundreds of billions of dollars to be saved, we just have to have the stomach to take those challenges on.”

Trump promised during the 2024 campaign that he would not make any cuts to Social Security or Medicare, but CNN reported in October that some of his proposals would ultimately slash Social Security in the next six years.

The only answer from Democrats should be “Nope.” And with Manic Pixie Dream Senator Sinema and Co-Chair of All Senate Committees Joe Fucking Manchin gone, hopefully even Congressional Democrats won’t fuck this up. If Republicans want cuts, let Republicans own them. Force Republicans to eat this shit sandwich of their own making.

It’s time for Democrats to act like an opposition party, their inept consultants–the ones who won’t even own up to losing the election–notwithstanding.

Andy Grove in 2000: ‘What I’ve Learned’

A few nuggets of wisdom from Andy Grove, in an interview with Esquire after he retired as Intel’s CEO, but still served as chairman:

Profits are the lifeblood of enterprise. Don’t let anyone tell you different.

You must understand your mistakes. Study the hell out of them. You’re not going to have the chance of making the same mistake again — you can’t step into the river again at the same place and the same time — but you will have the chance of making a similar mistake.

Status is a very dangerous thing. I’ve met too many people who make it a point of pride that they never take money out of a cash machine, people who are too good to have their own e-mail address, because that’s for everybody else but not them. It’s hard to fight the temptation to set yourself apart from the rest of the world.

Grove, still serving as CEO during Intel’s zenith in 1997, didn’t even have an office. He worked out of an 8x9-foot cubicle.

What you’re seeing today is a very, very rapid evolution of an industry where the milieu is better understood by people who grew up in the same time frame as the industry. A lot of the years that many of us have spent in business before this time are of only limited relevance.

This industry is not like any other. Computers don’t get incrementally more powerful; they get exponentially more powerful.

 ★ 

[Sponsor] 1Password: You Want to Charge How Much for SSO?

Imagine if you went to the movies and they charged $8000 for popcorn.

Or, imagine you got on a plane and they told you that seatbelts were only available in first class.

Your sense of outraged injustice would probably be something like what IT and security professionals feel when a software vendor hits them with the dreaded SSO tax.

The SSO tax is the name given to the practice of charging an outrageous premium for Single Sign-On, often by making it part of a product’s “enterprise tier.” The jump in price can be astonishing — one CRM charges over 5000% more for the tier with SSO. At those prices, only very large companies can afford to pay for SSO. But the problem is that companies of all sizes need it.

In a world where compromised credentials are the number one culprit in breaches, SSO reduces the number of weak, reused passwords flying around. It’s also critical to onboarding and offboarding, since IT only has to manage a single on/off switch, instead of managing access separately for every application.

To be fair, there’s nothing wrong with charging some extra for SSO — it’s not free for vendors to build or maintain — but putting it out of the reach of so many companies is irresponsible, and makes us all less safe.

Still, until outraged customers can shame vendors into getting rid of the tax, many businesses have to figure out how to live without SSO. For them, the best route is likely to be a password manager, which also reduces weak and re-used credentials, and enables secure sharing across teams. And a password manager is likely a good investment anyway, for the apps that aren’t integrated with SSO.

To learn more about the past, present, and future of the SSO tax, read the full blog post.

 ★ 

★ Andy Grove Was Right

The Verge’s Sean Hollister penned an excellent high-level summary of Pat Gelsinger’s ignominious ouster from Intel, under the headline “What Happened to Intel?” A wee bit of pussyfooting here, though, caught my eye:

Just how bad was it before Gelsinger took the top job?

Not great! There were bad bets, multiple generations of delayed chips, quality assurance issues, and then Apple decided to abandon Intel in favor of its homegrown Arm-based chips — which turned out to be good, seriously showing up Intel in the laptop performance and battery life realms. We wrote all about it in “The summer Intel fell behind.”

Intel had earlier misses, too: the company long regretted its decision not to put Intel inside the iPhone, and it failed to execute on phone chips for Android handsets as well. It arguably missed the boat on the entire mobile revolution.

There’s no argument about it. Intel completely missed mobile. iPhones never used Intel chips and Apple Silicon chips are all fabbed by TSMC. Apple’s chips are the best in the industry, also without argument, and the only mobile chips that can be seen as reasonable competition are from Qualcomm (and maybe Samsung). Intel has never been a player in that game, and it’s a game Intel needed not only to be a player in, but to dominate.

It’s not just that smartphones are now a bigger industry than the PC industry ever was, and that Intel has missed out on becoming a dominant supplier to phone makers. That’s bad, but it’s not the worst of it. It’s that those ARM-based mobile chips — Apple Silicon and Qualcomm’s Snapdragon lineup — got so good that they’re now taking over large swaths of the high end of the PC market. Partly from an obsessive focus on performance-per-watt efficiency, partly from the inherent advantages of ARM’s architecture, partly from engineering talent and strategy, and partly from the profound benefits of economies of scale as the mobile market exploded. Apple, as we all know, moved the entire Mac platform from Intel chips to Apple Silicon starting in 2020. The Mac “only” has 15 percent of the worldwide PC market, but the entirety of the Mac’s market share is at the premium end of the market. Losing the Mac was a huge loss for Intel. And now Qualcomm and Microsoft are pushing Windows laptops to ARM chips too, for the same reasons: not just performance-per-watt, but sheer performance. x86 CPUs are still dominant on gaming PCs, but even there, AMD is considered the cream of the crop.

Of all companies, Intel should have seen the potential for this to happen. Intel did not take “phone chips” seriously, but within a decade, those ostensibly toy “phone chips” were the best CPUs in the world for premium PC laptops, and their efficiency advantages make them advantageous in data centers too. And Apple has shown that they’re even superior for workstation-class desktops. That’s exactly how Intel became Intel back at the outset of the personal computing revolution. PCs were seen as mere toys by the “real” computer makers of the 1970s and early 1980s. IBM was caught so flatfooted that when they saw the need to enter the PC market, they went to Intel for the chips and Microsoft for DOS — decisions that both Intel and Microsoft capitalized upon, resulting in a tag-team hardware/software dominance of the entire computing industry that lasted a full quarter century, while IBM was left sidelined as just another maker of PCs. From Intel’s perspective, the x86 platform went from being a “toy” to being the dominant architecture for everything from cheap laptops all the way up to data-center-class servers.

ARM-based “phone chips” did the same thing to x86 that Intel’s x86 “PC chips” had done, decades earlier, to mainframes. Likewise, Nvidia turned “graphics cards for video game enthusiasts” — also once considered mere toys — into what is now, depending on stock market fluctuations, the most valuable company in the world. They’re neck and neck with the other company that pantsed Intel for silicon design leadership: Apple. Creating “the world’s best chips” remains an incredible, almost unfathomably profitable place to be as a business. Apple and Nvidia can both say that about the very different segments of the market in which their chips dominate. Intel can’t say that today about any of the segments for which it produces chips. TSMC, the company that fabs all chips for Apple Silicon and most of Nvidia’s leading chips, is 9th on the list of companies ranked by market cap, with a spot in the top 10 that Intel used to occupy. Today, Intel is 180th — and on a trajectory to fall out of the top 200.

Intel never should have been blithe to the threat. The company’s longtime CEO and chairman (and employee #3) Andy Grove titled his 1996 book Only the Paranoid Survive. The full passage from which he drew the title:

Business success contains the seeds of its own destruction. Success breeds complacency. Complacency breeds failure. Only the paranoid survive.

Grove retired as CEO in 1998 and as chairman in 2005. It’s as though no one at Intel after him listened to a word he said. Grove’s words don’t read merely as advice — they read today as a postmortem synopsis for Intel’s own precipitous decline over the last 20 years.

Wired for the GOP

This column, by Alexander Burns, the head of news at Politico, is a rich example of the DC logic that only Democrats have agency and it’s only to Democrats that standards, norms, rules or whatever else apply. “Joe Biden’s Parting Insult: The president delivered a vote of no confidence in a justice system preparing for siege.”

It is a rich gift to those who want to blow up the justice system as we know it, and who claim the government is a self-dealing club for hypocritical elites. It is a promise-breaking act that subjects Biden’s allies to yet another humiliation in a year packed with Biden-inflicted injuries.

Republicans are like the weather. Destructive and unpredictable, perhaps capricious and sometimes dangerous. But who shouts at the rain? Those are the deeply carved grooves into which our elite media narratives all turn. How else do you explain the vastly bigger press uproar over Biden’s pardon than over a notorious charlatan — one who’s promised to abuse his power at every opportunity — being on a fast track to take over federal law enforcement?

Before getting to my main point, let me address a subsidiary one. I get people who don’t think Biden should have done this. I disagree. But that’s a respectable opinion. In the abstract, at least, it’s not best practices … until you look at the details. What I don’t get, what I think is as close as you get to being objectively wrong, is a different but seemingly common criticism. It’s the people who say that it’s one thing that Biden pardoned his son for his two ongoing criminal cases, but that it was a step too far to issue a blanket pardon going back a decade covering more or less anything that happened during that period.

As I said, if you don’t think Biden should have issued this pardon, fine. But if he did, he certainly had to and was unquestionably right to issue it as a blanket pardon. The Trump administration absolutely would have found something else to charge Hunter Biden with as payback for the pardon. They may have done so without Biden issuing the pardon at all. Trump and his top lieutenants, including his FBI director nominee, have all said they come into office wanting payback. It would be insane to issue a limited, specific pardon and then leave his son at the mercy of the Trump DOJ. This logic is so obvious it all but amounts to a mathematical proof.

Kash Patel himself repeatedly promised to launch new investigations and bring additional charges against Hunter Biden if Trump won the election. The two stories connect. They’re one story.

This brings me back to a more general point. Democrats need to organize their future politics around the simple reality that the establishment media is structurally hostile to the Democratic Party. This doesn’t mean every journalist individually, of course. But the establishment media generally — the Times, the Post, the Journal, CNN and the business news channels, all of them, and for the reasons we’ve discussed countless times. Democrats should do this not simply because it’s true but because it relieves them of the embarrassment of imagining otherwise, of speaking up for or passively allowing themselves to be identified with institutions in an age of distrust and anti-institutionalism. The status quo, the paradoxical identification, puts Democrats in the position of a jilted spouse, perpetually discomfited, let down. It’s not only damaging directly. It signals an enervating weakness. It’s time to move on. Have some self-respect. And act accordingly.

Post-Thanksgiving Podcast Scheduling Update

This week’s episode of The Josh Marshall Podcast will be out Thursday instead of our usual Wednesday. Bear with us as we get back to our regular schedule post-turkey day! In the meantime, the latest video episode of the show is live on our YouTube page.

U.S. Officials Urge Americans to Use Encrypted Apps, for Texting and Calls, in Wake of Chinese Infiltration of Our Unencrypted Telecom Network

Kevin Collier, reporting for NBC News:

Amid an unprecedented cyberattack on telecommunications companies such as AT&T and Verizon, U.S. officials have recommended that Americans use encrypted messaging apps to ensure their communications stay hidden from foreign hackers.

The hacking campaign, nicknamed Salt Typhoon by Microsoft, is one of the largest intelligence compromises in U.S. history, and it has not yet been fully remediated. Officials on a news call Tuesday refused to set a timetable for declaring the country’s telecommunications systems free of interlopers. Officials had told NBC News that China hacked AT&T, Verizon and Lumen Technologies to spy on customers.

A spokesperson for the Chinese Embassy in Washington did not immediately respond to a request for comment.

Don’t hold your breath.

In the call Tuesday, two officials — a senior FBI official who asked not to be named and Jeff Greene, executive assistant director for cybersecurity at the Cybersecurity and Infrastructure Security Agency — both recommended using encrypted messaging apps to Americans who want to minimize the chances of China’s intercepting their communications.

“Our suggestion, what we have told folks internally, is not new here: Encryption is your friend, whether it’s on text messaging or if you have the capacity to use encrypted voice communication. Even if the adversary is able to intercept the data, if it is encrypted, it will make it impossible,” Greene said.

It seems kind of new for the FBI to call encryption “our friend”, but now that I think about it, their beef over the years has primarily been about gaining access to locked devices, not eavesdropping on communication protocols. Their advocacy stance on device encryption has not changed — they still want a “back door for good guys” there. Their thinking, I think, is that E2EE communications are a good thing because they protect against remote eavesdropping from foreign adversaries — exactly like this campaign waged by China. The FBI doesn’t need to intercept communications over the wire. When the FBI wants to see someone’s communications, they get a warrant to seize their devices. That’s why the FBI wants device back doors, but are now encouraging the use of protocols that are truly E2EE. But that’s not to say that law enforcement agencies worldwide don’t still fantasize about mandatory “back doors for good guys”.

Here’s a clunker of a paragraph from this NBC News story, though:

Privacy advocates have long advocated using end-to-end encrypted apps. Signal and WhatsApp automatically implement end-to-end encryption in both calls and messages. Google Messages and iMessage also can encrypt calls and texts end to end.

It’s true that both voice and text communications over Signal and WhatsApp are always secured with end-to-end encryption. But Google Messages is an Android app that only handles text messaging via SMS and RCS, not voice. There’s a “Call” button in Google Messages but that just dials the contact using the Phone app — just a plain old-fashioned unencrypted phone call. (There’s a Video Call button in Google Messages, but that button tries to launch Google Meet.) Some text chats in Google Messages are encrypted, but only those using RCS in which all participants are using a recent version of Google Messages. Google Messages does provide visual indicators of the encryption status of a chat. The RCS standard has no encryption; E2EE RCS chats in Google Messages use Google’s proprietary extension and are exclusive to the Google Messages app, so RCS chats between Google Messages and other apps, most conspicuously Apple Messages, are not encrypted.

iMessage is not an app. It is Apple’s proprietary protocol, available within its Messages app. The entire iMessage protocol was built upon end-to-end encryption — all iMessage messages have been E2EE from the start. Apple also offers FaceTime for voice and video calls, and FaceTime calls are always secured by E2EE.

 ★ 

Trump nominates Jared Isaacman to serve as next NASA administrator

Jared Isaacman, a billionaire entrepreneur with two commercial spaceflights to his credit and strong ties to Elon Musk and SpaceX, has been nominated by President-elect Donald Trump to serve as NASA’s next administrator. Image: John Kraus/Inspiration4

Billionaire entrepreneur Jared Isaacman, a veteran private astronaut with strong ties to Elon Musk and his rocket company SpaceX, has been nominated by the incoming Trump administration to serve as NASA’s next administrator, the president-elect said in a statement Wednesday.

If confirmed, Isaacman, 41, would be the fifth NASA administrator with spaceflight experience, replacing former Democratic Sen. Bill Nelson, who flew into orbit aboard the space shuttle Columbia in early 1986.

“I am delighted to nominate Jared Isaacman, an accomplished business leader, philanthropist, pilot and astronaut, as administrator of the National Aeronautics and Space Administration,” Trump said in a statement on his social media platform, Truth Social.

“Jared’s passion for space, astronaut experience and dedication to pushing the boundaries of exploration, unlocking the mysteries of the universe and advancing the new space economy, make him ideally suited to lead NASA into a bold new era.”

Isaacman founded a payment processing company, later named Shift4 Payments, while a high school student. He chartered the first purely commercial, all-civilian American “space tourist” mission — Inspiration4 — in September 2021, paying SpaceX an undisclosed amount to launch him and three other civilians on a flight lasting two days and 23 hours.

He flew to space again this past September, commanding the first of three planned SpaceX “Polaris” missions, logging nearly five days in space on a flight that took the crew farther from Earth than any astronauts since the Apollo moon program. Isaacman also became the first private citizen to carry out a spacewalk.

He is scheduled to lead another Polaris mission aboard a SpaceX capsule before leading the first crew to space aboard the California rocket builder’s gargantuan Super Heavy-Starship rocket.

Dates have not been announced for either of those missions and it’s not yet clear what impact the nomination to lead NASA will have on those flights, whether Isaacman still intends to fly aboard one or both or what sort of influence his friendship with Musk might have on NASA’s future direction.

Jared Isaacman and his wife, Monica, pose in front of the entrepreneur’s MiG-29 fighter jet. Image: John Kraus/Polaris Program

But Isaacman, a skilled pilot who flies his own MiG-29 fighter jet, made it clear in a statement following Trump’s announcement that NASA can expect him to be a vocal space advocate who will help “usher in an era where humanity becomes a true spacefaring civilization.”

“With the support of President Trump, I can promise you this: We will never again lose our ability to journey to the stars and never settle for second place,” Isaacman said. “We will inspire children, yours and mine, to look up and dream of what is possible. Americans will walk on the moon and Mars and in doing so, we will make life better here on Earth.”

He said it would be the “honor of a lifetime to serve in this role and to work alongside NASA’s extraordinary team to realize our shared dreams of exploration and discovery.”

Jim Bridenstine, Nelson’s predecessor at the helm of NASA, helped kickstart the agency’s Artemis moon program during the first Trump administration.

He said in a statement Wednesday that “Jared’s vision for pushing boundaries, paired with his proven track record of success in private industry, positions him as an ideal candidate to lead NASA into a bold new era of exploration and discovery. I urge the Senate to swiftly confirm him.”

The Isaacman nomination comes as NASA struggles to keep the Artemis program on track amid tight budgets and what the agency’s own inspector general calls the “unsustainable” costs of NASA’s Space Launch System — SLS — moon rocket.

SpaceX supporters argue the company’s more powerful, fully reusable Super Heavy Starship rocket is the obvious choice for deep space exploration, but the huge rocket is far from operational with just a half-dozen sub-orbital test flights to its credit.

Ship 31 and Super Heavy Booster 13 climb away from Starbase on the sixth test flight of SpaceX’s Starship vehicle. Image: Chuck Briggs/Spaceflight Now.

The SLS, on the other hand, is considered operational in that it completed an initial test flight in November 2022, sending an unpiloted Orion capsule around the moon and back. But unlike the Super Heavy-Starship, the SLS is a throw-away, expendable booster expected to cost more than $2 billion each through the first several flights.

NASA is currently gearing up to launch three NASA astronauts and a Canadian flier on an SLS-Orion mission — Artemis 2 — late next year. But problems with the first Orion’s heat shield and other issues threaten to push the Artemis 2 flight into 2026, years past initial expectations.

It’s not yet clear when the Artemis 3 mission, the first to carry astronauts to a landing near the moon’s south pole, might be feasible.

But that flight will feature a lunar lander built by SpaceX, a variant of the company’s Starship upper stage. SpaceX has carried out six test flights of the fully reusable Super Heavy-Starship to date, but has not yet put the Starship upper stage into orbit or brought it down for an intact landing.

Given the company’s rapid-fire test schedule, most observers believe SpaceX will get the Super Heavy-Starship working as planned in the near future, but scores of test flights will be required to demonstrate the safety and reliability required by NASA to put astronauts aboard.

And the moon mission poses unique challenges.

For the initial landing mission, multiple Super Heavy “tanker” flights will be required to refuel the moon lander in low-Earth orbit before it can be sent to the moon. Once in lunar orbit, it will await the arrival of the Artemis 3 crew, launched aboard an Orion capsule by an SLS rocket.

After boarding the Starship lander, two astronauts would descend to the surface, carry out the mission’s planned exploration, and then head back up to the orbiting Orion for the trip back to Earth.

China markets in everything

This is very interesting, and I think a world first: a local government in China has just sold its sky, literally. The government of Pingyin County, Jinan, Shandong Province sold a 30-year concession to operate and maintain its low-altitude economic projects to a company called Shandong Jinyu General Aviation Co., Ltd. for 924 million yuan (approximately $130 million). The “low-altitude economy” is a big trend in China at the moment. XPeng, one of China’s leading EV manufacturers, recently released a low-altitude flying car, for instance. Drone deliveries are becoming increasingly common in Chinese cities, and various regions are actively developing low-altitude transportation networks. Shanghai, for instance, plans to establish 400 low-altitude flight routes by 2027. But this is the first time a local government has monetized its low-altitude airspace…

Here is more from Arnaud Bertrand.  Via Jesper.

The post China markets in everything appeared first on Marginal REVOLUTION.


Heavy Truck Sales Increased 4% YoY in November

This graph shows heavy truck sales since 1967 using data from the BEA. The dashed line is the November 2024 seasonally adjusted annual sales rate (SAAR) of 507 thousand.

Heavy truck sales really collapsed during the Great Recession, falling to a low of 180 thousand SAAR in May 2009.  Then heavy truck sales increased to a new record high of 570 thousand SAAR in April 2019.


Note: "Heavy trucks - trucks more than 14,000 pounds gross vehicle weight."

Heavy truck sales declined sharply at the beginning of the pandemic, falling to a low of 288 thousand SAAR in May 2020.  

Heavy truck sales were at 507 thousand SAAR in November, up from a revised 463 thousand in October, and up 3.7% from 489 thousand SAAR in November 2023.  

Usually, heavy truck sales decline sharply prior to a recession.  Sales were solid in November, and sales for October were revised up significantly.

As I mentioned yesterday, light vehicle sales increased in November.

The second graph shows light vehicle sales since the BEA started keeping data in 1967.  Vehicle sales were at 16.50 million SAAR in November, up from 16.25 million in October, and up 6.7% from 15.46 million in November 2023.
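The year-over-year figures quoted above are straightforward to check. A minimal sketch, using only the numbers stated in the post (thousands of units SAAR for heavy trucks, millions SAAR for light vehicles):

```python
def yoy_pct(current, year_ago):
    """Percent change versus the same month a year earlier."""
    return (current - year_ago) / year_ago * 100

# Heavy trucks: Nov 2024 vs. Nov 2023, thousands SAAR
heavy_truck_yoy = yoy_pct(507, 489)

# Light vehicles: Nov 2024 vs. Nov 2023, millions SAAR
light_vehicle_yoy = yoy_pct(16.50, 15.46)

print(round(heavy_truck_yoy, 1))    # 3.7
print(round(light_vehicle_yoy, 1))  # 6.7
```

The heavy truck change rounds to 3.7% (hence the roughly 4% in the headline), and the light vehicle change to 6.7%, matching the text.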

Fed's Beige Book: "Economic activity rose slightly"

Fed's Beige Book
Economic activity rose slightly in most Districts. Three regions exhibited modest or moderate growth that offset flat or slightly declining activity in two others. Though growth in economic activity was generally small, expectations for growth rose moderately across most geographies and sectors. Business contacts expressed optimism that demand will rise in coming months. Consumer spending was generally stable. Many consumer-oriented businesses across Districts noted further increases in price sensitivity among consumers, as well as several reports of increased sensitivity to quality. Spending on home furnishings was down, which contacts attributed to limited household mobility. Demand for mortgages was low overall, though reports on recent changes in home loan demand were mixed due to volatility in rates. Commercial real estate lending was similarly subdued. Still, contacts generally reported financing remained available. Capital spending and purchases of raw materials were flat or declining in most Districts. Sales of farm equipment were a notable headwind to overall investment activity, and several contacts expressed concerns about the future prices of equipment given ongoing weakness in the farm economy. Energy activity in the oil and gas sector was flat but demand for electricity generation continued to grow at a robust rate. The rise in electricity demand was driven by rapid expansions in data centers and was reportedly planned to be met by investments in renewable generation capacity in coming years.

Labor Markets

Employment levels were flat or up only slightly across Districts. Hiring activity was subdued as worker turnover remained low and few firms reported increasing their headcount. The level of layoffs was also reportedly low. Contacts indicated they expected employment to remain steady or rise slightly over the next year, but many were cautious in their optimism about any pickup in hiring activity.
...
Prices

Prices rose only at a modest pace across Federal Reserve Districts. Both consumer-oriented and business-oriented contacts reported greater difficulty passing costs on to customers.
emphasis added

Wednesday assorted links

1. Acemoglu on liberalism.

2. These names are not amongst the nine billion names of God.

3. Russ Roberts podcast with the translator of Life and Fate.

4. The Baby Money Index.  Qatar is number three.

5. Lessons from working in an art gallery, excellent post recommended.

6. Korean martial law and trading opportunities.

The post Wednesday assorted links appeared first on Marginal REVOLUTION.


‘Quickly dismantle America’: Russian State TV Hosts ‘thrilled’ About These 2 Trump Picks

Excitement About the Destabilization of the United States

Russian state-owned broadcast channels have been bullish on President-elect Donald Trump’s Cabinet. But hosts on one channel are particularly enthusiastic about two appointees. And they’re specifically excited because they believe the Cabinet will quickly bring about the destabilization of the United States.

In a segment posted to YouTube by Russian Media Monitor (a channel created by Daily Beast columnist Julia Davis), Russia-1 anchor Vladimir Solovyov recently heaped praise on Kash Patel, whom Trump has nominated to be the next FBI director. Solovyov said that he “really really like[s]” most of Trump’s nominees, though he lamented that the Senate “will not let them in.” Davis noted that Solovyov and the rest of the panel were “thrilled” about the incoming administration given its Cabinet appointees.

“And the Lord destroyed Sodom and Gomorrah,” Solovyov said. “What an excellent team is coming along with Trump! Not with respect to Ukraine, but as far as everything else goes, if they are allowed to get in, they will quickly dismantle America, brick by brick.”

“Trump’s nominee to head the FBI, Kash Patel, is simply on fire,” Solovyov continued, before playing a clip of Patel describing how he would shut down the J. Edgar Hoover building in Washington, D.C., and turn it into a “museum of the Deep State” while scattering its 7,000 employees across the U.S.

“He’s a beaut! He is very, very good!” Solovyov added.

Another panelist — professor Andrey Sidorov, who is the Dean of the School of World Politics at Moscow State University — was complimentary of both Patel and Secretary of Defense-designate Pete Hegseth, saying that the latter was in the same vein as Patel. Sidorov said he was “fully in support” of Patel leading the FBI, and exclaimed that “another one like him will head the Defense Department.”

Aside from Patel and Hegseth, other Trump Cabinet picks have also received high marks from Russian state media hosts. Director of National Intelligence-designate Tulsi Gabbard has been praised for her friendliness to Russian President Vladimir Putin and Syrian President Bashar al-Assad. Keith Kellogg, whom Trump picked to be special envoy for Ukraine, reportedly got a “lukewarm reaction” from Moscow.

This article originally appeared on AlterNet.org and is republished with permission.


CLICK HERE TO DONATE IN SUPPORT OF DCREPORT’S NONPROFIT NEWS GATHERING, REPORTING AND OPINION

The post ‘Quickly dismantle America’: Russian State TV Hosts ‘thrilled’ About These 2 Trump Picks appeared first on DCReport.org.

Inflation Adjusted House Prices 1.4% Below 2022 Peak; Price-to-rent index is 8.1% below 2022 peak

Today, in the Calculated Risk Real Estate Newsletter: Inflation Adjusted House Prices 1.4% Below 2022 Peak

Excerpt:
It has been over 18 years since the bubble peak. In the September Case-Shiller house price index released last week, the seasonally adjusted National Index (SA), was reported as being 75% above the bubble peak in 2006. However, in real terms, the National index (SA) is about 11% above the bubble peak (and historically there has been an upward slope to real house prices).  The composite 20, in real terms, is 3% above the bubble peak.

People usually graph nominal house prices, but it is also important to look at prices in real terms.  As an example, if a house price was $300,000 in January 2010, the price would be $434,000 today adjusted for inflation (45% increase).  That is why the second graph below is important - this shows "real" prices.
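The adjustment in the example above is just a ratio of price-index levels. A minimal sketch of the calculation, using approximate CPI-U values that are my own illustrative figures, not numbers from the newsletter:

```python
# Express a past nominal price in today's dollars via a CPI ratio.
# The CPI levels below are approximate illustrations, not official data.
CPI_JAN_2010 = 216.7   # roughly CPI-U, January 2010
CPI_SEP_2024 = 315.3   # roughly CPI-U, September 2024

def real_price(nominal: float, cpi_then: float, cpi_now: float) -> float:
    """Scale a nominal price by the change in the overall price level."""
    return nominal * cpi_now / cpi_then

adjusted = real_price(300_000, CPI_JAN_2010, CPI_SEP_2024)
pct = (adjusted / 300_000 - 1) * 100
print(f"${adjusted:,.0f} (about {pct:.0f}% increase)")
```

With these rough index levels the result lands in the ballpark of the ~$434,000 (about 45%) figure in the excerpt; the exact number depends on which CPI vintages are used.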

The third graph shows the price-to-rent ratio, and the fourth graph is the affordability index. The last graph shows the 5-year real return based on the Case-Shiller National Index.
...
The second graph shows the same two indexes in real terms (adjusted for inflation using CPI).

In real terms (using CPI), the National index is 1.4% below the recent peak, and the Composite 20 index is 1.6% below the recent peak in 2022. The real National index increased in September; however, the Composite 20 index decreased slightly in real terms.

It has now been 28 months since the real peak in house prices. Typically, after a sharp increase in prices, it takes a number of years for real prices to reach new highs (see House Prices: 7 Years in Purgatory).

ISM® Services Index Decreases to 52.1% in November

(Posted with permission). The ISM® Services index was at 52.1%, down from 56.0% last month. The employment index decreased to 51.5%, from 53.0%. Note: Above 50 indicates expansion; below 50, contraction.

From the Institute for Supply Management: Services PMI® at 52.1% November 2024 Services ISM® Report On Business®
Economic activity in the services sector expanded for the fifth consecutive month in November, say the nation's purchasing and supply executives in the latest Services ISM® Report On Business®. The Services PMI® registered 52.1 percent, indicating expansion for the 51st time in 54 months since recovery from the coronavirus pandemic-induced recession began in June 2020.

The report was issued today by Steve Miller, CPSM, CSCP, Chair of the Institute for Supply Management® (ISM®) Services Business Survey Committee: “In November, the Services PMI® registered 52.1 percent, 3.9 percentage points lower than October’s figure of 56 percent. The reading in November marked the ninth time the composite index has been in expansion territory this year. The Business Activity Index registered 53.7 percent in November, 3.5 percentage points lower than the 57.2 percent recorded in October, indicating a fifth month of expansion after a contraction in June. The New Orders Index also recorded a reading of 53.7 percent in November, 3.7 percentage points lower than October’s figure of 57.4 percent. The Employment Index landed in expansion territory for the fourth time in five months; the reading of 51.5 percent is a 1.5-percentage point decrease compared to the 53 percent recorded in October.
emphasis added
The PMI was below expectations.
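For readers unfamiliar with how these headline numbers are constructed: the ISM indexes are diffusion indexes, computed as the percentage of respondents reporting growth plus half the percentage reporting no change, which is why 50 marks the break-even line between expansion and contraction. A quick sketch, with made-up response shares for illustration only:

```python
# ISM-style diffusion index: % reporting "higher" plus half of % "same".
# A reading of 50 means growth and contraction reports balance out.
def diffusion_index(pct_higher: float, pct_same: float) -> float:
    return pct_higher + 0.5 * pct_same

# Hypothetical response shares, not actual survey data.
print(diffusion_index(30.0, 44.2))  # 52.1 -> modest expansion
print(diffusion_index(20.0, 50.0))  # 45.0 -> contraction
```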

Why You Should Talk to People You Disagree With About Politics


It’s Not About Changing Minds, but Understanding Each Other

If you dared to talk politics with family or friends over Thanksgiving, you might not have changed each other’s minds. But don’t be discouraged – and consider talking with them again as the holiday season continues.

As a scholar of political dialogue, for the past decade I have been studying conversations between people who disagree about politics. What I have found is that people rarely change their minds about political issues as a direct result of these discussions. But they frequently feel much better about the people with whom they disagree.

But it’s important how those conversations go. Confrontations and arguments are not as productive as inquiry and honest curiosity.

Conversations That Make a Difference

When people sense that others are sincerely curious about what they think, asking calmly posed, respectful questions, they tend to drop their defenses. Instead of being argumentative in response to an aggressive question, they try to mirror the sincerity they perceive.

In addition to asking why someone voted as they did, you might ask about what they fear and what they hope for, what they believe creates a good society, and, importantly, about the personal experiences that have given rise to these fears, hopes and beliefs.

This curiosity-based approach has important effects on both the listener and the speaker. I have found that the listener may come to understand how the speaker could make a choice that the listener considers to be a bad one yet still think of the speaker as a decent person. The speaker becomes more relatable, and often their intentions are revealed to be well-meaning – or even ethically sound. A listener can begin to see how, given different circumstances or different ethical convictions, that person’s vote could make sense.

The speaker, too, stands to have a positive experience.

When I followed up with college students years after they participated in a dialogue session modeling curiosity-based listening, what they remembered best was their conversation partner. Students remembered that a peer they expected to attack them instead asked sincere, respectful questions and listened intently to the answers. They remembered feeling good in the person’s presence and liking them for it.

Respectful political discussions can help bridge divides and foster mutual understanding, even when opinions differ. Photo by Antoni Shkraba via Pexels

Benefits to Democracy

This type of exchange between Americans of different political stripes can provide several important benefits to democracy.

First, these conversations can help ward off the worst dangers springing from hatred and fear. I expect that gaining some understanding of others’ reasons for their vote, as well as seeing their decency, may reduce people’s support for those conspiracy theories about election results that are based on the assumption that nobody could actually endorse the opposing candidate. Such understanding could also reduce support for policies that dehumanize and disenfranchise the other side and politicians who incite violence. In short, I believe these conversations can reduce the sense that the other side is so evil or stupid that it must be stopped at any cost.

Second, these conversations can help promote the best of what democracy promises. In an ideal democracy, people not only fight for their own freedoms but also seek to understand their fellow citizens’ concerns. People cannot create a society that supports everyone flourishing without knowing what others’ lives are like and without understanding the experiences, interests and convictions that drive them.

Finally, in the rare cases that people do change their minds about politics, I have found that it is not because they were argued into a different point of view. Instead, when someone is asked sincere, reflective questions, they sometimes begin to ask themselves those questions. And sometimes, over the years, they find their way into different answers.

For example, one college student told me in a follow-up interview years after she attended a dialogue session that she had been asked, “If you say you believe this, then why did you vote like that?”

“It wasn’t an attacking question,” she recalled. “They really wanted to know.”

As a result, she confided, “I have been asking myself that question ever since.”

A pinky promise symbolizes trust, understanding, and connection, even across differences. Photo by Womanizer Toys via Pexels.

A Shared Connection

Dialogue alone does not sustain a healthy democracy. Citizen actions, not words, protect democratic institutions, our own rights and the rights of others.

But open, curious conversations among people who disagree keep alive the ideas and practices that remind us that we are all humans together, sharing a world – and in the U.S., sharing a nation that’s worth protecting.

This holiday season, let’s all commit to continuing to engage with the people with whom we most sharply disagree, with respect and dignity.

This article is republished from The Conversation under a Creative Commons license. Read the original article.



The post Why You Should Talk to People You Disagree With About Politics appeared first on DCReport.org.

ADP: Private Employment Increased 146,000 in November

From ADP: ADP National Employment Report: Private Sector Employment Increased by 146,000 Jobs in November; Annual Pay was Up 4.8%
Private sector employment increased by 146,000 jobs in November and annual pay was up 4.8 percent year-over-year, according to the November ADP® National Employment Report™ produced by ADP Research in collaboration with the Stanford Digital Economy Lab (“Stanford Lab”). ...

“While overall growth for the month was healthy, industry performance was mixed,” said Nela Richardson, chief economist, ADP. “Manufacturing was the weakest we've seen since spring. Financial services and leisure and hospitality were also soft.”
emphasis added
This was below the consensus forecast of 166,000. The BLS report will be released Friday, and the consensus is for 183,000 non-farm payroll jobs added in November.

MBA: Mortgage Applications Increased in Weekly Survey

From the MBA: Mortgage Applications Increase in Latest MBA Weekly Survey
Mortgage applications increased 2.8 percent from one week earlier, according to data from the Mortgage Bankers Association’s (MBA) Weekly Mortgage Applications Survey for the week ending November 29, 2024. This week’s results include an adjustment for the Thanksgiving holiday.

The Market Composite Index, a measure of mortgage loan application volume, increased 2.8 percent on a seasonally adjusted basis from one week earlier. On an unadjusted basis, the Index decreased 30 percent compared with the previous week. The Refinance Index decreased 1 percent from the previous week and was 7 percent lower than the same week one year ago. The seasonally adjusted Purchase Index increased 6 percent from one week earlier. The unadjusted Purchase Index decreased 30 percent compared with the previous week and was 21 percent lower than the same week one year ago.

“Mortgage rates fell to their lowest level in over a month last week, with the 30-year fixed rate decreasing to 6.69 percent,” said Joel Kan, MBA’s Vice President and Deputy Chief Economist. “The recent strength in purchase activity continues, supported by lower rates and higher inventory levels, which are giving prospective buyers more options compared to earlier in the year. The purchase index increased for the fourth straight week to its highest level since January 2024. Conventional refinance applications declined despite the lower rates, but FHA and VA refinances rebounded from a week ago.”
...
The average contract interest rate for 30-year fixed-rate mortgages with conforming loan balances ($766,550 or less) decreased to 6.69 percent from 6.86 percent, with points decreasing to 0.67 from 0.70 (including the origination fee) for 80 percent loan-to-value ratio (LTV) loans. The effective rate decreased from last week.
emphasis added
Click on graph for larger image.

The first graph shows the MBA mortgage purchase index.

According to the MBA, purchase activity is down 21% year-over-year unadjusted (due to timing of Thanksgiving - this was up sharply last week). 

Red is a four-week average (blue is weekly).  

Purchase application activity is up about 29% from the lows in late October 2023 and is now above the lowest levels during the housing bust.  

The second graph shows the refinance index since 1990.

The refinance index increased as mortgage rates declined in September but has decreased as rates moved back up.

AI and the 2024 Elections

It’s been the biggest year for elections in human history: 2024 is a “super-cycle” year in which 3.7 billion eligible voters in 72 countries had the chance to go to the polls. These are also the first AI elections, in which many feared that deepfakes and artificial intelligence-generated misinformation would overwhelm the democratic processes. As 2024 draws to a close, it’s instructive to take stock of how democracy did.

In a Pew survey of Americans from earlier this fall, nearly eight times as many respondents expected AI to be used for mostly bad purposes in the 2024 election as those who thought it would be used mostly for good. There are real concerns and risks in using AI in electoral politics, but it definitely has not been all bad.

The dreaded “death of truth” has not materialized—at least, not due to AI. And candidates are eagerly adopting AI in many places where it can be constructive, if used responsibly. But because this all happens inside a campaign, and largely in secret, the public often doesn’t see all the details.

Connecting with voters

One of the most impressive and beneficial uses of AI is language translation, and campaigns have started using it widely. Local governments in Japan and California and prominent politicians, including Indian Prime Minister Narendra Modi and New York City Mayor Eric Adams, used AI to translate meetings and speeches for their diverse constituents.

Even when politicians themselves aren’t speaking through AI, their constituents might be using it to listen to them. Google rolled out free translation services for an additional 110 languages this summer, available to billions of people in real time through their smartphones.

Other candidates used AI’s conversational capabilities to connect with voters. U.S. politicians Asa Hutchinson, Dean Phillips and Francis Suarez deployed chatbots of themselves in their presidential primary campaigns. The fringe candidate Jason Palmer beat Joe Biden in the American Samoan primary, at least partly thanks to using AI-generated emails, texts, audio and video. Pakistan’s former prime minister, Imran Khan, used an AI clone of his voice to deliver speeches from prison.

Perhaps the most effective use of this technology was in Japan, where an obscure and independent Tokyo gubernatorial candidate, Takahiro Anno, used an AI avatar to respond to 8,600 questions from voters and managed to come in fifth among a highly competitive field of 56 candidates.

Nuts and bolts

AIs have been used in political fundraising as well. Companies like Quiller and Tech for Campaigns market AIs to help draft fundraising emails. Other AI systems help candidates target particular donors with personalized messages. It’s notoriously difficult to measure the impact of these kinds of tools, and political consultants are cagey about what really works, but there’s clearly interest in continuing to use these technologies in campaign fundraising.

Polling has been highly mathematical for decades, and pollsters are constantly incorporating new technologies into their processes. Techniques range from using AI to distill voter sentiment from social networking platforms—something known as “social listening”—to creating synthetic voters that can answer tens of thousands of questions. Whether these AI applications will result in more accurate polls and strategic insights for campaigns remains to be seen, but there is promising research motivated by the ever-increasing challenge of reaching real humans with surveys.

On the political organizing side, AI assistants are being used for such diverse purposes as helping craft political messages and strategy, generating ads, drafting speeches and helping coordinate canvassing and get-out-the-vote efforts. In Argentina in 2023, both major presidential candidates used AI to develop campaign posters, videos and other materials.

In 2024, similar capabilities were almost certainly used in a variety of elections around the world. In the U.S., for example, a Georgia politician used AI to produce blog posts, campaign images and podcasts. Even standard productivity software suites like those from Adobe, Microsoft and Google now integrate AI features that are unavoidable—and perhaps very useful to campaigns. Other AI systems help advise candidates looking to run for higher office.

Fakes and counterfakes

And there was AI-created misinformation and propaganda, even though it was not as catastrophic as feared. Days before a Slovakian election in 2023, fake audio discussing election manipulation went viral. This kind of thing happened many times in 2024, but it’s unclear if any of it had any real effect.

In the U.S. presidential election, there was a lot of press after a robocall of a fake Joe Biden voice told New Hampshire voters not to vote in the Democratic primary, but that didn’t appear to make much of a difference in that vote. Similarly, AI-generated images from hurricane disaster areas didn’t seem to have much effect, and neither did a stream of AI-faked celebrity endorsements or viral deepfake images and videos misrepresenting candidates’ actions and seemingly designed to prey on their political weaknesses.

AI also played a role in protecting the information ecosystem. OpenAI used its own AI models to disrupt an Iranian foreign influence operation aimed at sowing division before the U.S. presidential election. While anyone can use AI tools today to generate convincing fake audio, images and text, and that capability is here to stay, tech platforms also use AI to automatically moderate content like hate speech and extremism. This is a positive use case, making content moderation more efficient and sparing humans from having to review the worst offenses, but there’s room for it to become more effective, more transparent and more equitable.

There is potential for AI models to be much more scalable and adaptable to more languages and countries than organizations of human moderators. But the implementations to date on platforms like Meta demonstrate that a lot more work needs to be done to make these systems fair and effective.

One thing that didn’t matter much in 2024 was corporate AI developers’ prohibitions on using their tools for politics. Despite market leader OpenAI’s emphasis on banning political uses and its use of AI to automatically reject a quarter-million requests to generate images of political candidates, the company’s enforcement has been ineffective and actual use is widespread.

The genie is loose

All of these trends—both good and bad—are likely to continue. As AI gets more powerful and capable, it is likely to infiltrate every aspect of politics. This will happen whether the AI’s performance is superhuman or suboptimal, whether it makes mistakes or not, and whether the balance of its use is positive or negative. All it takes is for one party, one campaign, one outside group, or even an individual to see an advantage in automation.

This essay was written with Nathan Sanders, and originally appeared in The Conversation.

Adultery is no longer a crime in New York State.

 Not only is jaywalking no longer a crime in New York City, the seldom-enforced criminal law against adultery in New York State has now been repealed. 

My sense is that the jaywalking ban was rolled back in part because it was inequitably enforced, while the ban on adultery was so rarely brought to trial that it was simply obsolete.

NPR has the story:

Adultery is no longer illegal in New York, By Ayana Archie 

"Adultery is no longer a crime in New York.

"Gov. Kathy Hochul on Friday signed off on repealing a 1907 law prohibiting the act.

"New York's penal law previously said that "a person is guilty of adultery when he engages in sexual intercourse with another person at a time when he has a living spouse, or the other person has a living spouse."

"It was considered a Class B misdemeanor, which carries a jail sentence of up to three months.

"The New York State Senate called the law "outdated."

#########

Interestingly, surveys indicate both that most Americans disapprove of adultery and that the frequency of adultery is quite high. So it's the law that is outdated, not the act.

Also interesting is that adultery is still forbidden under the Uniform Code of Military Justice.  This comes up in discussions about President Trump's nominee to be Secretary of Defense (where he will preside over servicemen and women who are forbidden to follow the examples of their Secretary and their Commander in Chief...)

Here's the NYT on that:

Pete Hegseth’s Mother Accused Her Son of Mistreating Women for Years  by Sharon LaFraniere and Julie Tate

"Reports of his infidelity have focused attention on his character and leadership, particularly for a civilian overseeing the military, where active-duty service members can be subject to prosecution for adultery under the Uniform Code of Military Justice."

Info Finance has a Future!

Info finance is Vitalik Buterin’s term for combining things like prediction markets and news. Indeed, a prediction market like Polymarket is “a betting site for the participants and a news site for everyone else.”

Here’s an incredible instantiation of the idea from Packy McCormick. As I understand it, betting odds are drawn from Polymarket, context is provided by Perplexity and Grok, a script is written by ChatGPT and read by an AI using Packy’s voice, and a video is produced by combining it all with some simple visuals. All automated.

What’s really impressive, however, is that it works. I learned something from the final product. I can see reading this like a newspaper.

Info finance has a future!

Addendum: See also my in-depth a16z crypto podcast (Apple, Spotify) talking with Kominers and Chokshi for more.

The post Info Finance has a Future! appeared first on Marginal REVOLUTION.

Frozen in time


China does not allow assisted reproduction for unmarried women. So now they travel to the US for egg-freezing treatments

- by Aeon Video

Watch at Aeon

Autumn Among the Galaxy Clusters


The idea of moving stars as a way of concentrating mass for use by an advanced civilization – the topic of recent posts here – forces the question of whether such an effort wouldn’t be observable even by our far less advanced astronomy. In his paper on life’s response to dark energy and the need to offset the accelerating expansion of the cosmos, Dan Hooper analyzed the possibilities, pointing out that cultures billions of years older than our own may already be engaged in such activities. Can we see them?

I like Centauri Dreams reader Andrew Palfreyman’s comment that what astronomers know as the ‘Great Attractor’ is conceivably a technosignature, “albeit on a scale somewhat more grand than that cited.” An interesting thought! And sure, as some have pointed out, nudging these concepts around on a mental chess board is wildly speculative, but in the spirit of good science fiction, I say why not? We have a universe far older than our own planet with possibilities we might as well imagine.

If we turn our attention in the general direction of the constellation Centaurus and then look not at the paltry 4.3 light-year distance of Alpha Centauri but 150–250 million light-years from Earth, we encounter a region of mass concentration that folds within the Laniakea Supercluster. The latter is galactic structure at an extraordinary level, as it takes in some 100,000 galaxies, including the Virgo Supercluster, and that means it takes in the Local Group and the Milky Way as well.

What’s happening is that this hard to observe region (it’s blocked by our own galaxy’s gas and dust) is evidently drawing many galaxies including the Milky Way towards itself. The speed of this motion is about 600 kilometers per second. Bear in mind that the Shapley Supercluster lies beyond the Great Attractor and is also implicated in the motion of galaxies and galaxy clusters in this direction. So the science fictional scenario has a civilization clustering matter at the largest scale to avoid the effects of the accelerating expansion that will eventually cut off anything that is not gravitationally bound. Cluster enough stars and you maintain your energy sources.

Image: Located on the border of Triangulum Australe (The Southern Triangle) and Norma (The Carpenter’s Square), this field covers part of the Norma Cluster (Abell 3627) as well as a dense area of our own galaxy, the Milky Way. The Norma Cluster is the closest massive galaxy cluster to the Milky Way, and lies about 220 million light-years away. The enormous mass concentrated here, and the consequent gravitational attraction, mean that this region of space is known to astronomers as the Great Attractor, and it dominates our region of the Universe. The largest galaxy visible in this image is ESO 137-002, a spiral galaxy seen edge on. In this image from Hubble, we see large regions of dust across the galaxy’s bulge. What we do not see here is the tail of glowing X-rays that has been observed extending out of the galaxy — but which is invisible to an optical telescope like Hubble. Credit: ESA/Hubble & NASA.

Recall the parameters of Dan Hooper’s paper, which posits the collection of stars in the range of 0.2 to 1 solar mass as the most attractive targets. The constraint is needed because high-mass stars will have lifetimes too short to make the journey (Hooper posits 0.1 c as the highest velocity available) to the collection zone. The idea is that the civilization will enclose lower-mass stars in something like Dyson Spheres, using these to collect the energy needed for propulsion of the stars themselves. Not your standard Dyson Sphere, but astronomical objects using propulsion that may be detectable.
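A rough back-of-envelope check makes the mass cutoff intuitive: at Hooper's posited ceiling of 0.1c, crossing several tens of Mpc takes on the order of a billion years, which sub-solar stars (main-sequence lifetimes of roughly ten billion years or more) can comfortably survive, while massive, short-lived stars cannot. The distances and lifetimes below are my own illustrative round numbers, not figures taken from the paper:

```python
# Travel-time sanity check for the star-harvesting scenario.
LY_PER_MPC = 3.26e6     # light-years per megaparsec
SPEED_C = 0.1           # Hooper's posited maximum speed, as a fraction of c

def travel_time_gyr(distance_mpc: float, speed_c: float = SPEED_C) -> float:
    """Billions of years needed to move a star distance_mpc at speed_c."""
    return distance_mpc * LY_PER_MPC / speed_c / 1e9

trip = travel_time_gyr(50)        # "several tens of Mpc"
SUN_LIFE_GYR = 10                 # rough 1-solar-mass main-sequence lifetime
print(f"{trip:.2f} Gyr to cross 50 Mpc at 0.1c; star survives trip: {trip < SUN_LIFE_GYR}")
```

The trip comes out to well under two billion years, comfortably inside both a sun-like star's lifetime and the ~100-billion-year horizon discussed below.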

Hooper doesn’t wade too deep into these waters, but here’s his thought on technosignatures:

From our vantage point, such a civilization would appear as a extended region, tens of Mpc in radius, with few or no perceivable stars lighter than approximately ∼2M☉ (as such stars will be surrounded by Dyson Spheres). Furthermore, unlike traditional Dyson Spheres, those stars that are currently en route to the central civilization could be visible as a result of the propulsion that they are currently undergoing. The propellant could plausibly take a wide range of forms, and we do not speculate here about its spectral or other signatures. That being said, such acceleration would necessarily require large amounts of energy and likely produce significant fluxes of electromagnetic radiation.

This is a different take on searching for Dyson Spheres than has been employed in the past, for in the ‘star harvesting’ scenario of Hooper, the spectrum of starlight from a galaxy that has already been harvested would be dominated by massive stars, with the lower mass stars being already enclosed. On this score, it’s also interesting to consider the continuing work of Jason Wright at Penn State, where an analysis of Dyson Spheres as potential energy extractors and computational engines is changing our previous conception of these objects, resulting in smaller, hotter observational signatures.

In the near future we’ll dig into the Wright paper, but for today it’s useful indeed, because it points to why we speculate on such a grand scale. Let me quote from its conclusion:

Real technological development around a star will be subject to many constraints and practical considerations that we probably cannot guess. While we have outlined the ultimate physical limits of Dyson spheres, consistent with Dyson’s philosophy and subject only to weak assumptions that there is a cost to acquiring mass, if real Dyson spheres exist, they might be quite different than we have imagined here.

And the key point:

Nonetheless, these conclusions can guide speculation into the nature of what sorts of Dyson spheres might exist, help interpret upper limits set by search programs, and potentially guide future searches.

But back to Hooper and the subject of Deep Time. For Hooper’s calculation is that all stars that are not gravitationally bound to the Local Group (which includes the Milky Way and Andromeda, among other things) will move beyond the cosmic horizon due to accelerating expansion on a timescale of 100 billion years. It will be autumn among the galaxy clusters, meaning that their energies will need to be harvested or rendered forever inaccessible. Our hypothetical advanced civilization will need to begin moving stars back toward their culture’s central hub. Hooper sees a civilization conducting such activities out to a range of several tens of Mpc, which boosts the total amount of energy available in the culture’s future by a factor of several thousand.

This is an application of Dyson Spheres far different from what Freeman Dyson worked with, and I agree with Jason Wright that technologies of this order are probably far beyond our current imaginings. But as Dyson himself said in a 1966 tribute to Hans Bethe: “My rule is, there is nothing so big nor so crazy that one out of a million technological societies may not feel itself driven to do, provided it is physically possible.”

The paper is Hooper, “Life versus dark energy: How an advanced civilization could resist the accelerating expansion of the universe,” Physics of the Dark Universe Volume 22 (December 2018), pp. 74-79. Abstract / Preprint. The Wright paper is “Application of the Thermodynamics of Radiation to Dyson Spheres as Work Extractors and Computational Engines, and their Observational Consequences,” The Astrophysical Journal Volume 956, No. 1 (5 October 2023), 34 (full text). I drew the Dyson quote from Wright’s paper, but its source is Dyson, Perspectives in modern physics: Essays in Honor of Hans A. Bethe on the Occasion of his 60th birthday, ed R. Marshak, J. Blaker and H. Bethe (New York: Interscience Publishers) July, 1966, p. 641.

Nicholas Bagley on DOGE

A unilateral pause won’t be as helpful as Musk and Ramaswamy seem to think. Many businesses, especially big businesses, have to certify their legal compliance to government agencies—most notably via financial reports to the Securities and Exchange Commission, where false certifications can trigger criminal penalties under Sarbanes-Oxley. Few will feel comfortable ignoring rules that are still on the books just because DOGE tells them they might someday be rescinded.

What’s more, you need smart bureaucrats to make sure that rescissions hold up in court. Under settled law, established way back in the Reagan administration, “an agency changing its course by rescinding a rule is obligated to supply a reasoned analysis for the change.” Compiling that analysis requires technical skills that agency bureaucrats will have and that DOGE will lack. Slashing the federal workforce will thus work at cross-purposes to deregulation.

Here is the whole piece, excellent analysis.

The post Nicholas Bagley on DOGE appeared first on Marginal REVOLUTION.

       


Avionics pods

Things are progressing slowly but surely. I’m probably averaging 20 hours per week of work on it, distracted only by manufacturing op-ed pieces 😂 It may not look like much physical progress, but I'm feeling WAY better about the avionics now. It's not the hull floating that worries me; it's sending a complex computer system out into nature unattended. 4,000 lines of code so far...

Alright, update time, I made some mistakes. Not the end of the world—they’re already fixed!

The original plan was to fit all the electronics into a 6-inch Triclamp mast. I knew it would be tight, but there weren’t any viable options larger than 6 inches in diameter. I designed the PCB to fit this constraint, but it was an extremely snug arrangement with components like the MPPT, BMS, Raspberry Pi, cameras, etc. Once I started assembling the flight hardware, a few unmodeled details like ribbon cable headers started interfering with the side walls. I tried to thin things out wherever possible, but it became an increasingly frustrating packaging challenge.

Old avionics packaging trials

On the healing bench

Eventually, I called it. I pivoted to an IP66 fiber-reinforced plastic enclosure, a much more familiar setup with DIN rails and cable trays. Naturally, I chose one from McMaster. The only significant loss was some welding work; everything else transitioned smoothly into the new enclosure.

I love the fact they give CAD models for everything

Viewports all bonded on 

This change made life a lot easier, and I was able to assemble a near flight-ready prototype in just a couple of nights. The biggest question mark was RF performance of the Starlink through the lid. The internet wasn’t very helpful, with the general sentiment being PUT NOTHING ABOVE THE STARLINK. I found no measurable performance loss shooting through the 4mm-thick walls of fiber-reinforced thermoset. I took the ETFE sticker off the top of the Mini and used 3M film adhesive to bond it to the roof of the enclosure lid; no going back after that move! Below, you can see the injection points from the hot runner gate system used for the Starlink Mini enclosure. As AVE would say, sounds like 30% glass-fiber PA.

Injection molding marks hiding behind the sticker on Starlink. I think my warranty might be voided!

Still needs some cleanup

The cameras were the biggest adjustment. Using 3D-printed drill guides and a hole saw, I cut dual 3-inch portholes in the enclosure. The viewports are made from ¼-inch waterjet-cut polycarbonate from SendCutSend, sealed to the box with Hysol. The cameras are mounted on ABS 3D-printed holders. After much deliberation, I’ve moved away from GoPros and fully committed to TP-Link Tapo C120 cameras. The selling point? They broadcast RTSP streams over WiFi and are rock-solid in terms of reliability. While they’re rated for indoor use, the weatherproof enclosure keeps them perfectly safe. I’m running two forward-facing cameras for redundancy—if one viewport gets obstructed, I’ll still have a backup. If I had more bandwidth, I’d consider an F1-style setup with an indexable film over the lenses for a clean view. Might need to find hydrophobic Gorilla Glass-style products for polycarbonate?

Drill guides

Love the Bambu

The camera software has been trickier than anticipated. After a few weeks of fiddling with Docker scripts, I’ve finally achieved something stable. The workflow involves capturing 1080p 30fps raw footage and storing it on an SSD during daylight hours. For streaming, I downsample to 720p 20fps with a lower bitrate to send via Starlink to YouTube Live. I’ll also compile the day’s footage into a 100x speed timelapse. Ensuring resilience to frame drops and corruption has been a priority—FFmpeg has been temperamental in this regard. I settled on Frigate as a standalone Docker container for video capture, retention, and browser-based review. It’s been a lifesaver after some earlier mishaps where the Pi’s disk filled up, causing major crashes.
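To make the downsampling step concrete, here's a minimal sketch of how the 1080p30-to-720p20 restream could be assembled as an FFmpeg argument list. The URLs, bitrate, and encoder settings are illustrative assumptions, not the actual flight configuration:

```python
# Hypothetical helper: build the FFmpeg command for restreaming a camera's
# RTSP feed to YouTube Live at reduced resolution/framerate/bitrate.
def build_stream_args(rtsp_url: str, rtmp_url: str,
                      height: int = 720, fps: int = 20,
                      bitrate: str = "2000k") -> list[str]:
    return [
        "ffmpeg",
        "-rtsp_transport", "tcp",     # TCP input is more robust to WiFi loss
        "-i", rtsp_url,               # the camera's RTSP stream
        "-vf", f"scale=-2:{height}",  # clamp height, keep aspect ratio
        "-r", str(fps),               # drop framerate to save Starlink bandwidth
        "-c:v", "libx264",
        "-b:v", bitrate,
        "-f", "flv",                  # YouTube Live RTMP ingest expects FLV
        rtmp_url,
    ]

args = build_stream_args("rtsp://camera.local/stream1",
                         "rtmp://a.rtmp.youtube.com/live2/KEY")
```

In practice the list would be handed to `subprocess.Popen` inside a restart loop, so a corrupted frame that kills FFmpeg just triggers a fresh process.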

New hardware test cart!

300W SunPower solar cells

The hardware-in-the-loop (HITL) testbed has also evolved. It started as a DIN rail table mounted to a utility cart. Now that the real batteries are installed in the hull spine, I needed a way to simulate flight conditions more realistically. Being able to detach the setup from the solar panels and wheel it into the middle of a parking lot with a clear view of the sky has been a game-changer. The Iridium module, in particular, acquires a signal much faster when it has an unobstructed 180-degree view.

Other systems are progressing well. The Iridium and GPS antennas work effectively, even blasting through the Starlink itself, which avoids the need for external SMA antennas. Load management through the Pi/ESP32 combo allows selective shutdown of power-hungry devices at night, conserving precious energy. During testing, I shorted an auxiliary channel and confirmed the Infineon smart FET’s current limiting worked as advertised—justifying the work on the PCB.

The magnetometer, IMU, and gyro are all operational. The magnetometer provides accurate heading data without GPS velocity, thanks to the MMC5983MA. The IMU and gyro are mostly for fun but might eventually inform power-saving features, like shutting off Starlink during extreme seas. The security guards were probably wondering why I was in the back lot, spinning the thing 90° every few minutes. Got to check the heading calibrations!
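The heading calculation itself is short. Here's a sketch assuming the sensor is level, X points toward the bow, and Y to starboard; the declination value for the LA coast is an illustrative approximation, not a calibrated constant:

```python
import math

# Minimal magnetic-heading sketch from raw magnetometer X/Y counts
# (e.g. the MMC5983MA). Assumes a level sensor; a real implementation
# would tilt-compensate using the IMU's pitch/roll.
def heading_deg(mx: float, my: float, declination: float = 11.5) -> float:
    """True heading in degrees [0, 360), given field components."""
    h = math.degrees(math.atan2(my, mx)) + declination
    return h % 360.0
```

A tilt-compensation step using the IMU would slot in before the `atan2`, which is presumably where the calibration spins in the parking lot come in.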

Little buddy has no idea of the journey he's going to go on....

At least there was an Arduino example I could port to Python

Board working great

Doing calibrations

The hull is still awaiting its turn at the CNC router. Foam is in hand, and fiberglass supplies are ready to go. Just waiting on machine time. *insert waiting Pablo meme 😄*

The rudder system is coming together with a simple lever-arm tiller configuration. It’s driven by an IP69K-rated stepper actuator from Tolomatic (a great surplus find on eBay). I hate fooling around with seals, better to let someone else engineer that. This gives 4 inches of stroke, driving the tiller for ±30 degrees of rudder swing. The pivot uses Rulon bushings on a ¾-inch 316 stainless shaft, welded to a slab-style rudder roughly 1 square foot in size. The keel will provide most of the straight-line stability, so the actuator will only make occasional adjustments to conserve power.
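The numbers above pin down the tiller geometry. A quick sanity check, using an idealized lever arm (actuator perpendicular to the arm at center stroke) — the arm length here is derived from the stated stroke and swing, not a quoted spec:

```python
import math

# Back-of-the-envelope tiller geometry: a linear actuator with 4 in of
# stroke (±2 in from center) driving a lever arm through ±30° of rudder
# swing. Idealized: actuator assumed perpendicular to the arm at center.
def tiller_arm_length(stroke_in: float, max_angle_deg: float) -> float:
    half_stroke = stroke_in / 2.0
    return half_stroke / math.sin(math.radians(max_angle_deg))

def rudder_angle_deg(arm_in: float, offset_in: float) -> float:
    # actuator offset from center stroke -> rudder angle
    return math.degrees(math.asin(offset_in / arm_in))
```

With the 4-inch stroke and ±30° swing, this works out to a 4-inch tiller arm, which is a convenient size for a welded slab rudder.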

Am a sucker for a good ebay deal

The battery is still going strong! 8S1P 50Ah is the final flight configuration. After 50 cycles, the cells seem to stay in balance nicely. Looking forward to someone building a nice, easy-to-integrate, all-in-one MPPT + BMS with open-source Python code...

DIY 50Ah 8S1P LiFePO4 cylindrical-cell battery for the 3-inch Triclamp spine tube

In software land, the number of edge cases causing non-recoverable faults is steadily shrinking. The long tail is mostly quality-of-life stuff around operating the camera streams and power management. I can’t run a strict time-based schedule, as the solar panels produce a different amount of energy every day as cloud cover changes. I’m right on the coast of LA, so the marine layer is fairly regular, and I’m not getting ideal power levels. The large power consumers (motor, Starlink, cameras) need to be dynamically scheduled based on the actual amp-hours of storage accumulated the day before. I wish I had room for more than 300W of solar, but at 15 feet the boat was already quite big.
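The scheduling idea can be sketched as a small budgeting function: yesterday's banked energy, minus a reserve, is granted to the big loads in priority order. The power draws, priority order, and reserve fraction below are assumptions for illustration; only the 8S1P 50Ah pack size comes from the build:

```python
# Illustrative energy-budget scheduler: grant requested runtime hours to
# loads in priority order from the energy actually banked the day before.
# Load wattages and the 30% reserve are assumed values, not measured specs.
PACK_WH = 8 * 3.2 * 50.0     # 8S1P 50Ah LiFePO4 ~= 1280 Wh nominal
LOADS_W = {"motor": 100.0, "starlink": 40.0, "cameras": 10.0}  # priority order

def schedule_hours(banked_wh: float, requests_h: dict,
                   reserve_frac: float = 0.3) -> dict:
    """Return granted hours per load for tomorrow."""
    budget = max(0.0, min(banked_wh, PACK_WH)) * (1.0 - reserve_frac)
    granted = {}
    for name, watts in LOADS_W.items():        # dict order = priority
        want_wh = requests_h.get(name, 0.0) * watts
        got_wh = min(want_wh, budget)          # trim lower priorities first
        granted[name] = round(got_wh / watts, 1)
        budget -= got_wh
    return granted
```

On a marine-layer day with only 800 Wh banked, a 4-hour motor request survives intact while the Starlink and camera hours get trimmed.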

Too much programming, not enough machining lately

The ESP32 is acting as a supervisor board, with the ability to override any load switch FET and perform a power cycle on a per-channel basis through Iridium. The board also performs automatic reboots of the Pi if its watchdog pin goes low (indicating a failure to boot).
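The watchdog side of that supervisor is simple state tracking. A sketch of the decision logic follows (plain Python for illustration; on the boat this would live on the ESP32, and the pin reader and power-cycle hook are stand-ins for the real GPIO and load-switch FET driver):

```python
# Sketch of the supervisor's watchdog check: if the Pi's watchdog pin has
# been low for longer than the timeout, power-cycle its channel.
# The 120 s timeout is an assumed value.
WATCHDOG_TIMEOUT_S = 120.0

def check_watchdog(read_pin, power_cycle, last_high_s: float,
                   now_s: float) -> float:
    """Returns the updated timestamp the watchdog pin was last seen high."""
    if read_pin():                       # Pi is alive and petting the watchdog
        return now_s
    if now_s - last_high_s > WATCHDOG_TIMEOUT_S:
        power_cycle("raspberry_pi")      # hard reboot via the load-switch FET
        return now_s                     # restart the timer after the cycle
    return last_high_s
```

Running this in the supervisor's main loop keeps the policy testable: the same function can be exercised on a desktop with fake pin readings before it ever touches flight hardware.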

Have I mentioned that I hate devops? It’s great, but seriously, the biggest pile of fiddly little problems. Tailscale is great, until you’re running a container server-side and a container boat-side, and each one needs to resolve DNS names across constant Starlink network drops. Then there’s configuring health checks to auto-reboot the container if MagicDNS goes stale. Blah blah blah, boring but important details.

Still loving the fact that I can pull up Grafana from the airplane to check on the status of my system. I finally migrated the InfluxDB and Grafana to a server on land, instead of it living on the boat. This way I get uninterrupted access to the data while the Starlink is off. Even configuring the Telegraf service to do proper disk-based buffering during blackout periods was a bit of a pain.

Next steps:

  • Hull CNC foam cutting

  • Fiberglass skin

  • Mount prop drive system / endurance submersion testing

  • Machine rudder parts / Integrate actuator

  • Finish flight wiring harnesses

  • Stop writing software

The end of oil?

It is now plausible to envision scenarios in which global demand for crude oil falls to essentially zero by the end of this century, driven by improvements in clean energy technologies, adoption of stringent climate policies, or both. This paper asks what such a demand decline, when anticipated, might mean for global oil supply. One possibility is the well-known “green paradox”: because oil is an exhaustible resource, producers may accelerate near-term extraction in order to beat the demand decline. This reaction would increase near-term CO2 emissions and could possibly even lead the total present value of climate damages to be greater than if demand had not declined at all. However, because oil extraction requires potentially long-lived investments in wells and other infrastructure, the opposite may occur: an anticipated demand decline reduces producers’ investment rates, decreasing near-term oil production and CO2 emissions. To evaluate whether this disinvestment effect outweighs the green paradox, or vice-versa, I develop a tractable model of global oil supply that incorporates both effects, while also capturing industry features such as heterogeneous producers, exercise of market power by low-cost OPEC producers, and marginal drilling costs that increase with the rate of drilling. I find that for model inputs with the strongest empirical support, the disinvestment effect outweighs the traditional green paradox. In order for anticipation effects on net to substantially increase cumulative global oil extraction, I find that industry investments must have short time horizons, and that producers must have discount rates that are comparable to U.S. treasury bill rates.

That is from a new NBER working paper by Ryan Kellogg.

The post The end of oil? appeared first on Marginal REVOLUTION.

       


Wednesday: ADP Employment, ISM Services, Fed Chair Powell Discussion, Beige Book

Mortgage Rates Note: Mortgage rates are from MortgageNewsDaily.com and are for top tier scenarios.

Wednesday:
• At 7:00 AM ET, The Mortgage Bankers Association (MBA) will release the results for the mortgage purchase applications index.

• At 8:15 AM, The ADP Employment Report for November. This report is for private payrolls only (no government).  The consensus is for 166,000 jobs added, down from 233,000 in October.

• At 10:00 AM, the ISM Services Index for November.  The consensus is for 55.5, down from 56.0.

• At 1:45 PM, Discussion, Fed Chair Jerome Powell, Moderated Discussion, At the New York Times DealBook Summit, New York, N.Y.

• At 2:00 PM, the Federal Reserve Beige Book, an informal review by the Federal Reserve Banks of current economic conditions in their Districts.

Stereo Jupiter near Opposition

Jupiter looks sharp in these two