
Topic

incentives

7 pieces

Revolutionary tax proposal #1

· 193 words

Revolutionary tax proposal #1: anything above $100m/year is taxed at 90%, but in exchange for political equity in the country (ie: delegation and voting). It formalizes lobbying and makes it transparent. To prevent the rich from steering the country too hard in their favor, there can be provisions where legislators, citizens, and oligarchs have checks and balances. Ie: to put it childishly simply, each could have a 33% stake in directing that taxation. Another way to think of it is forced investment in pre-approved pro-American funds, companies, etc.

TBF: I have little sense of what I'm talking about in these matters. But the general context behind this is that power dynamics organically took control of the country and defied the spirit of the founding architecture. I assume there are many examples of how the Constitution and its amendments did not protect the original vision. And so the principle is to understand how power actually moves and work with it; don't kill it or shun it, but formalize it into legal structures, make it transparent, and then force it to comply with specific standards that muzzle and channel its wolf power.

Institutes vs. Institutions

· 370 words

When we say we "distrust institutions," we're pointing at the wrong thing; it's the institutes that are withering. We use these words interchangeably, but I think the separation clarifies.

An "institution" is an abstract, permanent, inter-generational primitive—like education, marriage, the free press, the essay—while an "institute" is a concrete embodiment that serves it. Think of an institution as a societal organ. Think of institutes as the specialized tissue that keeps the organ functioning and regenerating.

As generations turn, new sets of people are handed down the great responsibility to protect and evolve institutes through the storms of time and technology. Without upgrading our institutes, society goes through slow-motion organ failure, with phantom pains and spiritual malaise that can't be traced back to the source. Schools still look like schools, but everyone is cheating through a Homework Apocalypse, and suddenly we have all sorts of cultural cancers that seem inevitable. Institutes are the civic building blocks of a sane society, and yet we glorify unicorns who create "value" but feel no responsibility for their dying elders.

Institutes operate through the inverse of market logic. Where startups are designed to accrue all of the upside, an institute is sacrificial, designed so society gets the upside, even at its own peril. Of course they swim in the same water, but institutes swim differently: they have opposite answers to questions on how to steer, what to make, where to focus, who to include, and when to stop. An attempt at some principles:

  • mission-driven, not market-driven;
  • timeless contributions, not self-serving content;
  • involved in ecosystem building, not niche extraction;
  • active members, not passive users;
  • century-long legacy, not liquidity through an exit.

Usually an institute comes from patronage: you can’t resist market currents unless you’re supported by endowments, donations, foundations, tuitions, grants, and such things. You can’t start an institute in your garage, but now with AI and the collapse of cost, I suppose you could try. So many of the one-person AI company fantasies are about a single founder reaching a billion-dollar valuation, which is the cheapest form of ambition there is; the better question is around the scale and spirit of cultural impact achievable by a one-person micro-institute.

→ source

Quality Algorithm

· 437 words

“The Internet needs a quality algorithm.” This was the opening line of my essay prize announcement, and I want to revisit it now that it's done. Is there a correlation between writing quality and audience size? 

Algorithms are low-trust right now because they’re adversarial—“for you” gaslighting (usually)—and they reward engagement, popularity, monetization, etc. The 2010s-era algorithms are based on discrete events: clicks, likes, measurable things. They might look at keywords to guess the topic of an essay, but they’re effectively blind to the overall quality of a piece. Quality is nebulous, after all. Small magazines can each have their own vision of what’s good, but for a million/billion-person network, there’s no consensus, and quantity is way more important anyway.

So this essay competition was a v1 attempt to define and search for quality. The overall search space was small, but it was a chance to experiment with curation, and it resulted in The Best Internet Essays 2025. It’s interesting to me that the featured writers ended up varying in audience size, distributed fairly evenly across the 10s, 100s, 1,000s, and 10,000+ subscriber ranges.

Again, it’s a limited sample, but interesting to ponder: the tangible thing (reach) follows a power-law distribution (1% have big audiences), while the intangible thing (quality), the thing that matters more, is independent of scale. It means that for every great, highly visible writer with a 10k audience, there are possibly 100x as many writers of similar caliber who remain undiscovered, in algorithmic obscurity.

This isn’t too surprising, and the usual reply is, “well it’s not enough to write well, it’s your responsibility to be consistent, to be your own marketer and publicist, to make sure your work gets read.” I get that this is what’s been required, but what if it weren’t? Wouldn’t it be better if a platform could search for quality at scale so writers could just do their thing? This would also give visibility to those who aren't full-time writers, people who publish 1-2 essays per year around the interesting problems they’re working on, but have no bandwidth to build an audience each week.

I still have to think through v2, the 2026 prize, but the question in my mind is how to expand the search space. Can I have agents scan the Internet, assemble RSS feeds to find great essays, design an algorithm to filter for the previously intangible, build community into the process, and then curate/share the stuff that comes through? The aspiration is to get better each year at surfacing great essays from independent writers on the basis of merit, and this book is what came through the first pass.
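To make the pipeline concrete, here’s a minimal, hypothetical sketch in Python. The essay data, the quality scores, and the `surface` function are all invented for illustration; in practice the quality score is the hard, unsolved part, and would come from some mix of models and human curation. The point is only the shape of the idea: filter and rank on quality, while ignoring audience size entirely.

```python
from dataclasses import dataclass

@dataclass
class Essay:
    title: str
    url: str
    subscriber_count: int  # deliberately unused in ranking
    quality: float         # 0-1, from some future quality model / curation

def surface(essays, threshold=0.8, top_n=3):
    """Keep essays above a quality threshold, ranked by quality alone."""
    good = [e for e in essays if e.quality >= threshold]
    return sorted(good, key=lambda e: e.quality, reverse=True)[:top_n]

# Hypothetical feed: note the best essay comes from the smallest audience.
essays = [
    Essay("A", "https://a.example", 12_000, 0.91),
    Essay("B", "https://b.example", 40, 0.95),
    Essay("C", "https://c.example", 300, 0.55),
]
picks = surface(essays)  # B outranks A; C is filtered out
```

The design choice worth noticing: reach never enters the ranking function, which is exactly the inversion of an engagement-based algorithm.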

→ source

Do paid subscribers influence discovery on Substack?

· 546 words

Chris Best, founder of Substack, posted that they caught “President Plump,” the #1 growing account on Substack, using fake subscriptions to boost discovery. I think this was intended to comfort everyone that they caught a scammer (justice!), but actually it confirmed what many were starting to notice: discovery is contingent on you making money. If you have paid subscriptions turned off, no algorithmic wind will blow your way. But if you have a spike of paid subscribers in a month, suddenly your old posts will start to go viral, in hopes that even more paid subscribers will bring the platform its 10% cut (this has happened to me before). This isn’t inherently bad. For every President Plump, there is an earnest person trying to finance their creative project.

But at scale I fear it creates a bad pattern, because the accounts that everyone sees will be the ones making the most, and generally these will be marketers and growth hackers more than artists. I think you will find better writing in the gutters of Substack than on their rising leaderboard. If authentic culture emerges outside of monetization, then there’s a real rift between what Substack wants to be (“an engine for culture”) and what it actually is (an algorithm that only rewards monetization).

I think the best we can do is use this information to our advantage. For example, I could have new Essay Club members pay directly through Stripe, but by handling payments through my Founding Members tier on Substack, I get a discovery boost, which is worth the 10% fee. Similarly, if you make small digital products, it might make sense to bundle them into a subscription instead of charging per item.

Should you use a credit card masking service to give yourself 20 paid subscriptions at $5 each? Depends. Basically, for a net cost of about $10/month (you pay $100 into your own subscriptions, and roughly $90 flows back to you after Substack’s 10% cut), you can buy a probably noticeable increase in discovery. The question is, will you get caught? Maybe they are on the lookout now, but my guess is they would only penalize it at a certain scale. Sam Kriss speculated that President Plump was paying himself around $5,000 per month to reach #1. I’ve never done this, and wouldn’t necessarily recommend it unless you have a hacker mentality and really need the growth.

At the very least, you should consider having paid subscriptions turned on. Cate Hall found success in charging $1/month and getting to #1 rising. Our very own Yehudis Milchtein also set up $1/month subscriptions and is now #91 rising in literature.

However you approach this, it brings up a bigger question for me on how to build a real engine for culture. It seems like you can’t have an algorithm for a single reward (popularity or money) or else they will be gamed; instead you could give everyone curatorial power relative to their cultural reputation, however you measure that. For example, if we all trust Ted Gioia, then somehow Ted’s like should count more than 10,000 bot likes or $1,000 in fake subscriptions.
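The reputation-weighted idea above can be sketched in a few lines of Python. Everything here is hypothetical (the names, the scores, and how reputation would be earned or measured are all invented); it only illustrates how one trusted curator’s like could outweigh a swarm of bot likes:

```python
def weighted_score(likes, reputation):
    """Sum each liker's reputation weight; accounts with no earned
    reputation (bots, fresh sockpuppets) contribute ~0."""
    return sum(reputation.get(user, 0.0) for user in likes)

# Hypothetical reputation table, however such a thing might be earned.
reputation = {"ted": 500.0, "reader1": 1.0, "reader2": 1.0}

bot_likes = [f"bot{i}" for i in range(10_000)]  # unknown accounts
one_trusted_like = weighted_score(["ted"], reputation)
ten_thousand_bot_likes = weighted_score(bot_likes, reputation)
```

The obvious open problem this sketch dodges is the reputation table itself: who assigns it, and how you stop *that* from being gamed in turn.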

I hope this triggers more transparency from Substack on how their algorithm works, and also hope for a new generation of platforms where each person has visibility into and control of the thing that is routing them information.

Software Incentives

· 449 words

One of the thrills of the AI revolution will be how it untangles software from bad incentives. Today, software is expensive to build and maintain, and so it needs returns to fund itself. The big social media companies have annual expenses of $50m-$50b; they are in no position to operate from virtues, or to deliver on their stated aspirations of “connecting the world,” because they need to optimize for attention and convert it to revenue to fund the ridiculous scale of the operation.

But now we’ve hit the point where autonomous coding is real: Claude’s Opus 4.5 can code for 30 hours straight. I am currently “rebuilding Circle,” the community platform, except not as a platform, but as a single customized instance for my community (Essay Club). I am maybe 4 hours in and halfway done. Circle wanted $1k/year, so I built my own with a $20/mo Cursor subscription.

When you can just prompt software into existence, you don’t need fundraising, an expanding team, and all the sacrifices that come with capital. Software can start reflecting the will of visionaries, rather than the exploited psyches of the masses. Of course, AI coding will also enable huckster bot swarms to sell Candy Crush clones and other brain rot variants, but more importantly I think we’re entering a new era of techno-activism.

Millions will use their weekends to spin up apps, sites, tools, platforms, and networks, not for the sake of colonizing the planet’s attention, but for the sake of gift-giving or mischief-making or culture-shaping. It could mean that we shift our attention from hyper-commoditized feeds to mission-driven places.

Today, I think a single person could spin up a million-person writing-based network for under $100k/year (my guess is that’s <0.2% of Substack’s cost). If you clone something exactly (like Twitter>Bluesky), there’s little reason to switch because you lose the network effects. But the oozification of code & interface means that we can start experimenting with better social architectures. How might a network built for human flourishing actually function? A novel concept paired with a small critical mass (just a few hundred people) might be enough to trigger a cascade of platform switching.

The irony is that AI coding is only possible because big companies have been able to amass extreme amounts of capital, resources, and data, but in doing so they’ve released something that could erode their own monopolies on attention, the last scarce resource. Now I think it comes down to what people decide to build. If everyone can build anything, will we each try to build our own empire of extraction, or will we contribute to a culture we want to live in ourselves?

→ source

Why doesn't Substack create funds for its on-platform creators?

· 232 words

I didn’t realize that Substack is open about paying off-platform creators to join their platform. See their $20m accelerator fund. My quick understanding is that, if you make $X revenue/year elsewhere, they guarantee you’ll make that, and will make up the difference if, after a year, you don’t. A friend thinks there’s an additional secret fund that pays bonuses for celebrities to join (ie: Dolly Parton, Charli XCX). I was surprised by how articulate Charli XCX was—I only have a meme-level understanding of her—but I suppose it’s possibly ghostwritten. Idk.

I don’t have a problem with this, but what doesn’t register to me is why they wouldn’t allocate money to help the on-platform, original writers. Obviously, these kinds of things piss off 95% of their userbase. Even if there were something like $100-$1m for on-platform writers with audiences under 1,000, that would build a tremendous amount of goodwill. My guess (and fear) is that they have a business model blindness, and aren’t thinking along the lines of “what actually builds organic culture?” Instead, there’s a lot of rationalizing: “here’s why bringing Derek Thompson on platform is good for you” (but the obvious benefit comes from the 10% they get from DT).

It’s weird to me that, in some sense, I’m giving more to its existing writers ($10,000) than the platform that raised $100,000,000 does.

Substack's business model blinders

· 200 words

Just heard Hamish (on a livestream) say that Substack is a revolution, a “found economy,” that materialized 5 million paid subscriptions that wouldn’t have existed otherwise. What is a revolution though? I think I want to zoom into this positioning, because many words are being used interchangeably. Yes, it’s a new business model for monetization, but is that a “cultural revolution”?

It feels like there’s a bit of a fixation on the 10% mechanism, and the risk is that this reward function turns Substack into LinkedIn in the next 3 years. If the goal is to make a “culture engine,” you need to really ask what a culture is. If your culture is limited to paid subscriptions, it’s a small, unrepresentative, utilitarian culture, much more slanted toward journalism and business tactics, regardless of an editorial attempt to bring a flair of literature.

We need to define culture (in terms of taste, values, and quality), and then make platform design decisions that have nothing to do with revenue. Of course, I’m not saying to abandon the revenue focus; I’m saying they need to allocate some percent of their attention to “doing weird things” to prevent a writer exodus as enshittification strengthens.