seems quite wrong to me.
If anything the trend seems to go the other way - when I was younger pre internet most communication was face to face or voice over the phone.
Now the predominant thing seems to be text - SMS, WhatsApp, this text box I'm typing into now. I saw a stat the other day that online/app dating had gone from a minority to over 50% of how couples meet. And that is mostly a combination of some photos and text. Be able to write text or fade from the gene pool!
That said long form text may be different but those who write novels and the like were always a minority.
(source for the dating thing - not sure how accurate but kind of scary https://www.reddit.com/r/interestingasfuck/comments/1fzqgvk/...)
It isn't anymore, not for newer generations - e.g. Gen Z spending most of their time on TikTok and their phones, and not knowing how to use a word processor.
In the span of ~30 years pg is talking about I can absolutely imagine some job where you speak to the AI and it writes the documents for you and you never learned how to write one yourself. It will not be a good job but millions of people will hold it. They will not be able to write with much sophistication themselves ergo they will not be able to think with much sophistication either.
Online dating is not about writing. It was before Tinder, but it's not anymore. Like Instagram, it's about being skilled with photo filters and/or hiring a professional photographer. No one bothers to hire a profile writer - because no one reads the profile.
If the other person's photos are hawt you will click a button and the AI will send some funny jokes and if you're hawt too you'll share locations and shag. Idiocracy or some Eloi/Morlocks world will be real
>"Don't you hate Tuesdays?" "AHHHHHH"
so not really long-form essays. Maybe the future is that stuff?
This is a classic fallacy as old as society. “Whatever the hoi polloi are doing is by definition not the good stuff”. But long-term whatever the masses are doing always wins.
You know Shakespeare? He was the rube who thought plays could be entertaining to the masses. How quaint and silly, who would expect a commoner to appreciate a play. pfft.
Mozart? Taylor Swift of his day.
Printing press? Don’t even get me started, ew the commoners think they can just, like, write things down? How rude.
I’m as much an anti-fan of the short video communication trend as anyone, but it works. When bandwidth is cheap and video recording ubiquitous, video is a great medium. Who cares what you say, show me.
edit to add an uncomfortable truth: The in-crowd talks to develop ideas. What you see in writing is weeks, months, or even years behind the bleeding edge.
At no point did you address whether the shifting habits of younger generations will be bad for their literacy, instead making a general point that new trends in society are routinely panned by older members of such a society.
As a counterpoint, before radio and the phonograph, musical ability was quite widespread. Now, it's much rarer.
You haven't even attempted to address whether various developments in society and technology might do this to literacy the way earlier trends did to musical skills. I think that result is quite likely, by the way.
Fair, I was making a different point. Yes literacy might be reduced, my argument is that this isn’t necessarily a problem. Our abilities shift to take advantage of technology.
A lot like how we got really bad at memorizing long epics because we can just write them down instead.
That said, I don’t think writing/literacy will go away as much as we might fear. The new technologies are not a good enough replacement (yet?)
More going outside for you.
I think this is the article’s point - that this minority is going to shrink even more.
For most of history, writers were a tiny minority. It exploded 100x in the last few decades. If it goes down 10x, it's still way above where we were in the 1800s.
I know that the concept of dark ages is overblown, but still - something about relying on AI like this makes me think of the end of classical antiquity.
I agree with PG on this point and have noticed that people around me are often surprised when they receive well-written WhatsApp/SMS messages that include proper punctuation and other linguistic markers. Additionally, many people rarely engage in handwriting today, and handwriting is known to improve clear thinking and literacy skills.
To begin with, the following assumption is false:
>To write well you have to think clearly, and thinking clearly is hard.
For most people, most life situations which require clear thinking have nothing to do with writing.
>This is why eminent professors often turn out to have resorted to plagiarism.
What's the percentage of such professors? In the university where I studied, there has been no case of plagiarism to this day. And plagiarism is not done because professors can't write, but due to other professional factors.
>If you're thinking without writing, you only think you're thinking.
As if writing is the only way to think well/correctly/effectively. My father never wrote a word: still, some of the most thoughtful statements I ever heard in my life were told to me by him during our conversations.
When you face a situation of danger, such as a wolf running towards you: will you start to write down your thoughts about what you should do, or will you just run right away and decide on the safest paths to follow while you are escaping?
The problem with "clear thinking" is that it is subjective. I think Paul Graham and Leslie Lamport have experienced something like this: when they sit down to write about a certain topic, they realize that their initial thoughts were not nearly clear enough, and after a number of iterations they become clearer and clearer. Most of us don't write essays, so we simply don't recognize this feeling.
You: what nonsense. Clearly, B does not necessarily require A, and yet he says it does, how poorly argued.
I meant: since most life situations where we need clear thinking do not involve writing, we are obviously well equipped to think clearly.
And if thinking clearly is not that problematic for most people, then the author can't say we can't write because thinking clearly is hard, or because we can't think clearly.
Got it?
> "I meant: since most life situation where we need clear thinking do not involve writing, then we are obviously well equipped to think clearly."
That's not the QED you seem to think it is. The statement that "most life situation where we need clear thinking do not involve writing" doesn't give any reason to think that most people are good at clear thinking most of the time, nor whether people find clear thinking easier with the help of writing or if writing has no benefit to the goal of clear thinking. You're just putting two opinions you have next to each other and acting like one confirms the other.
And a friendly tip, "have I explained better what I meant before?" would come off as a lot more polite than "got it?", which to anyone who agrees with the rest of your comment could easily read as snide/patronising, while anyone who thinks you're still wrong will see it as smug and wrongly confident. (Apologies if English isn't your first language, in which case you're very good at it, and apologies if you didn't want unsolicited opinions on how your choice of language makes you seem in my view!)
edit to give an analogy: I feel your argument is like if somebody said "control of body movement is key to being a great athlete", and you replied "everyone is always controlling their body movement, clearly therefore it's not relevant to how good an athlete is".
> "have I explained better what I meant before?" would come off as a lot more polite than "got it?"
Thank you very much.
PS. English is not my native language.
Stephen Hawking is the first example that comes to mind.
He developed a remarkable ability to perform complex calculations and visualize intricate mathematical concepts entirely in his mind. He once mentioned that his ALS diagnosis, which limited his physical abilities, led him to focus intensely on theoretical physics, as it required more intellectual than physical effort.
But sure, writing (and drawing) is a great tool to aid in deep thinking. So are AI tools.
PG is obviously talking about the mental process of writing, i.e. of organizing a complex network of thoughts in a linear hierarchy that others can grasp, not the physical one.
You're correct here.
> Stephen Hawking is the first example that comes to mind.
The post is obviously speaking of the general population or at best average professional, and in my opinion choosing one of the most brilliant exceptional scientific minds of our lifetimes is not a good counterargument for a piece that speaks of a potential problem with society at large.
Strange example to pick as someone who did not write.
Stephen Hawking's thinking and imagination wouldn't have meant much had he not finally penned them down for others to read, and neither would his ideas have been taken seriously had he chosen to make tiktoks or podcasts to explain them instead.
You have committed the Fallacy of the Inverse.
Most of us have neither the intellect of Hawking nor his situation.
Sure, some will thoughtlessly copy and paste, but for many, AI helps to structure their thoughts, and they think more clearly as a result.
a) No / little data: Whenever you are starting to think about a subject, you can ask it to give you a structure / categories.
b) Existing data: What I do very often is give it a lot of "raw data", like unstructured thoughts or an unstructured article, and then ask it to find suitable top categories.
For me it’s very important to emphasize that AI is a tool. You have to use it responsibly. But there is no reason not to use it.
Until it's not.
I'm not the type who'd say "don't use AI". Use whatever works. Myself I became really fascinated by transformer LLMs / GPTs in winter 2019, then again when ChatGPT was published and a good few months after that.
It's just that my interest and enthusiasm have almost vanished by now. Surely it will reemerge at some point.
This observation of Paul Graham's may generalize beyond writing: modern technology appears to split populations into bimodal distributions - for example, those who write/consume human-written prose and those who produce/consume AI-generated prose; those who can afford human medical doctors and those who can only afford to consult ChatMedicGPT or Wikipedia; those who can afford human teachers for their children and those who let EduGPT train them, etc. Generally speaking, I expect a trend where more affluent people will use higher-quality human services and the rest have to live with automation output.
It's interesting to think of humans as being like a premium service where AI's are a sort of knock-off/budget human service.
Unlike younger generations, who are growing up surrounded by AI-generated content, many of us older folks have had the experience of engaging directly with people and evaluating their competence. We developed a knack for quickly determining someone's skill level through just a few minutes of face-to-face conversation—a skill that was essential for navigating various life situations.
Now that anyone can use AI to generate seemingly competent text, videos, and more, and as in-person interactions decline, the conditions that once allowed us to gauge competence are fading. I worry that in the future, no one—including AI trained on our outputs—will be adept at making these assessments.
Those of us who take time to carefully compose arguments and revise them, as Paul suggests, will have a better handle on this, so that's a helpful consideration.
I worry strongly about a future like that in Idiocracy[1], where nobody has a clue about how to actually judge competence, and instead goes with the best sound bites.
The one path out that I can see, and it's unlikely, is to teach the skill of explicitly tracking history, and reviewing how well someone predicted the future, over time.
The explicit generation and curation of a reputation is part of that priceless nexus that they'll all be seeking in future generations, and yet it'll pale in comparison with the ability to size someone up in a few minutes of interaction.
- Only a brain chip could make AI usage undetectable in practice. Without that you can tell if the person is checking his phone etc. Though you're right that an in-person interaction will be needed, otherwise there's no way of knowing what the other person is doing or if he's a real person at all... And since the latter problem (dead internet) will only grow, perhaps beyond the rectifiable, in-person communication will surely be in business again.
- Once AI replacement of competent humans has reached a certain threshold, what do you stand to gain from testing a human's level thereof? Are you interviewing for "above AI" positions? If not, relying on AI will be as normal as relying on a calculator.
I think I have a bit of this knack, in some areas, tempered by an awareness of some of my blind spots, but most people don't even claim to have this knack...
As evidence from our own field: before the explosion of LLM cheating, we had the explosion of Leetcode hazing.
Because, supposedly, good experienced software developers couldn't plausibly recognize each other just by talking with each other.
So instead we whip out these douchetastic did-you-prep-for-this rituals. And afterwards you still have no idea what the other person would be like to work with (except that you now know both of you are willing to play fratbro nonsense games).
I been intentionally changing up my candor so that people
Who get caught up in the structure, lose the message
If u know you know
Just look at the quality of presidential debates and political discourse we've been having for the past decade. Not just in the US, but all over the world. The situation is perilous.
Lagniappe: https://www.youtube.com/watch?v=EwPnJXXX5Ic
As someone said, it's no wonder The Matrix chose the 90s as the peak of human civilization.
What is the mechanism you propose, by which the birth of a child makes every person now living incrementally dumber?
(Kids, if you ever see this, I'm not saying it wasn't worth it. But seriously, 2AM every night for months?)
IQ, (from “intelligence quotient”), is a number used to express the relative intelligence of a person.
So for the whole population it is constant by definition :)
And heading in the direction of Dick and Jane doesn't mean that we ever reached it; according to https://community.jmp.com/t5/image/serverpage/image-id/8926i... recent SOTUs should be comprehensible to high school freshmen.
(full discussion at https://community.jmp.com/t5/JMPer-Cable/Regression-to-model... ; through 2018 — anyone have 2022?)
EDIT: keep in mind that the expected "general" audience for the SOTU has also expanded dramatically due to technological change in between 1790 and 2018...
It's interesting to be usually cautious but then predict something so radical, and yet with no real argument other than "AI is gonna replace us".
Painting should have been replaced by photography, but it hasn't been. In my opinion, there are still plenty of people who want to write, so there will still be plenty of people who know how to write.
And maybe my opinion is wrong, because it's an opinion. But to transform it into a certainty, I'd have to see much, much more data than a feeling and a conviction.
Writing may become the same thing. In the workplace, if someone is writing, they're probably doing it for their own entertainment. Some people write at home, writing journals, blogs, etc. Nobody will know that you're writing, unless it affects your thinking, and your thinking affects your work.
I think we already reached the stage where people stopped writing, before AI entered the picture. I rarely see anybody write a lengthy report any more. Reports have been replaced by PowerPoint, chat, e-mail, etc. One consequence is that knowledge is quickly lost. Or, it's developed by writing, but is communicated verbally.
Hopefully I'll live the couple of decades it will take to find out if PG's prediction is correct; I would bet against it.
Here however, I do agree with his articulation -- "writing is thinking" -- and like you, I've thought a bit about the linear nature of writing.
My view is that the "jumble" of ideas/concepts/perspectives is just that -- a jumbled mess -- and the process of linearizing that mess requires certain cognitive aspects that we (humans) generally consider as constituting intelligence. IMO, the rapid generation of grammatically-correct + coherent linear sequences by LLMs is one reason some folks ascribe "intelligence" to them.
I liked his analogy about how the disappearance of widespread physical work meant that one now had to intentionally invest Time and Effort (at the gym) to maintain physical health. The facile nature of LLMs' "spitting out a linear sequence of words" will mean fewer and fewer people will continue to exercise the mental muscles to do that linearization on their own (unassisted by AI), and consequently, will experience widespread atrophy thereof.
I suspect thinking is similar, which brings up questions about LLMs as well. We all can now quickly write hundreds of generic business plans, but knowing what to focus on first is still the hard part.
I'm seeing that happen today with corporate documents (there's always that one enthusiast in each team who says "oh, let me improve that with [LLM]", and it's a slog to go through pages and pages of things that could be a bullet point). Quality has been trumped by mediocre quantity, and the cluelessness of the people who do this willingly baffles me.
As someone who's been writing pretty much constantly for over 30 years and both uses AI code completion to speed up first drafts (of code) but switches off everything except macOS's (pretty good) word completion for prose--and absolutely refuses to use AI to generate work documents or even my blog post drafts--this post was a bit of a "oh, so this would be the ultimate consequence of keeping all of it on" moment.
Accelerating writing (with simple word completion) is fine. But letting the AI generate entire sentences or paragraphs (or even expand your draft's bullet points) will either have you stray from your original intent in writing or generate overly verbose banalities that only waste people's time.
I use iA Writer to draft a lot of my stuff, and its style checks have helped a lot to remove redundancies, clichés and filler, making my prose a bit more cohesive and to the point. That's been around for ages and it's not LLM-style AI (more of a traditional grammar checker), but that sort of assistance seems to be missing from pretty much every AI "writing aid"--they just generate verbose slop, and until that is fixed in a way that truly helps a writer LLMs are just a party trick.
Edit: I just realised that I wrote about this at length in February - https://taoofmac.com/space/blog/2024/02/24/1600#the-enshitti...
During the typewriter era, anyone's ability to produce pages and pages of text was limited by their ability to type. Nowadays, you can copy/paste large blocks of text and thus inflate documents to enormous sizes. Which works just like sand in the gearbox of the decision-making process.
I wonder if the future of ESG, DEI and such is that one AI will produce endless reports and another AI will check if the correct buzzwords are used at the correct frequency. And instead of yearly reports, they could easily become daily or hourly reports...
It would be a way to tout "allyship" on the social networks without actually doing anything substantial.
* Less than 1% of your writing will be life-changing.
* 3% will be trivial to write.
* 4% will strongly resonate with others in a way you didn’t expect.
* 5% will be quite good.
* 15% probably should’ve never been published.
* 26% will elicit a reaction you did not expect. Positive or negative.
* 28% will become vastly better because you chose to edit.
* 30% will start as one piece but finish as another.
* 40% will be good solid writing.
* 45% will do much worse than you expect when published.
* 60% of your writing will never be finished. Be ok with that.
* 100% of your writing is worth your time.
Just curious, but do you think my collective 20 years of random posts across 10+ social media platforms, often with thousands of posts per platform, has been worth my time?
May be hard to answer. My view from the same behavior: yes, worth it.
Was everything productive for a career? For a relationship with someone else? Sometimes yes. Sometimes no.
Would love to hear your thoughts on my first question!
Sounds like some of the 15% or some of the 28%. :)
But the real issue is people who are not already engaged with and knowledgeable about what one another is doing - the key moment when a non-tech person needs to discuss a tech need with someone from the tech developer sphere: can they even communicate? And I'm not talking through a salesperson, but actually discussing what one needs and what one provides without resorting to empty jargon. Real communication needs no jargon and does not use jargon; it modifies itself to be understood by the audience, using the audience's terms.
This is critical in the coming decades: learn to communicate - professionally communicate. I'm not talking about being a media talking head; I'm talking about learning how to speak to anyone, anywhere, of any stature. It's a critical skill, and it is damn well needed now, as well as tenfold in our fast-approaching future.
I am, for example, very good at math and reasoning etc. But when I write something I tend to construct long complicated sentences (probably because I think that way^^) and the result would often be considered badly written.
Now of course you can feel superior for your better writing style. If it makes you happy ;)
All of those are related to language, because our thinking (and also math and logic) is based on language.
But just because we think in language does not imply that writing is the only form of reasoning. It is OK if it is your preference, and it certainly has value - like other things.
Writing and clear thinking are related in at least two ways:
1. Both good writing and good thinking require structured thought, which is the ability to organize and categorize ideas in relation to each other.
2. Both good writing and good thinking require meeting other minds: expressing your ideas in such a way that they are also comprehensible to someone who doesn't share your brain. This is important because it's entirely possible to _think_ you have a good thought without actually having a good thought. The ability to articulate a thought is an effective discriminator between the two scenarios.
It's probably possible to think clearly without writing, if these two elements are still present. But writing well is an effective forcing function, which is why they're so closely related.
The "people are mostly fine" part is true. People were mostly fine at the time when only the elites could write, but society will not be the same. We are moving back towards those old times.
Maybe you deal with the intellectual elite from the best schools and haven't noticed[1], but adult literacy, even among the college educated, has been on a downward trend for some time, and we can see some results. Normal interactions in the corporate world are more difficult. People in middle-level management can't explain and articulate things as well as they used to. There is more communication, but it's not as efficient. One well-written report every two weeks used to be enough. Now you need to be on a Zoom call for 2-3 hours weekly for the same thing.
Mediocre or low-quality writing is mentally taxing to read. If what you read is grammatically correct AI-slop, it really kills all interest in reading and communicating in writing.
[1] Only about 10% of adults have PIAAC adult literacy level 4 or 5.
With language however we are attempting to refer to some kind of underlying meaning or reality, and an LLM will not give you the understanding of the things being referred to - only the outward representations themselves. If you are indeed interested in meaningless exchange of symbols a la Searle's Chinese Room then perhaps there is some utility in this; otherwise it's the act of digesting and comprehending meaning, and the internal cognitive processes that go into the production of the written artifact that matter, not just the written artifact itself.
Writing is different. It is more akin to the left leg, when the right leg is your thinking. I am a weird combination of an author and a mathematician-turned-programmer, and both these skills are very useful in either activity; I wouldn't be able to program half as well if I didn't dump my messy ideas to a doc first, especially when the model is far from straightforward.
Writing things down is a huge feedback loop for thinking. Plenty of edge cases stick their head out of the doc once you write everything down.
I wonder if some people would argue that it _was_ disastrous though?
We seem to struggle with things like putting people on the moon and keeping airplanes from failing, compared to 50 years ago.
Maybe there was some critical mass of mental maths skills that engineers had in the 60s that we've lost now? Are we still inventing things at the same rate as before?
Not that, say, doing long division in your head isn't a valuable skill, mind.
https://en.wikipedia.org/wiki/Rubber_duck_debugging
As a person who frequently uses writing to focus my thoughts, I don’t see why one would see writing as the only way to focus their thoughts.
I'd disagree with "half" here because I can't imagine it being anywhere close to 50/50. I expect a power law distribution: most won't be able to write well. The ones who do will have a massive advantage, in the same way that those who can concentrate in our age of distractions have a significant advantage over those who can't.
We're already in a poverty of quality communicators. All that nonsense with Bitcoin was fast talking nonsense that sounded plausible. This is what happens when real communications breaks down: fraudulent technical products surrounded by a word salad of abused language and people afraid of looking stupid so they never ask for clarifications of the gobbledygook.
I recently published a [major philosophical work][1] that is the result of decades of thinking and three months of writing. I’m not a native English speaker, and although I know what I want to say, I often don’t know how to write it. I may not know or can’t find the right terms or phrasing, or I might make grammar mistakes. Sometimes, I can describe my ideas in a clumsy way, and I need help refining my sentences.
So, I use AI. I think, write my thoughts in my own way, and then work with AI to bring them closer to what I want. It's hard work. Although AI can be an amazingly good writing partner, it often alters my text in ways that change the meaning completely. Even replacing a single word with a synonym or adding a comma can turn a sentence into something totally unintended. It can take a lot of back-and-forth work to find the right paragraphs. Still, AI is a tremendous help, and my work would have been more immature and unpolished without it, even if it sometimes feels a little artificial.
Of course, it’s much more ideal to master English fully and to practice writing until it feels natural. But AI helps with that, too.
The infrastructure of the internet has matured enough now that we don't have to talk to each other in ASCII characters any longer. Being online will increasingly mean using your voice and face (or suitable synthetic alternatives) to talk to the rest of humanity. Not like TikTok, though, with its algorithmically driven mental corruption. And not like YouTube, with its copyright-oriented business model. More like early-days Twitter. Just everyday people talking to each other. But video. And realtime.
Doesn't video add a dimension of friction over sending text? For one, you can't scan a video the same way you can scan text.
The biggest obstacle to my job is when I need to convince other people of a course of action. People do not do what I ask of them, because people need a nice narrative. Indeed the reaction of others to my writing is so negative I can get into disciplinary proceedings just for writing facts. Yet, when the shit hits the fan, and people need a leader and a plan, suddenly people do what I ask and are successful, presumably looking past my communication issues because of the imminent and larger threat.
I am Autistic. I look forward to the day when my email and slack client, my JIRA client, automatically take my statements of fact and turn them into something neurotypical people won’t have a social signaling reaction to and respond to as a threat.
Also, thinking can also be doing.
And while I’m here, there’s plenty of people who can write and not think. And more who cannot think critically. Electrolytes: it’s got what plants crave.
People would start writing more and getting better at it with the help of LLMs. This would create a positive feedback loop that would encourage them to write more and better. An LLM should be used as a tool to improve productivity and quality of output. Just as we use a computer interface to write faster and to move and edit text, instead of using a pencil and an eraser, today you can use an LLM to improve your writing. This will help people get better at organizing their thoughts and think more clearly, instead of replacing the thinking.
If I were teaching at a university, I think I could not assign essay grades until I sat down with every student for a quick Q&A on the subject of their paper. That would reveal the plagiarists, charlatans, and lame non-thinkers.
Everyone uses voice interfaces, so writing is looked on as an antiquated habit.
What's this article about? Why should I read it?
(I read it)
This post's title does not echo some of the points in the article. Good human-sourced writing will be hard to do/find?
Well, don't perpetuate the problem with a title that doesn't tell me why to pay attention.
If AI trains on generated text, it is the equivalent of an incestuous relationship, and I don't have to elaborate on the analogy further.
As opposed to the writing at one end, these thoughts are rooted in an experience, an emotion, and this association never disappears. Add the above act on them, and it is like writing.
Maybe the author is referring strictly to communication or to the art of writing.
If, on the other hand, you need to generate a bunch of content for a school paper, blog, etc., then it's immensely helpful.
But already there's a large number of people cheating at those tasks.
So, we could also say we'll have the Thinkers and the Cheaters.
Sadly, I don't expect the Thinkers to be the Haves.
Why read a book when we can have an idea distilled in a quick infographic, a short-form video, or a pithy tweet? I love a deep-dive book that lets you immerse yourself in an idea and study it from multiple angles, done masterfully in Dune or Thinking, Fast and Slow.
But are we losing that chance to really contemplate given the speed at which more information is being thrown at us across every form factor?
maybe someone would have even pointed out to him from what activity the peripatetics derived their name. but alas!
He didn't need to write that article. He is already rich. So why did he do it?
Every day HN is full of articles where people have done some amazingly complex thing, entirely for fun. Then they have written a blog post about it, entirely for fun.
Then we get one of the familiar detachments from reality
> In preindustrial times most people's jobs made them strong. Now if you want to be strong, you work out. So there are still strong people, but only those who choose to be.
Except for all of the people whose jobs still make them strong. Scaffolders, tree surgeons, bricklayers, carpenters, et al.
I need to write a document next week. I have begun to analyse a complex system that ChatGPT will not be aware of. I need to apply my specialism to it, to decide what to do with lots of steps. Writing it will test my understanding of the system and encourage completeness. It will allow others to know what they are expected to do. It will allow a constructive discussion of the choices and reasons. ChatGPT won't help me, except perhaps with layout, rephrasing a sentence, something like that. My job will keep my writing muscles strong. Paul Graham has lost touch with reality.
I would say that it's quite the opposite! The more prestigious the job, the more likely the person will have one or many assistants to help them write.
Think of presidents, governors and CEOs. They must *read* much more than they write. Their response can fit on a Post-it attached to the paperwork.
The next level also reads more than they write. Instead of a post-it, they will probably come up with bullet points which will be fleshed out by people below them.
The people who *really* have to write stuff are the people at the *bottom* of the hierarchy.
Writing well could be a way to go up the ladder. But it is definitely not required at the top.
What will change, in the future, is that *everyone* will have assistants.
(1) Due to computer-based word processing and spelling and grammar correction, writing, and good writing, are much easier now than before personal computers (PCs). Indeed, a quip is that typewriters killed off ink pens, and PCs killed off typewriters; writing got easier and likely better, not less common. People got a lot more practice.
(2) Email, Internet blog posts, and other communications generate more writing. Can we find some data on total US email volume and compare that with old USPS mail volume, letters to the editor of newspapers, etc.?
(3) Now there is a lot of competition for good writing: At Hacker News, bad writing, especially from bad thinking, gets downvoted. On Web sites using Disqus, part of getting voted up is being clear and short, maybe just one sentence, maybe sarcastic, clever, and humorous; that is, succinct, to the point, and maybe fun, and that means in some respects better writing. Maybe Disqus could tell us how Internet blog post writing volume has increased? A lot?
(4) Media delivered via the Internet and Web sites is now much cheaper to produce than old newspapers, magazines, and TV news, and I'd guess that the total of media, as writing or as oral reading of what was written, is much greater than before. For the future, I anticipate many more words per day, i.e., more writing. Uh, has the writing at Hacker News been going down, up, or staying the same? There is Facebook, X, Reddit, Wikipedia. There are sites for narrow interests. Sounds like a lot more writing.
(5) People have smartphones with them nearly all the time; net, that can mean more communications, and writing is less intrusive on the receiver than in-person voice. Whether for good STEM-field communications or just mature socialization, to avoid being misunderstood or offensive, good writing is important.
(6) Sure, now some Google searches result in AI answers, and for some simple questions the AI answers can be a little okay. But, I don't take the AI answers seriously, and the old Google search facility works fine and, also, of high importance, gives the URLs for the search results.
(7) Looking back at my writing, from personal letters to academics, on-line political discussions, etc., I see no way AI could help -- the AI writing is worse, not better.
(8) Since supposedly Taylor Swift is now worth $1.6 billion, there can be increased interest in guitar playing and the claim that such music is based almost entirely on the four chords I, IV, V, and VI. So, my niece wanted to know, and I wrote her an essay: 6,200 words with several YouTube URLs with pictures and sound. For some music with a lot of chords, I included this URL:
https://www.youtube.com/watch?v=UZsqnHhyub0
Sorry, but for my niece I found nothing nearly as good as what I wrote and believe that AI would be a poor substitute.
(9) The world is changing, especially related to writing, at likely a uniquely high rate, and no AI training data can report today what is new tomorrow and needs good writing.
“It's not surprising that conventional-minded people would dislike inequality if independent-mindedness is one of the biggest drivers of it. But it's not simply that they don't want anyone to have what they can't. The conventional-minded literally can't imagine what it's like to have novel ideas. So the whole phenomenon of great variation in performance seems unnatural to them, and when they encounter it they assume it must be due to cheating or to some malign external influence.” - https://paulgraham.com/superlinear.html#f12n
There you have it folks. The genius Paul Graham is one of a select few people with the ability to have ideas, something which those who disagree with him are simply incapable of comprehending.
My takeaway: people who know how to write, who have trained that muscle, are better at thinking in a structured way and articulating their thoughts. The number of people who know how to write is declining, at least in part due to the advent of GenAI. The number of people who know how to write is still nonzero and is not limited to only Paul Graham.
But there are a lot of others who never liked to write; they do not need it for their job, so why shouldn't they use GPT as a tool, like the promised land of AI / robots?
The same will happen with cooking: people who like to cook will keep cooking traditionally even after our incoming household robots are able to do it.
> Instead of good writers, ok writers, and people who can't write, there will just be good writers and people who can't write.
> writing is thinking. In fact there's a kind of thinking that can only be done by writing
> So a world divided into writes and write-nots is more dangerous than it sounds. It will be a world of thinks and think-nots. I know which half I want to be in, and I bet you do too.
PG states, clear as day, that he expects the world to be divided into people who can think (him) and people who can't (almost everyone else). When I say Paul Graham imagines only he can think, this is hyperbole. I'm sure there's a small group of people with views very similar to his to whom he would also attribute the ability of thought. I am commenting on the clear and undeniable pattern of PG writing that huge swathes of the population are incapable of thinking.
https://xkcd.com/610/ about sums up my views on his attitude.
Tf