Tuesday, June 29, 2021

GitHub Copilot

Nat Friedman:

We spent the last year working closely with OpenAI to build GitHub Copilot. We’ve been using it internally for months, and can’t wait for you to try it out; it’s like a piece of the future teleported back to 2021.

GitHub (Hacker News):

With GitHub Copilot, get suggestions for whole lines or entire functions right inside your editor.

[…]

GitHub Copilot is available today as a Visual Studio Code extension. It works wherever Visual Studio Code works — on your machine or in the cloud on GitHub Codespaces. And it’s fast enough to use as you type.

[…]

GitHub Copilot works with a broad set of frameworks and languages. The technical preview does especially well for Python, JavaScript, TypeScript, Ruby, and Go, but it understands dozens of languages and can help you find your way around almost anything.

inimino:

Cargo-cult programming has always been a problem, but now we’re explicitly building tools for it.

fzaninotto:

I’ve been using the alpha for the past 2 weeks, and I’m blown away. Copilot guesses the exact code I want to write about one in ten times, and the rest of the time it suggests something rather good, or completely off. But when it guesses right, it feels like it’s reading my mind.

It’s really like pair programming, even though I’m coding alone. I have a better understanding of my own code, and I tend to give better names and descriptions to my methods. I write better code, documentation, and tests.

Francisco Tolmasky:

I think one reason things like GitHub Copilot don’t resonate with me is that when I do want code written for me, it’s for an interesting enough problem to merit a library (like CodeMirror). I just don’t find myself in these glue code purgatories that these demos aim to eliminate.

IOW, I feel like I’ve been living the dream of “having the hard stuff done for me” for ages now. It’s called, ironically enough, @github and @npmjs … and it’s awesome! It’s often thoughtfully encapsulated in a nice API, as opposed to copy/pasting from stackoverflow… at scale?

Update (2021-07-02): Feross Aboukhadijeh:

I’ve been testing #GitHubCopilot in Alpha for the past two weeks. Some of the code suggestions it comes up with are eerily good.

Here’s a thread with some examples that I found surprising.

Alexey Golub:

Gonna spend an entire day working in VS Code w/ #GitHubCopilot today. Curious to see where this will take me✨

For starters, here it was able to infer the usage of CliWrap from previous lines and apply it to solve an entirely different problem. All based on a single comment 🤯

Marcel Weiher:

As far as I can tell, it’s an impressive piece of engineering that shouldn’t exist. I mean, "Paste Code from Stack Overflow as a Service" was supposed to be a joke, not a product spec.

Matt Diephouse:

GitHub Copilot could definitely be helpful while implementing a custom Collection in Swift.

Or an Encoder or Decoder.

Or to work with Strings.

Maybe Copilot could provide analytics about which APIs people find difficult to use and what operations they want to perform?

See also: Dave Verwer, Hacker News.

Update (2021-07-06): Patrick McKenzie:

I’m probably more bullish on this product than my model of most programmers. Contrary to naive expectations, it doesn’t decrease demand for programmers; it probably decreases unproductive time of junior programmers stumped by the “white page problem.”

For many years folks, often non-technical, have mentioned tauntingly “Wait until you automate programmers out of a job” and that was the exact opposite of what happened when we introduced cutting edge AI like compilers and interpreters to liberate programmers from programming.

20 Comments

Interesting that this kind of thing is possible, but is it useful? Do I wish I could type a little faster yet? Sure. That I would have to type less boilerplate? I guess.

Have I been clamoring for a tool that writes the boilerplate for me? No, not really. No, I don't want my text editors full of code no human ever wrote. I want less code total, not less code that has been handwritten.

A tool that would be far more useful (for me), but look less impressive in a demo: one that, like this one, lets me start writing a method header, then finds a library package (SwiftPM, NuGet, whatever) that matches those requirements, and lets me confirm adding a reference to it.

Old Unix Geek

I really dislike this.

People spend their free time mastering some topic, and give it to others as Open Source. Then Microsoft comes in and "trains" an ML system using that open source to displace the people who mastered those technologies.

It reminds me of the bad old days when some of us would invent a new way to make the limited computers of the time do something that no one believed could be done. Then some punk kid would steal our code, and without even understanding it would claim it as his own and sell it.

It's obvious that "training" an ML system is not a creative activity, but a form of copying. GPT-3 outputs entire sentences directly lifted from its training corpus. This is the same. And it's obvious that now that a computer can "program", the value of having mastered these things will quickly fall to zero in people's minds.

That's called theft: taking that which was not offered. "Fair use" is just a convenient excuse. Fair use for criticism isn't the same as taking someone else's work and reusing it. Most fair use in ML is for training a classifier (oh look, there's a dog there). This generates code, it doesn't just recognize what it does. If this is legal, so is training a neural network to reproduce a movie frame by frame. Of course all the benefits will flow to a few at the expense of the many who spent time and money to learn things.

What's even more amusing is that even though these fields will be undervalued, the people who made it possible will still be asked to fix things that are wrong. They just will be expected to fix it for less remuneration. That means no new people will enter the field. Eventually the whole thing will be unmaintainable, like the French telephone system, where no new "engineers" know how to fix the old analog system that most of rural France is connected to. So Orange has to employ the old engineers to fix their network, but they do so by subcontracting the work, so that the old engineers get paid a whole load less. As they retire, universal access to the network dies.

Of course, any security holes in the training corpus will be spread all over the place. And the cheap generic copy-pasta engineers pressing tab won't understand why. So that's great too.

Overall, another win for those who think everything can be deskilled, as long as it's cheaper. But it is incredible Microsoft thinks they can do this, so soon after Oracle and Google fought over whether Java's API is copyrighted.

I've always felt we needed fewer, but better trained engineers working on a few useful things. Instead this sort of thing guarantees that the innumerate pink haired weirdos who can't spell, and never were corrected at school, for fear that their self-confidence might suffer, end up writing most of the awful software yet to be produced this century. Enjoy it as we move to "3rd world" levels of quality where turning on a light in a 5 star hotel causes the entire light fixture to fall off the wall, and no one is the least bit surprised.

Kevin Schumacher

GPT-3 also advised a simulated patient to commit suicide when it was tested as a medical chatbot for mental health issues, so it's clearly doing something more than just parroting back training input.

Your rant about theft is completely off-base, as is your general theme of targeting millennials for everything that's wrong with society. (For the record, I'm not a millennial, in case you think I'm offended for being the target.) You also seem to have a pretty significant misunderstanding of what machine learning is.

Specifically regarding "theft" (which is not what is happening when you are respecting the license attached to the code being used, as was done here), GitHub notes about a 0.1% regurgitation rate, and is adding the ability for the system to tell you when something is actually a verbatim match. You can read for yourself how that figure was arrived at. https://docs.github.com/en/github/copilot/research-recitation
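For context, a measurement in that spirit can be sketched as follows: count a suggestion as a "recitation" when it shares a long-enough verbatim token run with some file in a reference corpus, then report the fraction of suggestions so flagged. This is only an illustrative toy, not GitHub's actual methodology; the tokenizer and the run-length threshold here are invented.

```python
import re

def tokenize(code: str) -> list[str]:
    # Crude tokenizer: identifiers, numbers, and single punctuation marks.
    return re.findall(r"[A-Za-z_]\w*|\d+|\S", code)

def longest_shared_run(generated: list[str], source: list[str]) -> int:
    # Longest common contiguous token run, via simple dynamic programming.
    best = 0
    prev = [0] * (len(source) + 1)
    for g in generated:
        cur = [0] * (len(source) + 1)
        for j, s in enumerate(source, start=1):
            if g == s:
                cur[j] = prev[j - 1] + 1
                best = max(best, cur[j])
        prev = cur
    return best

def is_recitation(snippet: str, corpus: list[str], min_run: int = 8) -> bool:
    # Hypothetical threshold: 8 shared tokens in a row counts as a recitation.
    toks = tokenize(snippet)
    return any(longest_shared_run(toks, tokenize(doc)) >= min_run
               for doc in corpus)
```

A regurgitation rate, in this toy framing, would simply be the fraction of sampled suggestions for which `is_recitation` returns true.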

Some of your other points may be more salient, but they're lost amongst the shock-and-awe words (theft! copying! steal!) and curmudgeon mentality.

Old Unix Geek

Since I work on ML, I think I know whereof I speak. But thank you for the "education", random person on the internet who has read a marketing web page.

I hate to break it to you, but not all Millennials have pink hair. Do you feel insulted because you have pink hair? Or did you bring up a child who opted to match my description? In either of these cases, you might not have improved the world.

Many Millennials are well educated. Some are even a delight to work with, just as is true of any other generation. It's a particular type of poorly educated, self-confident, indeed arrogant person I don't like. They're often found militating for social change rather than technical excellence. And despite what you say, the fact that their egos were saved by avoiding correcting their mistakes mostly reflects badly on the people who brought them up: their teachers and parents. Those are the generations that preceded them, not them. I'm surprised that this isn't obvious to you.

Similarly, I very much doubt that anyone who wrote code and put it on github expected it to be used in this manner. I certainly didn't. The fact this tool generates code rather than recognizes it moves it outside the domain of fair use as far as I am concerned. Having your own work taken and used to compete against you strikes me as theft, to the same degree as violating copyright is theft. Unless you explicitly put code in the public domain, it remains copyrighted. Making an AI recognize dogs from copyrighted pictures of dogs does not seem like a copyright violation to me: the photographer is unharmed by the action. On the other hand generating photo-realistic pictures of dogs that could compete with dog photos for sale would be.

I also never suggested this tool always quoted code verbatim. But that's no protection from copyright violations. If it were, people could take GPL'd code and rename a couple of variables and function names. A better measure is whether the compiled code is significantly different. For instance, many GPL license violations are found by looking at the trace of system calls through the program: it's not good enough to rename variables, the structure of the code must be different. I'd expect this tool to cause this type of violation.

Indeed, I wouldn't expect this tool to generate quoted code verbatim most of the time. It will sometimes, but that's not how you design ML systems. You essentially set them up to separate independent features, such as variable names. Then when you're generating output, it is unlikely the same variable name will be chosen again. So direct quoting would be expected to occur rarely.
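The renaming point is easy to demonstrate: canonicalize identifiers to placeholders before comparing, and two snippets that differ only in variable names come out structurally identical. This is a toy alpha-renaming check, not any real license-compliance tool.

```python
import re

def normalize(code: str) -> list[str]:
    # Map each distinct identifier to a stable placeholder (v0, v1, ...),
    # leaving numbers and punctuation untouched; whitespace is ignored.
    tokens = re.findall(r"[A-Za-z_]\w*|\d+|\S", code)
    names: dict[str, str] = {}
    out = []
    for t in tokens:
        if re.fullmatch(r"[A-Za-z_]\w*", t):
            names.setdefault(t, f"v{len(names)}")
            out.append(names[t])
        else:
            out.append(t)
    return out

a = "def total(xs):\n    acc = 0\n    for x in xs:\n        acc += x\n    return acc"
b = "def sum_list(items):\n    s = 0\n    for i in items:\n        s += i\n    return s"
print(normalize(a) == normalize(b))  # prints True: the two differ only by renaming
```

A verbatim-match metric would count `a` and `b` as different; a structural comparison like this (or the system-call traces mentioned above) would not.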

I don't like this tool as it moves power from programmers to Capital. Because it is more efficient there will be little choice as to whether to use it or not. But it will continue the tendency of replacing good code by mediocre "good enough" code. Expect even more bloated applications. It will also change the dynamics of code creation for the worse. It will however benefit capital and therefore will be adopted. Among other things, the result will be a further reduction in social mobility for those who were born bright but to the wrong parents. It will also reduce innovation because, just like frameworks, it will narrow choices to what has been done before. This is not good. I don't see much upside from this "innovation".

Anyway thank you for your poor reading comprehension, your preconceived opinions, and your gratuitous insults... Always a pleasure to read.

I feel like complaining about this is a bit like complaining about code completion in general. We had these same discussions around 2000: oh no, Visual Studio makes writing code too easy, we need to keep the young people out, their hair has the wrong style and there are too many of them, and some of them are even women! Our wages will go down! We'll have to shower! The humanity!

If you don't want to use it, just keep writing code in vi, or whatever tool was cool when you were young. It's fine. Everybody else can benefit from this.

Then, in twenty more years, it'll be our turn to complain about how GitHub Copilot is the god-given correct way to write code, and all of these fancy new augmented brainjack coding tools are a thoughtcrime and must be abolished.

@ Plume: I do understand that complaining about this has vibes of gatekeeping, but nah. Code completion saves you keystrokes on code you would already know how to type. (Well, you might get the spelling wrong, but you have a rough idea of what code you're writing.) This feature seems to outright implement entire algorithms for you — while, according to your VCS history, making you the author.

> If you don't want to use it, just keep writing code in vi, or whatever tool was cool when you were young.

Sorry, but that's exactly as reductive a take as you're accusing others of.

Old Unix Geek

@Plume

I think this is a much bigger change than code completion. It will reduce the amount of thought going into software. Not remembering how to spell a function name does not impede the main work: understanding how best to express a problem. This will, since this kind of autocomplete will force people down beaten paths.

Also, this tool will work best on popular languages such as JS, and the lack of it for other languages which might be better for actual engineering will make using those languages even less appealing to management. That too changes the world for the worse.

But ultimately I think it solves the wrong problem. Writing code is a lot easier than reading and understanding it. But this tool will help those who are not even able to write it fluently... leading to vast amounts of uncomprehended source code. I once took over code that worked, but made no sense. The author of it had clearly tinkered with it until it did what he wanted. I had to rewrite a large part of it to get rid of all the security flaws and infinite loops lurking in it. That was over two decades ago. My manager never managed to convince his bosses that the rewrite was needed, and that it saved the company a lot of grief.

What I'd like are tools that help me understand unfamiliar code bases faster. Unfortunately that's a much harder problem to solve than generating code.

I don't much care for your caricature. I've only ever met one guy who might fit it. Of course women code. My wife codes. Unfortunately "hair-style" correlates really well with a certain kind of attitude. Blue-haired weirdo or pink-haired weirdo is a no doubt clumsy shorthand for that attitude. If you have a better shorthand, I'm all ears. I'm not sure why this correlation exists. Perhaps you can enlighten me on that topic. The only stereotypes you did get right about me are that I use vim (because it works on almost every platform I've used, unlike emacs), and that I don't use code completion, because without repetition my mind doesn't learn function names properly. Once I've learned APIs, I type fast enough for it not to matter.

Kevin Schumacher

Feel free to quote my gratuitous insults, because there were none, unlike in your rants. But it’s clear you have zero interest in actually discussing anything honestly and more interest in contradicting yourself (“GPT-3 outputs entire sentences directly lifted from its training corpus. This is the same.” “I also never suggested this tool always quoted code verbatim.”) and flinging mud about people you don’t like (“Instead this sort of thing guarantees that the innumerate pink haired weirdos who can't spell, and never were corrected at school, for fear that their self-confidence might suffer…” ironically followed by, in a later comment, “I don't much care for your caricature”), so I’m not even going to bother trying. Meanwhile if you have anything valid to say, no one would ever know it.

Old Unix Geek

@Kevin Schumacher

"This is the same" meant, just like GPT-3, it can quote code verbatim. Indeed it can as you pointed out. Perhaps I could have said so more explicitly.

I'm surprised you consider "rant", "off-base", "targeting millennials", "you have a significant misunderstanding of ML", you have a "curmudgeon mentality", "you have zero interest in discussing anything honestly", and "if you have anything valid to say" to be compliments. They seem like insults to me.

Feel free to tell me what you would call the people you say I describe so badly. Ignorami? I admit freely that I do not like working with people whose expertise, if I were to buy it for what I estimate it to be worth, and if I were to sell it for what they estimate it to be worth, would make me a tidy profit. Does anyone?

Numeracy, the ability to think, and spelling should be default requirements in professional engineering circles. Of course I'd make reasonable exceptions for issues such as dyslexia. How very old-fashioned of me to have standards. Again, I don't see why you take this badly. The lady doth protest too much, methinks.

For those of us who can read fast, bad spelling is highly annoying. To get a taste of why, please read the following:

A PLAN FOR THE IMPROVEMENT OF ENGLISH SPELLING
By M.J. Yilz

For example, in Year 1 that useless letter "c" would be dropped to be replased either by "k" or "s", and likewise "x" would no longer be part of the alphabet.

The only kase in which "c" would be retained would be the "ch" formation, which will be dealt with later.

Year 2 might reform "w" spelling, so that "which" and "one" would take the same konsonant, wile Year 3 might well abolish "y" replasing it with "i" and iear 4 might fiks the "g/j" anomali wonse and for all.

Jenerally, then, the improvement would kontinue iear bai iear with iear 5 doing awai with useless double konsonants, and iears 6-12 or so modifaiing vowlz and the rimeining voist and unvoist konsonants.

Bai iear 15 or sou, it wud fainali bi posibl tu meik ius ov thi ridandant letez "c", "y" and "x" -- bai now jast a memori in the maindz ov ould doderez -- tu riplais "ch", "sh", and "th" rispektivli.

Fainali, xen, aafte sam 20 iers ov orxogrefkl riform, wi wud hev a lojikl, kohirnt speling in ius xrewawt xe Ingliy-spiking werld.

Drunken Dogcow

Nobody else is concerned about the privacy implications of streaming their codebase in real-time to Microsoft's servers?

Old Unix Geek

@Kevin Schumacher

Now I get what you're saying. You thought that by stating "GPT-3 outputs entire sentences directly lifted from its training corpus." I meant it only does that. I didn't. I meant it does that sometimes. As in "Kevin kicks a ball" doesn't mean that's all he does. Of course GPT-3 does more than that. I assumed that was obvious. But you're right, my message wasn't clear. Sorry for that confusion.

"I think this is a much bigger change than code-completion. It will reduce the amount of thought going into software."

I'm not necessarily saying that it is the exact same thing. What I *am* saying is that people are making the exact same arguments against it. "It will reduce the amount of thought going into software" is exactly what people said about Intellisense.

As far as I can tell, this thing basically does one thing: problems people now solve by going to Google and copying something from SO will be solved by just inserting the code directly into the editor. That's it. It saves a bit of time.

People will still have to actually look at the code, and adapt it to their use case. They'll still need to solve the unique problems from scratch. Just like they do now.

Typing text isn't what makes programming hard, so removing the need to type text in some cases won't be what makes it trivial.

"I don't much care for your caricature"

I would like to write something pithy here, but it's just odd to me that you're so quick to write these incredibly demeaning posts that basically discount a whole generation of developers, and then even quicker to complain when you feel that people respond to you in kind.

I should also note that the comments I made weren't even about you, they were actual things I heard in the 2000s (minus the "now I have to shower" thing, but actually including the "women joining the field are destroying software engineering" thing). I was pointing out that this is the same kind of "the next generation is the worst generation" thing that happens all the time, in all areas of life, and in 99.999% of cases, it's bs. Kids are eating Tide pods! Kids are planking too much! Kids are using AI to write code for them! Kids are playing violent video games! Kids are praying to Satan when they role play! Rock 'n' roll music is making kids' clothes too tight!

Don't worry, the kids will be okay.

It's now just your turn to be the person complaining, instead of the person others complain about.

>What I *am* saying is that people are making the exact same arguments against it. "It will reduce the amount of thought going into software" is exactly what people said about Intellisense.

Be that as it may, just because skepticism has been applied to different technology and turned out to be unfounded before doesn't mean it'll be unfounded for this particular technology.

>As far as I can tell, this thing basically does one thing: problems people now solve by going to Google and copying something from SO will be solved by just inserting the code directly into the editor. That's it. It saves a bit of time.

The hypothetical scenario here is obvious, though: things that you might have previously thought about and/or tried for a while before realizing "I don't know how to solve this; I'm going to do some research" are now a keystroke away.

It's nice that you can type "send_tweet_with_image" and get a working implementation. But it gives the illusion that this solves all your problems. Is that code bug-free? (Would there be fewer bugs if you had implemented it yourself, or more?) Is it secure? What about your privacy policy? What are its performance characteristics? And many more.

IntelliSense is nothing like that. There's a difference between calling an API and implementing a method body.
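To make the contrast concrete, here is a hypothetical sketch of the kind of completion being discussed: a plausible-looking `send_tweet_with_image`. The payload shape and parameter names are invented and the network call is stubbed out; the point is that none of the questions above are answered merely because such a function appears.

```python
import base64

MAX_TWEET_LEN = 280  # the real tweet length limit; everything else here is made up

def send_tweet_with_image(text: str, image_bytes: bytes) -> dict:
    """Build (but do not send) a hypothetical tweet-with-image payload."""
    if not text or len(text) > MAX_TWEET_LEN:
        raise ValueError("tweet text must be 1-280 characters")
    payload = {
        "status": text,
        "media": base64.b64encode(image_bytes).decode("ascii"),
    }
    # A real implementation would POST this to an API here. Is the upload
    # retried on failure? Is the image stripped of EXIF location data?
    # What are the rate limits? Accepting generated code answers none of that.
    return payload
```

The sketch "works" in the sense of the demo, which is exactly the illusion being described: the visible part is done, and the hard questions are the invisible part.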

> People will still have to actually look at the code, and adapt it to their use case.

Yes, but maybe they won't.

If I have to start something "from scratch" (obviously, no code these days is actually written from scratch), I'm forced to put much more thought into difficult questions. If the code is already "there", and might only need minor adjustments, I might not bother.

If you write a memo yourself, you'll think about it a lot more than if your assistant does and asks you to merely review it.

>Typing text isn't what makes programming hard, so removing the need to type text in some cases won't be what makes it trivial.

That's absolutely true. But this doesn't reduce typing. IntelliSense reduces typing. This reduces thinking.

> they were actual things I heard in the 2000s (minus the "now I have to shower" thing, but actually including the "women joining the field are destroying software engineering" thing).

Alas, that last one can still be heard today, in 2021.

>Kids are eating Tide pods! Kids are planking too much! Kids are using AI to write code for them! Kids are playing violent video games! Kids are praying to Satan when they role play! Rock 'n' roll music is making kids' clothes too tight! Don't worry, the kids will be okay.

Again, though, brushing all concerns with the same stroke is facile.

(And I enjoy video games, but the jury is still out on whether first-person shooters are a) damaging, b) neutral, c) actually useful to control aggressions, particularly for the underdeveloped brain of a teenager. I would love to live in a world where we have plenty of data on just how swear words and gory violence and depictions of sexuality affect children, but we don't; there are still a lot of open questions. There's some bigotry and fear of change to it as well, absolutely, but also a lot of unknowns.

Many fears about technology are entirely unwarranted. Some of them will sort themselves out by improving on the technology. But some are also simply correct, and the tech turns out to be bad. I'm sure there were skeptics about putting asbestos in walls long before there was sufficient research that, yes, actually, that was a terrible idea and we need to stop doing that.)

"Again, though, brushing all concerns with the same stroke is facile."

I don't think so. I think at some point, you need to accept that your complaint is part of a pattern that repeats itself ad nauseam, and has been proven wrong each time. It's a natural pattern of how humans think of the generation that comes after them, and it has more to do with concerns about social change than with actual factual issues.

"And now, since you are the father of writing, your affection for it has made you describe its effects as the opposite of what they really are. In fact, it will introduce forgetfulness into the soul of those who learn it: they will not practice using their memory because they will put their trust in writing, which is external and depends on signs that belong to others, instead of trying to remember from the inside, completely on their own."

That doesn't mean that you're definitely wrong, it just means you need better evidence to support your point, because I feel like the odds aren't on your side.

"the jury is still out on whether first-person shooters are a) damaging, b) neutral, c) actually useful to control aggressions"

True, but that's kind of beside the point. We *do* know that any effects are difficult to detect, despite extensive research, which means that they must be small. So the moral panic caused by violent videogames was not warranted. Kids playing a lot of Doom in the 90s didn't cause a surge of violent behavior.

Likewise, I'm not saying that this doesn't change how people write code. I'm just saying that the concerns are vastly overblown. This will help a lot of people. It will make some people's code worse, but that's okay, too. Things will be just fine.

Old Unix Geek

@Plume

TIL: criticizing a certain type of programmer is criticizing an entire generation or two.

Perhaps try to understand what I said? At least try to steelman it? Think?

But, to your point, you have no basis other than induction to say the kids will be ok. Just like a turkey the day before Thanksgiving. In fact there are many reasons to believe they won't be. And it will mostly not be their fault: the bedrock of our civilization is collapsing.

For instance, germane to this discussion, one of the keystones of our particular civilization is property rights/contract law. Niall Ferguson calls it a "killer app". It's what people in the West rely on instead of the trust of family/tribal ties (see the book The WEIRDest People in the World). But what with patent trolls, lawsuits over APIs, repeated plagiarism such as that produced by this tool, civil forfeiture by the "lawful authorities", the growing power of Apple over independent developers, etc., it is clear that property laws are not what they once were. If you ever write a book for a large publisher, you'll discover that reusing a single sentence from another book, even if it is a technical note written to encourage the use and purchase of a computer chip, is considered plagiarism. Everything you write is scanned to see if it matches an existing work.

Another keystone is meritocracy. Many schools now say it's evil. But it's what has let poor but intelligent working class kids do better than their parents and rise in society. Despite the UK being portrayed as a class society, it was one of the places that had the highest levels of social mobility at the time, which is one of the reasons the Industrial Revolution and the British Empire could happen there.

The new utopian dream is that all wealth will be redistributed to all, à la Star Trek. The USSR and China both tried to impose that idea by force. It didn't work out, as I saw for myself after the Cultural Revolution. The historic precedent is that we end up with some form of feudal system. I'm not a fan of that outcome.

Another keystone has been growth. Our monetary system depends on it if interest is to exist. Well, growth has stalled. Very little new has been invented since the 1960s. One might not notice in IT, since Moore's law has enabled many old ideas such as neural nets to be realized, but it's very noticeable in Physics for example. Energy produced is flat and EROI is falling. (Energy Return on Investment: how much energy it takes to extract new energy) The very wealthy are currently harvesting what excess wealth is still lying around. (see Steve Keen's work). The middle and working classes are getting squeezed.

Lack of growth is similar to lack of heat in a physical system: there are fewer degrees of freedom. That means fewer opportunities for people to try out new things, and gain resources. Historically, in no-growth environments there was no monogamy: Rich men had many partners. Poor men had none. This could be our long term future, and something that's already happening in China today for example.

Oh well, so much for our civilization. What about moving some other place? Well, a keystone of all civilization is a stable climate. Civilization has only arisen in the last 10,000 years, precisely because the climate was stable and large-scale agriculture became possible. That too is changing. The heat dome in the Pacific Northwest and Canada, the 50 degrees C in Siberia, the heatwave in India, and the melting of Arctic ice aren't a fluke. Worldwide emissions of methane are up, suggesting that the clathrate gun is primed. James Hansen is predicting fewer than 500 million people on the planet by 2100. And if you're thinking of escaping further, Mars doesn't even have a climate.

Our civilization is rather like that condo in Florida: still standing while its foundations are severely weakened, until one day it suddenly collapses.

There is one thing we could do to restart growth and potentially fix climate change: develop fusion with more alacrity than we have been doing so far. And use that energy for carbon capture. (If energy is released by making CO2, it takes energy to break up CO2). Unfortunately fusion is not something one can tinker with at home in a spare room, unlike steam engines or petrol engines. That's a real bottleneck to invention.

Some of you might say I'm ranting again. If so, your definition of "rant" is whatever does not conform to your world view.

>I don't think so. I think at some point, you need to accept that your complaint is part of a pattern that repeats itself ad nauseam, and has been proven wrong each time. It's a natural pattern of how humans think of the generation that comes after them, and it has more to do with concerns about social change than with actual factual issues.

I really don't understand why you keep bringing up generations. This is about increased use of technology. And yes, sometimes technology is misguided. Like, say, when it takes the climate so far out of whack that a town in Canada burns down. That sorta thing.

As a trendline, technology keeps getting better. But it doesn't follow that every piece of it is good.

>True, but that's kind of besides the point. We *do* know that any effects are difficult to detect, despite extensive research, which means that they must be small.

It doesn't mean that at all. It could also mean that our research has been poor, such as because we haven't found a good way to define a control group. Or that effects take years or decades to manifest.

(I'm not making arguments one way or another. I've played my fair share of first-person shooters, including as a preteen. Did that have positive, neutral, or negative effects? We don't really know yet.)

>Likewise, I'm not saying that this doesn't change how people write code. I'm just saying that the concerns are vastly overblown. This will help a lot of people. It will make some people's code worse, but that's okay, too. Things will be just fine.

I'm not saying this will be the end of the world.

I do, however, think that it's worth discussing in a thread about, well, this very topic.

>Another keystone is meritocracy. Many schools now say it's evil.

Do they, though? Or is it rather that they're saying it's mostly horseshit, since whoever defines the extremely vague concept of "merit" is likely biased?

If we could define, objectively, who has the most merit, then of course those people should rule over that. But we can't.

Old Unix Geek

Do they, though? Or is it rather that they're saying it's mostly horseshit, since whoever defines the extremely vague concept of "merit" is likely biased?

To some extent I agree with that, if by "meritocracy" you mean what the rich and powerful often claim as their justification for being wealthy. Or even what the politically astute and well-connected claim as reasons for their success.

I should probably have been more precise: by meritocracy, I mean that the most competent should be given more power/resources/time of day. Privileging competence over birth made the modern world. That is closer to the original meaning of the term, which was coined in a book of speculative fiction that foretold its downfall ("The Rise of the Meritocracy" by Michael Young).

If we could define, objectively, who has the most merit, then of course those people should rule over that. But we can't.

This is where I disagree, assuming that by merit you mean competence.

Competence can be measured objectively. For instance, there is one correct answer to a math problem. There are many ways of getting to it, and a competent teacher does not mark down a student who came to it another way. However, there are usually infinitely many incorrect answers. Similarly, there are usually not that many ways to spell a word correctly, but many ways not to. Some physicians save their patients, others don't. Some car mechanics fix cars so that they stay fixed; others cause more problems.

However, society needs to maintain a consensus that competence is important. It seems we are losing that. Indeed, it now even seems fashionable to debate whether objective reality exists. (Hint: hungry and sick people don't ask themselves such questions.)

For instance, parents might be delighted that more of their children now graduate from high school. Politicians reap the rewards in votes. However, parents shouldn't be happy if their child's graduation was achieved by lowering standards (e.g., Physics A levels in the UK are much less demanding than they used to be). If more people graduate because standards were lowered, graduation means less, and a bright young person will need to spend even more time and money to distinguish him or herself as competent. Only if the level is actually improving (say, in comparison to other countries) should parents be pleased and conclude that the education system is working.

Similarly, it is the job of a functioning media to serve as society's memory. If someone keeps getting things wrong, they shouldn't be invited back again and again as if they were still an "expert". Yet they are. If a politician does the opposite of what (s)he said (s)he would, that should be pointed out to the audience (everyone who is too busy to pay attention). Yet it rarely is. The longer-term consequences of every decision need to be pointed out, so that politicians don't choose the easy way of gaining votes among members of the public who cannot, or do not have the time to, think more than one step into the future. The media rarely does this anymore.

There's an interesting book by "Theodore Dalrymple" (a pen name) called "Life at the Bottom" about the British underclass (the mostly "white" working class), and about how poorly their chances at a better life are served. It matches what I saw in the North of the UK, and I find it very depressing. A friend of mine climbed his way out of that milieu despite all the barriers thrown in his way. He was able to do so only because he was a brilliant mathematician, and because his competence could be evaluated. Take that away, and he would have been a bricklayer, and that would have been a loss to science.

"TIL: criticizing a certain type of programmer is criticizing an entire generation or two."

I can't read your mind, but I can tell you that your rants are indistinguishable from the rants of somebody angry at the entire next generation. If it is indeed true that you are not, then it would be helpful if you took better care to communicate your points with more precision.

"you have no basis other than induction to say the kids will be ok"

That's a bit like claiming that I have no basis other than induction to say that the sun will rise tomorrow. It's technically true (while I also have other bases, induction is definitely the primary one), but it's an incredibly weak argument. Since your point is that the sun *won't* rise tomorrow, it seems to me that the onus is on you to provide better evidence.

"Another keystone is meritocracy. Many schools now say it's evil"

I feel like your representation of people's points is often so far removed from their actual points that I find it difficult to tell whether you are making good-faith arguments. I've never seen anyone say that "meritocracy is evil." I have, however, seen people say that the systems we have traditionally called meritocracies were, in fact, systems that primarily elevated white men, at the expense of everybody else, largely regardless of merit.

"Some of you might say I'm ranting again. If so, your definition of "rant" is whatever does not conform to your world view."

The only reason you would end your rant on this note is that you actually recognize that it is a rant, though :-)

"I really don't understand why you keep bringing up generations"

Because the arguments against this feature fall so perfectly into this pattern.

Honestly, some of these arguments seem ludicrous to me. The line of argumentation that goes from "contract law is a keystone of our civilization, this tool produces plagiarism, therefore it causes grave harm to our civilization" is so odd that I find it impossible to believe that you formed an opinion on this tool based on this line of thought. Each link in this chain of argument is wrong in obvious ways.

So it seems much more likely that the opinion was there first, and this argument is a result of confirmation bias, where the preconceived notion that the tool was bad lent validity to this argument, rather than the other way around.

(If I misrepresented the argument, I'm sorry. I did not intentionally do that, but I might have misunderstood the actual point.)

If the opinion was there before the argument, then the most plausible explanation for this is precisely this kind of generational moral panic.

"We *do* know that any effects are difficult to detect, despite extensive research, which means that they must be small."
"It doesn't mean that at all."

Yeah, it actually does. This is an area that has been extensively researched, for more than three decades now. For similar topics, we were able to detect even minute effects. Here, we are unable to detect any kind of long-term effect. Even researchers who believe that games have negative effects have only shown short-term aggressive behavior, which is indistinguishable from the kind of behavior people exhibit any time they are under stress.

This definitely means that, if a long-term effect exists, it is small.

It also definitely means that the people predicting all kinds of large-scale negative effects in the 90s and 00s were wrong. None of the things these people were afraid of actually happened.

The kinds of arguments people used to make about violent videogames ("the gravest assault on children since polio") are very reminiscent of some of the arguments people are making in this thread ("this will reduce innovation", "the result will be a further reduction in social mobility for those who were born bright but to the wrong parents").

The actual end result will be the same: things will be just fine.

The thing that will actually kill civilization will be global warming, and that's not the kids' fault. That's something to remember the next time anyone feels the need to complain about how Millennials learned the wrong things in school: if you're complaining about Millennials, you probably belong to the generation that *actually* ended the world.

Old Unix Geek

@Plume

You're projecting a lot... it's a kind of self-reaffirming blindfold. It won't serve you well in life. Been there, done that. I've found it more useful to try to falsify my own opinions.

I'm guessing you must be the sort of Millennial who puts everyone older than them into a box and thinks they are all the same and have nothing to teach. As such, it's easy to think all the older generations hate you. But that's a blindfold. If I regret that the education system is failing people, it's precisely because I think those people could do so much better. If I thought they were only irredeemable morons, there would be no reason whatsoever to care.

I also care that the only societies in history that have provided freedom to all, the societies that banned evils such as slavery, continue. Please learn about other civilizations' histories before you judge your own cultural river so harshly and ungratefully. And please go live in other cultures. It's easy to criticize when you live better than royalty did 300 years ago. That incredible level of wealth and technology was handed to you by your predecessors. One has to be incredibly blind not to see this. The question should not be resentment at the past, but what you can do to make the world better for yourself and those who will follow you. Education is the key to maintaining this culture. There is no guarantee that the values of Western civilization will survive. China is doing a great job of dismantling them in Hong Kong right now. I doubt you'd be happy there.

Climate change was known of but was purposefully hidden long before the Millennials were born. One cannot blame the purposefully disinformed for not knowing things. Had the extent of climate change not been hidden, I, for instance, would have made different choices. I should have done a PhD in nuclear fusion instead of AI.

A lot of things Millennials think are unique to them predate them. "Digital Natives"? Nope. Many older people were brought up with computers too. The hand-wringing about evil computer games? Nope, that dates back at least to the late 1970s. Millennials were, however, brought up with the internet, and that seems to have been somewhat of a poisoned chalice.

To rant one has to be angry. I'm not. If anything I'm depressed. It's all going sideways much faster than expected...
