@akk I'm curious what makes you think that.
I'm kind of on the fence - there definitely haven't been any new languages in that group for a pretty long time, and I'm not sure I've seen any real breakthrough ideas...
Though throughout our short history, as soon as it looks like language evolution is done, some new popular thing seems to come out of nowhere...
Career/Critical mass to me is like:
- supports a massive industry of freelancers: PHP
In that original toot, even though the release dates are close, there are essentially two generations of popularity:
1: Java, C#, PHP
Mostly, those generational shifts happened because of how much less work was required to accomplish the same goals (an additional layer of abstraction). Java and C# were that to the generation before them, in the same way that Fortran was to assembly.
The improvements in total cost of engineering available from languages are, for the time being, tapped out.
The only promising mechanisms for reducing total cost of engineering today are
(1) super-application level, like serverless architectures
(2) more and better use of off the shelf software (again, No Silver Bullet)
(3) maybe a breakthrough in type theory.
I'd argue that the success of Ruby and Python is due less to their language properties than to their very good standard libraries (Rails included) and their very good, very early package management (less so for Python, but still way better than e.g. C#).
The engineering benefits of these "languages" came from their package systems and strong open source communities.
If we take e.g. Hickey's and PG's writings at face value, the Lisps we have already developed represent the pinnacle of language power and expressiveness; I think there are good reasons to believe this is true.
We've already found that it is incredibly difficult to assemble a good enough team to wield those languages (e.g. Viaweb).
The real benefits of the ruby/python era have come from "off the shelf" composable parts (again, see No Silver Bullet).
to name a few, could all have been done in any other language previously - so what was it about the early 2000s and Ruby that caused that explosion?
I'm on the fence since maybe there is something about a language that frees people up to think new thoughts.
Culturally, Ruby's interest in developer ergonomics definitely plays a role. Steve Klabnik has spoken to this for Rust - when he joined Rust and brought some Rubyists with him, one of the things he thinks carried over was a focus on dev ergonomics. It just wasn't acceptable to, say, ship cryptic error messages when a more useful message could be generated.
@jamescgibson You definitely see that now, though I'm not sure how much that's just looking at the situation with hindsight and trying to come up with a guiding principle.
A lot of the Ruby stuff came from _why's ideas, and I don't think he was explicitly thinking about ergonomics so much as trying to entertain and challenge himself and others - Camping inspiring Sinatra, for example.
@scottwerner That's probably true, but maybe I just see "developer ergonomics" as the corporate speak for "making programmers happy". I think pursuing joy in programming has always been central, from what I've read of Matz.
Maybe "developer ergonomics" is just how you sell joy to your boss.
@jamescgibson ah yeah, MINSWAN and all that. You’re right, I guess it was a guiding principle from day 1. DHH probably points to that idea too.
@scottwerner The argument from the paper really lays it out - for software, there's fundamental and incidental complexity. Fundamental complexity cannot be abstracted away.
If you, the programmer, aren't spending >=90% of your time on incidental complexity, then even eliminating *all* incidental complexity will not yield an order-of-magnitude improvement.
I'd venture that if you're already using a modern system, like Rails, <90% of programmer time is spent on incidental complexity.
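Brooks's bound here is Amdahl's-law-style arithmetic. A minimal sketch of it (my own illustration; the function name and example fractions are assumptions, not from the paper):

```python
def max_speedup(incidental_fraction: float) -> float:
    """Upper bound on total speedup if ALL incidental complexity
    were eliminated and the fundamental work stayed untouched."""
    fundamental_fraction = 1.0 - incidental_fraction
    return 1.0 / fundamental_fraction

# Brooks's 10x threshold needs >= 90% of time on incidental work:
print(round(max_speedup(0.9), 6))  # 10.0
# At a 50/50 split, the ceiling is only 2x:
print(max_speedup(0.5))  # 2.0
```

Eliminating incidental work only shrinks the incidental share of total time, so the fundamental share sets a hard ceiling on any speedup.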
@scottwerner Now, to connect this to conversations we've had in the past - I think there *is* still an argument for redefining what is incidental complexity - preventing users from requesting that the gas and brake pedal be a different distance apart.
But that's changing the user's / customer's expectations, not building the same system faster - which is still a valid solution, just outside the scope of "No Silver Bullet"
@jamescgibson I guess I'll have to read the paper, but from the wikipedia summary:
"there is no single development, in either technology or management technique, which by itself promises even one order of magnitude [tenfold] improvement within a decade in productivity, in reliability, in simplicity."
The paper was written in 1986, and I feel like we've seen it proven wrong two decades in a row since, with those two generations from the original toot.
@scottwerner The paper is definitely worth a read. I'd venture that we have experienced one 10x improvement, but I'm not sure of two.
It of course depends on how "speed" is measured here as well.
@scottwerner Apparently he wrote some reflections in the newer editions of The Mythical Man-Month, so I guess I'll have to go track down a copy.
@scottwerner Actually, arguably, he was right: even if you count Java and C# as successes, Java only reached 1.0 in 1996, 10 years after the paper was published, and C# not until 2000.
The C2 Wiki discussion is good - at least one comment does list Java and DotNet as silver bullets.
@jamescgibson Ok, but if we're going to count Java as a success there, are we now debating whether the silver bullet arrives after 12 or 13 years instead of the original 10?
@scottwerner Yeah haha, sorry I was just being cheeky.
I'd agree between the two generations you mentioned we definitely have a 10x speedup. Not sure about 100x.
And, to be fair, I've always thought 10x was silly; a 2x improvement would be amazing - and I definitely agree the two generations you mention each brought 2x, as did cloud.
@scottwerner Still, today, looking at my work: I probably spend 60% of my time dealing with fundamental complexity, even if that fundamental complexity is only fundamental because of customer requirements. So I'm not hopeful for even another 2x, in the field I'm in now.
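Applying the same Amdahl-style ceiling to that 60% figure (a back-of-the-envelope sketch; the 0.6 split is just the estimate quoted above, not measured data):

```python
fundamental = 0.6  # estimated share of time spent on fundamental complexity

# Even if a new language erased the other 40% of time entirely,
# total time would only fall from 1.0 to 0.6:
ceiling = 1.0 / fundamental
print(round(ceiling, 2))  # 1.67 - short of even a 2x improvement
```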
For greenfield software, I think there are good opportunities for 2x speedups. Still skeptical about 10x.
@scottwerner To reconnect to the original tweet - I think any practice that can deliver 2x speedup has a good chance of becoming "mainstream" (for lack of a better term) and widespread.
I'm not sure any new language can do that for general problems.
@jamescgibson :) Yeah, though that's the question I have, I think - why would this time be different?
Have we really gotten rid of all the incidental complexity? I'm not convinced we have.
I'd be curious to see what you're working on and what makes it different from things I've worked on in the past.
@scottwerner School SIS (student information system) integrations - so lots of what were incidental decisions for the original author of the software we're integrating with become fundamental complexity for us.
Really, though, probably 50% of our team's time is spent on external communications and requirements gathering - no language is going to solve that.