
Look on the bright side: the software we're building is longer-lasting than the creative labours of cake decorators. :P

More seriously, the tools themselves are only part of it. An increasingly large component is *how* we use the tools, which takes in team dynamics, the politics and financial considerations of development projects, the sharing ecosystems, the market and so on. All this makes me laugh when I think back to the late 1980s, when business analysts were rubbing their hands in glee at the prospect of "Computer Aided Software Engineering" tools that would rid them of the need for all those "troublesome software developers".

Cheers

David

On Mon, 4 Jan 2021 at 09:57, Peter Reutemann <fracpete(a)waikato.ac.nz> wrote:
'Long-time programmer/researcher/former MIT research fellow Jonathan Edwards writes a blog called "Alarming Development: Dispatches from the User Liberation Front."
He began the new year by arguing that software "is eating the world. But progress in software technology itself largely stalled around 1996." Slashdot reader tonique summarizes Edwards' argument:
In 1996 there were "LISP, Algol, Basic, APL, Unix, C, Oracle, Smalltalk, Windows, C++, LabVIEW, HyperCard, Mathematica, Haskell, WWW, Python, Mosaic, Java, JavaScript, Ruby, Flash, Postgres". After that we're supposed to have achieved "IntelliJ, Eclipse, ASP, Spring, Rails, Scala, AWS, Clojure, Heroku, V8, Go, React, Docker, Kubernetes, Wasm".
Edwards' main thesis is that the Internet boom around 1996 caused this slowdown because programmers could get rich quick. Smart, ambitious people moved to Silicon Valley and founded startups, but you can't do research at a startup given the time and money constraints. Today only "megacorps" like Google, Facebook, Apple and Microsoft supposedly have the resources to do relevant research.
Computer science wouldn't help, either, because "most of our software technology was built in companies" and because computer science "strongly disincentivizes risky long-range research". Further, according to Edwards, the aversion to risk and the "hyper-professionalization of Computer Science" are part of a larger and worrisome trend throughout the whole field and all of Western civilisation.
Edwards' blog post argues that since 1996 "almost everything has been cleverly repackaging and re-engineering prior inventions. Or adding leaky layers to partially paper over problems below. Nothing is obsoleted, and the teetering stack grows ever higher..."
"[M]aybe I'm imagining things. Maybe the reason progress stopped in 1996 is that we invented everything. Maybe there are no more radical breakthroughs possible, and all that's left is to tinker around the edges. This is as good as it gets: a 50 year old OS, 30 year old text editors, and 25 year old languages.
"Bullshit. No technology has ever been permanent. We've just lost the will to improve."'
-- source: https://developers.slashdot.org/story/21/01/03/0010225
Cheers, Peter

--
Peter Reutemann
Dept. of Computer Science
University of Waikato, NZ
+64 (7) 577-5304
http://www.cms.waikato.ac.nz/~fracpete/
http://www.data-mining.co.nz/