Building Your Own Laptop

This article <http://www.computershopper.com/feature/build-your-own-laptop> originated several years ago, but says it was updated last year. I am pleasantly surprised to discover there are such things as “barebones” laptops, albeit they still come with somewhat more than you get with a “barebones” desktop box. While a range of such laptops is available from different vendors, certain components (Wi-Fi cards and MXM graphics cards) may not be so easy to get. The total price for the build is quoted as USD 1,352, while “comparable systems with GeForce 8600M graphics were typically running about $250 to $300 more than our build”. I wonder if that included the Windows licence. Also, regardless of the revised article date, it still mentions installing Windows Vista...

While it is good to see DIY laptops, the article was published on August 1, 2008 and updated on May 1, 2015. Glancing at the technology involved, it all appears to be from over 5 years ago. Recycling centres may sell you a laptop with, say, a broken screen and/or some smashed keys for $20 to $40. Inside it, you may get:
- 2 GB RAM
- 160 GB SATA 2.5 inch disk drive
- Mini PCI Express (PCIe) Wi-Fi card
- Removable CPU and GPU chips
- A good screen that can be used to replace a broken one
- A battery which might last 10 minutes
If you shop around recycling centres, then over time you may be able to buy three broken laptops for less than $100 in total and build one good laptop that matches the sort of specs described in this article. Note: this doesn't account for your labour, or your driving-around time and costs. If you then spend $100 on a new SATA SSD, that should help with performance, and you'll have a pretty good DIY laptop.

====

On a similar DIY theme...

What I've noticed in secondary school IT labs is 240 VAC desktop units and monitors connected to a main server by wired Ethernet. They require an air-conditioned room to dissipate the heat from the 30 desktop units, the teacher's server and the students themselves. This configuration involves high costs in initial classroom re-fit and wiring installation, high cost of initial IT hardware, high power consumption to run the lab, and (in theory) the high cost of the proprietary operating system and software products.

My DIY for an IT lab would be along these lines:
- 28 x Raspberry Pi 3s, which run on +5 V.
- 28 x LCD monitors that run on +12 V, each with an HDMI input and cable. The Pis are bolted onto the bottom of the monitors so the students can't take them home.
- 4 x rows of desks, each seating 7 students at their Raspberry Pi and monitor.
- 4 x DC power supplies salvaged from old desktop computers, one per row. For each row, run +5 V and +12 V to the 7 Pis and 7 monitors in that row.
- For each row, a wireless Ethernet router for that row's Pis to connect to. Each router has a 1 Gb/s Ethernet cable back to a LAN card in the teacher's server.
- 28 x wireless keyboards and mice, kept in a lockable cupboard; the students get them out and put them away each lesson.
- 28 x breadboard kits with resistors, LEDs, etc., kept in a lockable cupboard and handed out to plug into the Pis when teaching GPIO port programming and the like.
- Each student is responsible for buying and bringing to class their own micro-SD card containing Raspbian and their data (>$10 for 16 GB).
- Each student is responsible for buying and bringing to class their own earphones (>$10, but they have probably already got a set for their mobile phone).

The teacher's server would have 5 x 1 Gb/s LAN ports: one for each wireless router and one for the external connection to the school's LAN / Internet.

If you allow 5 W for the Pi and 20 W for the monitor, you get 25 W of power consumption per student, which is 700 W for the classroom (excluding the teacher's server); a quick sketch of that arithmetic follows this message. This may be less than the power consumption of the lighting for the classroom, unless they have already switched to LED lighting. Air-conditioning of the room may turn out to be unnecessary.

...Anyone got any design ideas and enhancements?

cheers, Ian.
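A minimal back-of-the-envelope sketch of that power budget, written as a few lines of Python (the wattage figures are the assumptions quoted above, not measurements):

    # Rough power budget for the proposed Raspberry Pi lab.
    RPI_WATTS = 5        # assumed draw for a Raspberry Pi 3 under load
    MONITOR_WATTS = 20   # assumed draw for a +12 V LCD monitor
    STUDENTS = 28

    per_student = RPI_WATTS + MONITOR_WATTS   # 25 W per student
    classroom = per_student * STUDENTS        # 700 W, excluding the teacher's server
    print(f"{per_student} W per student, {classroom} W for the classroom")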

On Sat, 10 Sep 2016 06:25:18 +0000, Ian Stewart wrote:
If you shop around recycling centres, then over time you may be able to buy three broken laptops for less than $100 in total and build one good laptop that matches the sort of specs described in this article.
Good idea. Have you thought of offering such a service for those of us who are not so ... hardwarily skilled? :)
This configuration involves high costs in initial classroom re-fit and wiring installation, high cost of initial IT hardware, high power consumption to run the lab, and (in theory) the high cost of the proprietary operating system and software products.
You can guess the reason for this: all the software they know is Windows-based. Surely they would be using a lot of web-based stuff by now? In which case, the OS running on the client machine should matter less.

Have you thought of offering such a service for those of us who are not so ... hardwarily skilled? :)
This DIY approach is purely a hobby where you practise your electronics and software skills while restoring laptops. You have to write off your labour, travel, power and other miscellaneous costs. You can't make a livable wage out of doing this, unless you've worked out a way of living on less than $1 an hour in NZ. Hamilton has had a few companies try to make a business model out of restoring and selling second-hand laptops, and they seem to fail, or they survive by having other product sales, like mobile phones, or by also running the laundromat in the shop next door.
You can guess the reason for this: all the software they know is Windows-based.
If the idea of a "computer lab" at a secondary school is to teach what is currently in the Digital Technologies NCEA curriculum, then from my review of the curriculum all the operating system and application software can be free open source. ...but as you state, if all they know is Windows-based applications, then they won't be aware of this.

The current "computer lab" mentality is along the lines of raised floor tiles, the hum of CPU fans, air-conditioning units blasting away, looms of Ethernet wires, with Intel and Windows and proprietary applications. My draft proposal for a computer lab is normal rooms, no air-con, and ARM/Raspbian technology with open source applications.

In the future a "computer lab" will just be a room with a few chairs. The students turn up and pull out their mobile phones. All the IT applications they need to learn to pass the NCEA Digital Technologies subjects will be able to be done on their phone. Their homework will be sent to the teacher's Instagram account. They'll complete three years of NCEA IT studies and never know what operating system their phone uses. ...and I'd say this is only a couple of years away!
Surely they would be using a lot of web-based stuff by now? In which case, the OS running on the client machine should matter less.
Yep. All you'll need is a web browser on your mobile phone. You'll be able to enroll to learn about IT at the "School-of-never-typing-anything-at-the-command-line-prompt".

A few years from now and Computer Science graduates won't have seen a console terminal window let alone typed a command on one. Imagine the complexity of getting a computer science student to understand why they would type "ls -l" at a "$" prompt and then the effort required in trying to make sense of the text that comes back at them on the screen. What's this "drwxrwxr-x" crap? They'll have a lot better things to do than try and understand this archaic cryptic console terminal rubbish. This will only be taught to history students who want to graduate and become the custodian of a museum that houses historical computer technology.

...well, I think I'll stop before I go too far off topic. I was supposed to be talking about the merits of ARM/Raspbian-based computer labs. Let me know if you've got any ideas. I think it would be fun to build one, so maybe there's a school out there that needs one.

cheers, Ian.
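For what it's worth, that "drwxrwxr-x" string is just the file's mode bits printed as ten characters: the entry type, then read/write/execute triples for owner, group and other. A short Python sketch (using only the standard os and stat modules) reproduces the same column that "ls -l" prints:

    import os
    import stat

    # Print the "drwxrwxr-x"-style mode string for each entry in the
    # current directory - the same permissions column that "ls -l" shows.
    for name in sorted(os.listdir(".")):
        mode = os.lstat(name).st_mode
        print(stat.filemode(mode), name)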

On Sat, Sep 10, 2016 at 10:31:27PM +0000, Ian Stewart wrote:
A few years from now and Computer Science graduates won't have seen a console terminal window let alone typed a command on one. Imagine the complexity of getting a computer science student to understand why they would type "ls -l" at a "$" prompt [...]
Well, I can assure you that our Engineering graduates (Uni Waikato) are required to learn scripting and the bash prompt in their first-year computing studies. So even if the computer science graduates can't (and, to be honest, I am still bumping into many computer science students who can), the engineering students can.

Cheers, Michael.

On Sun, Sep 11, 2016 at 11:47 AM, Michael Cree <mcree(a)orcon.net.nz> wrote:
On Sat, Sep 10, 2016 at 10:31:27PM +0000, Ian Stewart wrote:
A few years from now and Computer Science graduates won't have seen a console terminal window let alone typed a command on one. Imagine the complexity of getting a computer science student to understand why they would type "ls -l" at a "$" prompt [...]
Well, I can assure you that our Engineering graduates (Uni Waikato) are required to learn scripting and the bash prompt in their first-year computing studies. So even if the computer science graduates can't (and, to be honest, I am still bumping into many computer science students who can), the engineering students can.
Yes, ENG182 is very cool. Nice that it teaches bash and Python. Shame about the C#. Having to catch up on the C# they missed out on for future C# papers, with no carry-over of Python, is something to work on. More Python-savvy lab assistants/tutors would help too - though I've heard of students stepping up to this, which is cool: peer tutoring/learning. A sysadmin course similar to what they offer at Otago Polytechnic would be cool.

Cheers, William

On Sat, 10 Sep 2016 22:31:27 +0000, Ian Stewart wrote:
A few years from now and Computer Science graduates won't have seen a console terminal window let alone typed a command on one. Imagine the complexity of getting a computer science student to understand why they would type "ls -l" at a "$" prompt and then the effort required in trying to make sense of the text that comes back at them on the screen. What's this "drwxrwxr-x" crap? They'll have a lot better things to do than try and understand this archaic cryptic console terminal rubbish.
That has largely already happened. We have had an entire generation--perhaps two entire generations--brought up with the idea that the only way to use a computer is via a GUI, that a CLI is something archaic and unfashionable--even repulsive. But remember what a computer is good for: it is good for doing tedious, repetitive tasks that humans are lousy at. But the only way to automate such tasks is via commands and scripts. A GUI only lets you perform those tasks that were programmed into the GUI; it doesn’t provide ways to synthesize those built-in primitive tasks into more complex sequences. Instead, it is the human that has to manually perform those sequences, over and over. In short, a CLI puts the human in charge of the computer. A GUI is a great way to put computers in charge of humans.
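A concrete, purely illustrative example of the kind of task this is about: renaming a folder full of files in a GUI file manager means one manual rename per file, while a short script does the whole lot in one pass. A minimal Python sketch, with made-up file names:

    import os
    import re

    # Rename "report-10-9-2016.txt" style names into the sortable
    # "2016-09-10-report.txt" form, for every matching file in the directory.
    PATTERN = re.compile(r"report-(\d{1,2})-(\d{1,2})-(\d{4})\.txt")

    for name in os.listdir("."):
        match = PATTERN.fullmatch(name)
        if match:
            day, month, year = match.groups()
            os.rename(name, f"{year}-{int(month):02d}-{int(day):02d}-report.txt")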

A few years from now and Computer Science graduates won't have seen a console terminal window let alone typed a command on one. Imagine the complexity of getting a computer science student to understand why they would type "ls -l" at a "$" prompt and then the effort required in trying to make sense of the text that comes back at them on the screen. What's this "drwxrwxr-x" crap? They'll have a lot better things to do than try and understand this archaic cryptic console terminal rubbish.
That has largely already happened. We have had an entire generation--perhaps two entire generations--brought up with the idea that the only way to use a computer is via a GUI, that a CLI is something archaic and unfashionable--even repulsive.
But remember what a computer is good for: it is good for doing tedious, repetitive tasks that humans are lousy at. But the only way to automate such tasks is via commands and scripts. A GUI only lets you perform those tasks that were programmed into the GUI; it doesn’t provide ways to synthesize those built-in primitive tasks into more complex sequences. Instead, it is the human that has to manually perform those sequences, over and over.
In short, a CLI puts the human in charge of the computer. A GUI is a great way to put computers in charge of humans.
I do disagree a bit. Workflow systems are (often) graphical user interfaces that allow you to define complex (and repetitive) operations. These systems follow the unix philosophy of command-line utilities, by having lots of reusable components (aka operators) that do more or less one thing (but do it well). These operators you can then slot together into complex programs by only having to parametrize them. Once designed, they don't have to run through the user interface, you can have them running as background processes.

Cheers, Peter

--
Peter Reutemann
Dept. of Computer Science, University of Waikato, NZ
+64 (7) 858-5174
http://www.cms.waikato.ac.nz/~fracpete/
http://www.data-mining.co.nz/
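To make the operator idea concrete in the abstract (this is plain Python, not ADAMS's actual API - just a sketch of the shape of such a system): each operator does one small thing, a flow is operators slotted together and parametrised, and the resulting flow can run headless with no GUI involved:

    from functools import reduce

    # Tiny "operators": each does more or less one thing.
    def read_lines(path):
        with open(path) as f:
            return [line.rstrip("\n") for line in f]

    def keep_containing(pattern):
        # Parametrised operator: returns a callable configured with `pattern`.
        return lambda lines: [line for line in lines if pattern in line]

    def count(lines):
        return len(lines)

    def flow(*operators):
        # Slot operators together into a single pipeline.
        return lambda data: reduce(lambda acc, op: op(acc), operators, data)

    # Example flow: count the lines mentioning "ERROR" in a (hypothetical) log file.
    count_errors = flow(read_lines, keep_containing("ERROR"), count)
    print(count_errors("app.log"))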

On Sun, 11 Sep 2016 12:21:41 +1200, Peter Reutemann wrote:
But remember what a computer is good for: it is good for doing tedious, repetitive tasks that humans are lousy at. But the only way to automate such tasks is via commands and scripts.
I do disagree a bit. Workflow systems are (often) graphical user interfaces that allow you to define complex (and repetitive) operations. These systems follow the unix philosophy of command-line utilities, by having lots of reusable components (aka operators) that do more or less one thing (but do it well). These operators you can then slot together into complex programs by only having to parametrize them. Once designed, they don't have to run through the user interface, you can have them running as background processes.
Sure. This is the old “graphical programming” idea that seems to be rediscovered every few years. The earliest example I can remember is Prograph on the Macintosh in the early 1990s--but then I think National Instruments’ LabVIEW (for laboratory instrument control) might have been earlier. Currently on various pieces of 3D software (including Blender) you have “nodes” for graphically defining intricate compositing processes and material characteristics. And there’s MIT’s Scratch and Google’s Blockly. But they have severe limits, as you discover when you try to use them for anything complex. How do you debug the thing? How do you do a diff? You made a change last week, and this week you discover that something has broken; can you figure out what you did to introduce the regression? How do you collaborate among multiple contributors? In other words, they work best in application areas where they don’t have to be full programming languages. Which brings us back to the automation issue: you still need text-based commands and scripts to be able to automate repetitive tasks.

But remember what a computer is good for: it is good for doing tedious, repetitive tasks that humans are lousy at. But the only way to automate such tasks is via commands and scripts.
I do disagree a bit. Workflow systems are (often) graphical user interfaces that allow you to define complex (and repetitive) operations. These systems follow the unix philosophy of command-line utilities, by having lots of reusable components (aka operators) that do more or less one thing (but do it well). These operators you can then slot together into complex programs by only having to parametrize them. Once designed, they don't have to run through the user interface, you can have them running as background processes.
Sure. This is the old “graphical programming” idea that seems to be rediscovered every few years. The earliest example I can remember is Prograph on the Macintosh in the early 1990s--but then I think National Instruments’ LabVIEW (for laboratory instrument control) might have been earlier. Currently on various pieces of 3D software (including Blender) you have “nodes” for graphically defining intricate compositing processes and material characteristics. And there’s MIT’s Scratch and Google’s Blockly.
My comments below apply to ADAMS, the workflow system that I developed and which we use for data processing and predictive modeling in production environments.
But they have severe limits, as you discover when you try to use them for anything complex. How do you debug the thing?
ADAMS allows you to debug a flow, either by setting breakpoints and/or by stepping through it operator by operator. It also allows you to inspect variables and other things.
How do you do a diff?
Simply diff the text file that describes the workflow.
You made a change last week, and this week you discover that something has broken; can you figure out what you did to introduce the regression?
Source code and workflows are under version control, and there are unit tests for operators/workflows. Mind you, updates to the operating system can break your shell scripts too, with options deprecated, output formats changed, etc. They aren't safe from breakage either.
How do you collaborate among multiple contributors?
Version control.
In other words, they work best in application areas where they don’t have to be full programming languages.
They can be. ADAMS allows you to add scripted operators using Jython or Groovy, which is useful for initial prototyping or one-offs.
Which brings us back to the automation issue: you still need text-based commands and scripts to be able to automate repetitive tasks.
They all have their place, something I never discounted. Cheers, Peter

On Sun, 11 Sep 2016 14:51:30 +1200, Peter Reutemann wrote:
How do you do a diff?
Simply diff the text file that describes the workflow.
Need I say more?
From within ADAMS, not the command-line. :-)
That’s great. Can you spawn “diff -u” and use that as input to patch? How is the two-way fidelity? If you were to automatically generate these text files from some other source and then open them in the GUI, is it liable to make spurious changes even if you don’t change anything?
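For reference, the round trip being asked about looks roughly like this from a script (file names are hypothetical): produce a unified diff of two saved versions of the workflow file, then feed that diff straight into patch on standard input:

    import subprocess

    # Unified diff between two (hypothetical) saved versions of a flow file.
    diff = subprocess.run(
        ["diff", "-u", "flow_old.txt", "flow_new.txt"],
        capture_output=True, text=True,
    )
    if diff.returncode > 1:  # diff exits 1 when the files differ, >1 on error
        raise RuntimeError(diff.stderr)

    # Apply the diff to another copy of the old file; patch reads it on stdin.
    subprocess.run(["patch", "flow_copy.txt"], input=diff.stdout,
                   text=True, check=True)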

On Sat, Sep 10, 2016 at 6:25 PM, Ian Stewart <ianstewart56(a)hotmail.com> wrote:
On a similar DIY theme...

What I've noticed in secondary school IT labs is 240 VAC desktop units and monitors connected to a main server by wired Ethernet. They require an air-conditioned room to dissipate the heat from the 30 desktop units, the teacher's server and the students themselves. This configuration involves high costs in initial classroom re-fit and wiring installation, high cost of initial IT hardware, high power consumption to run the lab, and (in theory) the high cost of the proprietary operating system and software products.

My DIY for an IT lab would be along these lines:
- 28 x Raspberry Pi 3s, which run on +5 V.
- 28 x LCD monitors that run on +12 V, each with an HDMI input and cable. The Pis are bolted onto the bottom of the monitors so the students can't take them home.
- 4 x rows of desks, each seating 7 students at their Raspberry Pi and monitor.
- 4 x DC power supplies salvaged from old desktop computers, one per row. For each row, run +5 V and +12 V to the 7 Pis and 7 monitors in that row.
- For each row, a wireless Ethernet router for that row's Pis to connect to. Each router has a 1 Gb/s Ethernet cable back to a LAN card in the teacher's server.
- 28 x wireless keyboards and mice, kept in a lockable cupboard; the students get them out and put them away each lesson.
- 28 x breadboard kits with resistors, LEDs, etc., kept in a lockable cupboard and handed out to plug into the Pis when teaching GPIO port programming and the like.
- Each student is responsible for buying and bringing to class their own micro-SD card containing Raspbian and their data (>$10 for 16 GB).
- Each student is responsible for buying and bringing to class their own earphones (>$10, but they have probably already got a set for their mobile phone).

The teacher's server would have 5 x 1 Gb/s LAN ports: one for each wireless router and one for the external connection to the school's LAN / Internet.

If you allow 5 W for the Pi and 20 W for the monitor, you get 25 W of power consumption per student, which is 700 W for the classroom (excluding the teacher's server). This may be less than the power consumption of the lighting for the classroom, unless they have already switched to LED lighting. Air-conditioning of the room may turn out to be unnecessary.

...Anyone got any design ideas and enhancements?
I would add the Raspberry Pi 7-inch touch screen: https://www.raspberrypi.org/products/raspberry-pi-touch-display/ It costs around 140 dollars, so it increases the price of the setup. Use power banks (super cheap from China); then no power is needed in the classroom. New schools (RJHS) don't seem to have Ethernet outlets - it's Chromebooks everywhere (with a few MacBook Airs) and Chrome desktops scattered about.

Cheers, William.