Upgrade Your Mac With A Touchscreen, For Only A Dollar

Imagine how hard it could be to add a touch screen to a Mac laptop. You’re thinking expensive and difficult, right? How could [Anish] and his friends possibly manage to upgrade their Mac with a touchscreen for only a dollar? That just doesn’t seem possible.

The trick, of course, is in the software. A small mirror mounted over the machine’s webcam with stiff card, hot glue, and a door hinge points the camera down at the screen. By watching that view and deciding whether the image of a finger is touching its on-screen reflection, a remarkably simple touch screen can be created, and the promise of it only costing a dollar becomes a reality. We have to salute them for coming up with such an elegant solution.
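For the curious, the detection idea can be sketched in a few lines of Python with OpenCV. This isn’t [Anish]’s actual code, just a minimal illustration under some assumptions: the camera index, threshold, and blob-size cutoff are placeholders, and a real implementation would need calibration and filtering. The point is that a fingertip and its reflection show up as two dark blobs against the bright screen, and they merge into a single blob at the moment of contact.

```python
# Minimal sketch of the reflection-touch idea (not the project's actual code).
# Assumption: the mirror makes the webcam see the screen nearly edge-on, so a
# finger and its reflection approach each other and merge when they touch.
import cv2

cap = cv2.VideoCapture(0)  # placeholder camera index

while True:
    ok, frame = cap.read()
    if not ok:
        break

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Threshold so the dark finger (and its reflection) stands out against the bright screen.
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blobs = [c for c in contours if cv2.contourArea(c) > 500]  # ignore small noise

    if len(blobs) >= 2:
        state = "hover"  # finger and reflection still separate
    elif len(blobs) == 1:
        x, y, w, h = cv2.boundingRect(blobs[0])
        state = f"touch near x={x + w // 2}"  # merged blob: treat as a touch
    else:
        state = "no finger"

    cv2.putText(frame, state, (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("reflection touch sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```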

They have a video which we’ve put below the break, showing a few simple applications for their interface. Certainly a lot less bother than a more traditional conversion.

[wpvideo fnmWafzC]

77 thoughts on “Upgrade Your Mac With A Touchscreen, For Only A Dollar”

        1. Well, that’s really one word…

          But people accept fingerprints all over their phones and tablets (you just clean them periodically), so that’s not nearly as fatal an issue as the aforementioned gorilla arm syndrome.

          His Steveness talked about this many times. Apple has done a ton of research (repeating research done many times before dating back to the early 80s), and vertical touch screens, with a very few exceptions (basically kiosks where the maximum user interaction is measured in seconds or a handful of minutes), make baby Jesus cry.

        1. Everybody has used *horizontal* touchscreens for years – it’s how every smartphone and tablet works. But consider how those are used. Most people hold them at quite low angles relative to their eyes, and more often than not are touching the body of the device with some part of the same hand that isn’t doing the pointing.

          That stands in sharp contrast with the expectations of a touch-screen laptop. They’re used at quite high angles relative to the eyes and you’re expected to be able to make very, very fine motor movements to direct the pointing finger mainly with the muscles between your elbow and shoulder. Even typing that description in makes me feel dumb.

          1. When I worked on Chrome team, we got first-edition Pixels. I used it as a plain-old laptop for like a year before using any of the touchscreen functionality. Within a year of starting to use the touchscreen, I was surprised to realize how often I was touching my MacBook Pro’s screen, and even sometimes my desktop screen, before sheepishly realizing that it wasn’t going to work. When I left Google, I got a touchscreen Chromebook for myself (an Asus, Pixels are too spendy for me!), because I really do value it. I’m still a bit bemused by the entire thing, because I was very anti-touchscreen.

            It’s honestly not a problem. I find that there are touchscreen things, and there are trackpad things. Touchscreens suck for positioning your cursor, for instance. But they’re quite nice for moving a map around, or scrolling, and _sometimes_ for pushing buttons. I’d say I use the touch screen maybe 1/4 as often as the trackpad, and I’d be willing to pay an extra $100 or so for the feature.

            Now, the Pixel C, an Android tablet with a keyboard add-on, pissed me right off. I tried using it for about 15 minutes before I returned it to facilities, it frustrated me so much. And my experience with my wife’s iPad-with-Bluetooth clamshell wasn’t much better. Those rely too much on touch for my tastes, and the keyboard just made me keenly aware of how poorly my expectations were being met. My expectations run towards technical work, I could maybe see them working well for writing up blog posts or emails.

      1. Same reason why ALL ‘air’ or ‘free’ or ‘3d’ input devices the user has to hold up in midair have been commercial flops. It’s quickly tiring to hold your arm up and waggle it around for a while. For dedicated uses like manipulating a 3D model it’d work because it’s a singular purpose use, like sculpting some clay.

        A 3D or vertical 2D input device is just not practical for a 2D interface in full-time use; almost nobody has any problem translating horizontal 2D input device motion to a vertical 2D UI.

      2. It’s obvious you’ve never used a touchscreen laptop for any significant amount of time. “Gorilla arm” as an argument against is a fallacy. On a laptop, you wouldn’t use touch exclusively; you use it in conjunction with the keyboard and mouse/touchpad. A touchscreen is perfect for direct gross movements: scrolling, zooming, dragging, and selecting, and it’s much quicker and more productive to do so. So much so that it’s annoying to switch back to a laptop without it.

        1. That’s what trackpad gestures are for. It’s far more efficient to move your hand slightly down from the keyboard to manipulate the trackpad than to move it upwards, easily twice as far and in a different plane, to mess with the screen.

          1. Trackpad gestures correspond only indirectly to the action on screen: the user enters input on a different plane, in a position away from the screen, which imposes an additional cognitive and physical burden. Touchscreen gestures correspond directly to on-screen actions and are executed more quickly and precisely. Trackpad gestures only imitate direct screen manipulation; you must first move your hand away from the thing you are trying to manipulate and onto the trackpad, then perform the gesture at a non-1:1 scale while staying within the trackpad’s limits. Reaching up to manipulate the screen is more natural, quicker, and easier than moving away from it to manipulate the touchpad.

    1. Because they can’t design a screen coating that doesn’t disintegrate without being touched.

      My 2012 MBP is now on its 4th screen (at least Apple recognised they had an issue). The original and 1st replacement both started to lose their coating after about 1.5 years; the 2nd replacement had a bright spot, so that went straight back. The 3rd replacement’s been going 6 months so far…

      Quite good actually, for a 5.5-year-old laptop: the screen is 6 months old, the [battery, keyboard, touchpad and upper case] assembly is 1 year old. It’s just the bottom that’s falling apart; the feet aren’t glued on but somehow plastic-welded to the bottom panel, which doesn’t last forever.

        1. Sorry, but as the user I keep mine and everybody else’s fingers well away from the screen. The sleek design of the machine means that the screen comes into very slight contact with the keys when closed, such that you see outlines of the keys on the screen over time. Those are just grease marks from the keys, which can be wiped off. The disappearing coating in no way had a pattern matching the keys when it began, although over time the key marks did contribute.

          Oddly, I’ve barely seen it in the wild, despite asking my boss whether anybody else had reported this issue (possibly others don’t care – the number of other MBPs I see here at work that are absolutely covered in fingerprints means they probably wouldn’t notice or care if it did happen). Last week I was at a meeting and just happened to be sat next to someone whose screen, when I glanced at it, instantly showed they had this issue. That’s the only other time I’ve seen it…

          As for the replacement keyboard etc, I was hinting that when it’s time to replace the battery after 4 years, you get a ‘free’ keyboard/touchpad/upper chassis thrown in since the battery is well glued in to the upper chassis.

      1. Can you give a breakdown of the costs involved and risks taken? I think many people seriously underestimate what goes into releasing a proper product. A product that actually works in various conditions, not just a pet project with all kinds of quirks people just have to deal with.

        If there’s any doubt, I think I just have to point at Kickstarter to see how hard and costly it is to actually get something decent going.

  1. Was hoping I could refit my old PowerBook with a touch screen, but the touch screen requires a camera, and early PowerBooks (basically any 680×0 models) didn’t come with one.

      1. Even better! (I think.) Because the USB camera could be placed further back from the screen, allowing better definition of the touch location, as opposed to being limited to judging the “depth” of the touch (i.e. the distance the touch occurs from the mirror right next to the screen).

    1. You could find an Apple QuickTake camera, from 1994. But I don’t think it has enough resolution, and it only connects via the serial port. It’s kind of large to mount on the screen; you’d likely need a separate tripod. I’m not sure it works like a webcam, offering continuous updates; the only way to get pictures off it is via the serial port. And is the CPU good enough to interpret the pictures from the camera as well as do the main work?

      But it reminds me, didn’t Doug Engelbart do some work on using cameras for this sort of thing? It was around the same time he started playing with mice.

      Michael

  2. It would cost notebook manufacturers probably not much more than $1 to add a retractable mirror to the already built-in camera.
    And the software, given the quantities, would cost next to nothing.

    I wonder if someone will do it.

    But talking of touch screens, I have a HANNS-G HT271HPB monitor, 27″ with touch, and while it works very well with Android, I have to say that neither Windows 10 nor Linux is really ready for touch-only operation.

          1. nsayer: “Apple would not do it because vertical touchscreens are a terrible user experience.”

            They’d do a small modification, and use the webcam to detect keypresses on a totally flat, unresponsive keyboard, the better to make a macbook that’s thinner than anyone really wants.

          2. nsayer: “And they’re also a horrendous user experience. ”

            Exactly my point. Jony Ive’s fetish for thin and light shall not be stopped by mere concerns about usability.

    1. Windows 10 is amazing for touch… not so much the operating system as a touch-only interface (that is still clunky), but software that allows me to mark up drawings, take notes, and whiteboard remotely makes it way more powerful than anything I can do on my Android tablet (which has a pen as well).

    2. Double win.
      Since many people are paranoid about the camera on their laptop (you can buy stick-on shutters for them), said flap could be designed such that it can fully cover the camera, sit at an angle for touch input, or be fully open.
      Stops Apple from using the telescreen and completing your 1984 experience.

    3. You could even make the mirror function as a physical cover for the camera when not in use, to prevent any possible spy from using the camera against you. It just needs 3 positions: fully down (closed), 45° down with still no view of the user (touchscreen), and fully up, opening the camera for normal use.

  3. Anyone know if there’s a 3D-printable design available? Otherwise I’ll have to do it :D

    But this basically means that any modern laptop would have the same capability.

    1. But is there a real difference? I don’t know, but glass is glass.

      There were light pens decades ago; you had to hold the tip against the screen.

      HP had the HP-150 computer around 1984, using a touch screen instead of a mouse. I once dragged home an RGB monitor that maybe had a touchscreen; I think it used sensors around the bezel, so the CRT was “normal” unless they specified it for constant touching.

      Michael

      1. The first touchscreen I saw was mounted to an IBM PCjr. There’s a 1985 advert in PC Mag for the ‘Soft-Touch’ (it uses infrared in a bezel, hooked to a card inside the PC). “The user can program applications for the Soft-Touch in BASIC or assembly languages,” it says.

    1. THAT is the thing that came to my mind when I read the bit about “gorilla arm”. A touchscreen IS uncomfortable to use for long periods of time (I have one), but this would be useful and cheap to add to a magic mirror.

  4. Cool idea! Now I hope for a generalized open source version. I mean it looks like any computer screen or, more generally, any reflective flat surface can be used for touch input with the underlying method. Would be real nice to have a package that works with the Raspberry Pi camera.
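    For what it’s worth, one piece any such package would need, beyond the touch detection itself, is a calibration step that maps a point seen by the camera to screen coordinates. Below is a hedged sketch using OpenCV’s homography helpers; the corner positions are made-up placeholders you would gather once by tapping the four screen corners, not values from the project.

```python
# Rough sketch: map a touch point from camera coordinates to screen coordinates.
# The corner correspondences below are made-up placeholders; a real tool would
# collect them in a one-off calibration step (tap each screen corner once).
import cv2
import numpy as np

# Where the four screen corners appear in the camera image (placeholder pixels).
camera_corners = np.float32([[102, 210], [530, 198], [560, 470], [80, 455]])
# The same corners in screen coordinates, here for a 1920x1080 display.
screen_corners = np.float32([[0, 0], [1920, 0], [1920, 1080], [0, 1080]])

# Homography that warps camera coordinates into screen coordinates.
H = cv2.getPerspectiveTransform(camera_corners, screen_corners)

def camera_to_screen(point_xy):
    """Map one (x, y) point from camera space to screen space."""
    src = np.float32([[point_xy]])  # shape (1, 1, 2), as OpenCV expects
    dst = cv2.perspectiveTransform(src, H)
    return tuple(dst[0, 0])

# Example: a touch detected at camera pixel (300, 330).
print(camera_to_screen((300, 330)))
```

    The same mapping applies whether the frames come from a laptop webcam or a Raspberry Pi camera; only the capture step differs.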

    1. Apple has been doing touch screens on its devices for over ten years now.

      They just don’t do touch screens on their *laptops* (setting aside the touch bar – it’s a *horizontal* touch screen, after all) because, as I’ve mentioned above, it’s a stunningly bad idea.

    2. Apple customers have a “reverse price sensitivity”: Apple stuff has to be extremely overpriced, so the customers can feel more elitist in their golden cage. I don’t buy stuff with the rotting-fruit logo.

  5. Apple could create touchscreen Macs in a way that’d offer benefits without adding hassle and sell quite a bit more iPads to boot. All they need to do is come up with a way to add iOS apps that communicate with Mac apps, allowing what’s displayed and touched on an iPad to control a Mac app. Think of Mac sound, picture or video apps whose controls are on an iPad and adjusted by touch.
