Apple is a victim of their own success. It's a dilemma really. Their culture vs public interest.
I am sure they are exploring new spaces like VR, AR, cars, etc but they haven't finished a decent product yet so they don't announce it. They are the opposite of Google, where they announce every new project only to kill it later (Ara, Fiber, Glass, etc).
They don't really like to announce a product before it's ready. That comes from their hardware background: they need to nail the product in the first iteration or it will fail. "Move fast and break things" doesn't apply to them.
On Apple's culture: that ship has sailed. Tim doesn't have the gravitas to ensure a new 'Wow!' product is going to ship. He's a damn good manager, but not a superb innovator. Unfortunately, Apple is, at its heart, a 'Wow!' company. I suppose Apple has 'topped out' now and reached a mature business state. This is a good thing though! More room for innovation!
And GOOG-411 (didn't have a smartphone yet when they killed this, it was still useful), Google Labs (public access to experimental features in Gmail and other services), Buzz (jk nobody misses that), Reader, Code Search, Q&A, iGoogle portals, and plenty of others.
Didn't know that Google Labs was shut down. Some of the Gmail Labs were useful - and some have probably gone mainstream by now. The one that reminds you if you use a word like "attached" in your email, but have no attachment, comes to mind. I tried at least half a dozen of them, may have only kept using one or two. Still, the Labs idea was good.
"Wow!" products usually don't happen overnight though. Take the iPhone for example! Apple was playing around with ideas for tablets, handhelds, PDAs, and other stuff for decades before they launched that. A lot of "what if" prototypes were made and dead end projects ensued.
Google kinda had the right idea with 20% time, I think. To produce truly groundbreaking ideas, you have to have a little bit of margin for playing with ideas. If you make your entire business about trimming fat, finding synergies, and maximizing efficiency - squeezing it out of everyone involved - you're no longer an innovator, you're just an assembly line.
Whether Apple pursues the hardware route remains to be seen. They have only changed their strategy, probably to develop the core self-driving technology first and worry about the design of the car later.
Because it's a MacBook; with batteries in it. The value is in the mobility. If you don't need the batteries and mobility, you could buy a Mac Mini, iMac or Mac Pro.
In 2016, "buy a desktop" is no longer the only answer when talking about professional work. Like--check out what Sager offers. You can buy a fully kitted-out 17" notebook from them that meets or exceeds the top-end iMac in every respect (equal CPU, twice the RAM, significantly better GPU) except pixel density (a mere 170 versus 216) for a thousand dollars less than the top-end iMac--and you can carry it with you.
It's not uber-thin and light, no; this is a literal desktop replacement and weighs twelve pounds. It's not the machine I'd buy, because I don't need a GTX 1080 in a portable machine--but that machine is equivalent to Apple's best, actually-serious desktop while being cheaper and portable! It goes without saying that you can scale down to something that's a little more reasonable for a physical-effort-averse nerdy type and still get something that's very favorable compared to Apple's offerings, both in terms of perf and price.
Apple missed the memo: you really can have both, these days. Maybe they don't care about that audience anymore, and that's their prerogative, but it's why I'm probably bouncing back to Windows/Linux. (Dual-booting. Ew.)
But they didn't upgrade the Mini, iMac, or Mac Pro. The Mac Pro has been the current model for at least two years. The iMac is a hack that uses two logical display panels to produce the 5K resolution, because there isn't enough bandwidth for a single one. The Mini is nearing the 3 generations old mark.
Why even mention the way the iMac display works? Using the machine you would never know, and it's a very nice computer. The complaints about the Mini and Pro are legit, but there is barely anything to complain about with the iMac.
It will matter once Apple decides to drop support for the current iMac. (Who knows, maybe the ARM switch will eventually happen? The G5 iMacs became obsolete overnight when the Intel switch happened.)
As it stands, you can neither recycle it as an external display nor can you run Linux on it at full resolution.
Not really a "pro" complaint - just something to keep in mind if you plan to hand it on.
>> Because it's a MacBook; with batteries in it. The value is in the mobility.
For some people. But you can also look at a laptop as being the ultimate "all-in-one" PC.
The only cable you need to plug in is the power cable. For a lot of people, that's more convenient than having to deal with a monitor, keyboard, mouse and desktop with the associated mess of cables and added desk space. That a laptop is portable and runs on a battery is gravy.
Yes. I bought the same model to replace my mid-2011 Macbook Air. Coming from an ancient one, this Macbook Pro is a revelation.
Your experience echoes mine. I love the new keyboard very much; no problem typing on it. It even makes a pleasing sound when I type, it invokes ASMR. It doesn't feel too heavy; I can carry it around in one hand just fine. The build quality is amazing. Performance is good. From fully charged I can use it all day without plugging in. I am very much satisfied.
> Is your premise correct though? Will you need to move your head up and down? Won't the touch bar be in your field of vision?
Considering zero part of my keyboard is in my field of vision I'd say yes I'd have to move my head up and down. I haven't had a chance to play with the new MacBook Pro yet but it seems like a reasonable assumption.
> Even if your premise is true, is it better to move your head up and down or to move your hands forward and backward to reach the touch screen?
They're actually two different use cases here. A touch screen helps with touch-style controls, so precision zooming and scrolling is kinda awesome on one (though the case for scrolling is weaker given the nice touchpad). The touch bar is meant to be more of a utility, like a row of function keys.
I don't think it's entirely fair to directly compare the two. For instance, moving it to the bottom of the screen instead of the top of the keyboard would turn it from easy-to-access touch keys into something more informational. At least in my opinion.
> Considering zero part of my keyboard is in my field of vision
Zero? Is that even possible?
When using a laptop, it's impossible to look at the display without being able to also look at the keyboard (and touchbar) by just moving your _eyes_. You don't have to move your head.
That's probably exactly how the argument went at Apple. However, doing dev without any external screens or keyboard is the exception, not the norm. If you are already bending your neck like that day to day, then sure, you have bigger problems.
I don't look at my keyboard when typing so other than the slight realization that it is there on a laptop I never look at it (but yes it would technically be in my field of vision in this case but that misses the point). However on a desktop setup / docked laptop the keyboard isn't anywhere near where I could see it without significantly moving my head down.
Regardless, discussing field of vision is more of a red herring in this debate; it's really about the eye movement and context switching. I should have chosen different wording.
> When using a laptop, it's impossible to look at the display without being able to also look at the keyboard (and touchbar) by just moving your _eyes_. You don't have to move your head.
I think you're assuming the physical act of moving one's head is more disruptive than moving one's eyes. It's actually about the same when you're talking UX, assuming the head is moving to change the field of vision. This is something you keep track of, and many times head movement will be removed or ignored from testing data when you're tracking where the user is looking, because you can move your head and still keep your eyes in one place.
Again you said: "Considering zero part of my keyboard is in my field of vision I'd say yes I'd have to _move_ my head up and down."
I hope you understand that you move your eyes when you look at different things on your display. The touchbar is just another display that is only slightly lower than the lowest part of your display. You don't need to move your head to see that, only your eyes. The same eyes you move when you look at different things on the main display.
> I hope you understand that you move your eyes when you look at different things on your display.
But of course. That was a huge part of my point. Moving your eyes off the display is a _big move_. It's not a typical UX pattern. These things are very well tracked in user testing with eye tracking. The idea is to incrementally move their eyes and never require large movements else you end up having UX issues.
Judging from the downvotes I'm guessing many haven't looked into or used eye tracking during user testing. It's quite interesting, you should check it out!
I think the touch bar is going to be one of those things people will just have to try, and it might turn out to be an awesome addition to the input tools at hand. Or it might be one of those things that sounded good in theory but is just a pain in practice.
One thing to consider is that things like editing video are already mouse centric, and the keyboard shortcuts augment the interface. You don't leave your hands in home position.
If you're working on the laptop, using the trackpad, you're already in the mode where you are going back and forth, and your hands need to move around.
When using a mouse, trackball, etc, you can get into workflows based on having one hand operating the mouse, while the other does the keyboard shortcuts. So there could be very interesting uses with two hands.
What makes it not so compelling to me is that I do serious work with my laptop closed and a big monitor (I'm a music producer/audio engineer).
Media authoring applications need to get a lot of information on the screen at once, if possible, which requires a relatively rigid focus on the screen. Having my head hunched over a little laptop display is not going to work for extended sessions.
The problem of creating physical interfaces that work well with the screen relates to the abstraction that is involved in GUIs. You have to keep steady focus on the screen, keeping track of the mouse, and what you are doing. You enter kind of a trance, pretty quickly, where you have accepted a two dimensional representation of a complex structure, and are operating within its framework.
This divorces the metaphorical actions of the user in the GUI, which is what we care about, from the physical actions of the body.
For a new user interface to work, it will have to be incorporated into this metaphorical framework the user operates in.
If it deviates too much from the abstractions of the interface, it breaks the trance required to get into the flow, and will simply not be used.
I suspect one of the issues will be if users can actually quickly look down to see what they are touching, or perceive enough with peripheral vision, without losing track of where they are on the screen. It's a reasonable location for the touch bar. As the user gets used to where controls on the touch bar are located, the visual cues might be enough to consistently target the virtual control.
It could also be really helpful for keyboard commands that have a lot of modifiers, and which you don't use much. They can be real handy at times, but the trouble of looking up the key binding in the heat of the moment prevents the habit from getting built up.
Apple's HIG[1] on the Touch bar is actually great. You can sort of understand the reason Apple put it on the Macbook. You can at least envision two interesting things:
- Touch UI for scrubbing/scrolling contents faster.
- Dynamic shortcuts for the most used commands that were previously only accessible via the keyboard.
I thought this whilst watching Apple's event. All of the functionality they demonstrated on the Touch Bar basically duplicated functionality already in each app via a shortcut. E.g. Safari: you can open a new tab! (cmd-T). There's an address bar! (cmd-L). You can go to each tab! (cmd-number, e.g. cmd-1).
And all of the gestures were what the excellent touchpad was built to do anyway. Scrubbing? Gestures will do that.
It's like they put two giant touch interfaces on the device so that they could fight for attention, and the poor developer has to write two handlers for the two event sources, whilst it does the same thing at the end of it.
I haven't actually tried it yet, but by watching the launch demo and several screenshots of touch bar on app like Photos or Final Cut Pro I can see that:
- The Touch Bar is probably better for scrubbing/scrolling because you can actually see the zoomed-out content, like the entire timeline of the movie in FCP.
- I haven't tried Boastr. I am sure it's useful and faster for pros, but at first look it's pretty complicated. It's not built into the Mac and not easily accessible, especially for new users.
I think there's something to what you're saying here. That's often been an argument for the mouse/pointer vs touch screen as well. How is this different from using a touch screen in general? Or are you saying that's problematic as well? Having an additional Touch Bar to effectively increase screen real estate by removing the necessity of dedicating screen space to the slider control seems like a win.
I don't do video editing or anything like that. My experience with a slider is pretty much limited to Netflix and such on my tablet :) Seems to work pretty well there. I can see how it might not work as well for fine-grained work. But that's an argument for touch screens in general, isn't it? Or do you think there's a distinction between Touch Bar and touch screen?
Caveat: On some video sliders, you can move vertically as well to increase/dilate the resolution of the horizontal movement. You won't have the same room with the touch screen. A couple options I can think of: use the track pad or a key press to modify the Touch Bar tracking; or perhaps track the speed of movement on the Touch Bar to modify the resolution, e.g., slower movement, higher resolution.
No, sir. In fact you can still see both the current content on the Macbook screen and the entire timeline on the Touch Bar. On the trackpad, you need to actually move the pointer to the content before you can scrub, and it's not really that fast. And if you manipulate directly on the screen, won't it block the view?
IMHO, as far as I understand it, Touch Bar is a bridge that connects the static nature of the keyboard with the dynamic nature of the screen. It is not intended to replace keyboard nor the screen. It is there to augment and extend them.
Touch bar is perfect for this job, considering:
- You can't beat the fast input from fixed keyboard + muscle memory.
- Touch screen on a laptop is really not ergonomic, your arm will get tired fast.
I always enjoy installing and configuring complicated 3rd party software! This BetterTouchTool is way better than any out of the box "it just works" software...
O' how I despise the new trackpads that aren't merely trackpad pointing devices, but also buttons, scrollbars and other bullshit I never wanted.
Christ. I need to move the fucking mouse cursor. And, shit on me, I did not intend on fucking clicking or swiping anything, or an other absurdly tangential but obliquely imaginable bullshit, God damn it.
The worst is when you have to press the trackpad to click the button, and as your fleshy fingers flatten out against the trackpad, it registers the pressure center moving slightly, and so the pointer moves, and you click the wrong God damned thing because the trackpad IS the button.
It was better when the buttons were separate plastic divisions disconnected from the motion tracking area.
Have you tried Apple's recent trackpads with the Force Touch sensor and haptic feedback?
In my experience, the fact that the trackpad no longer physically moves when you press down on it to click really cuts down on the problem of the cursor moving during the click, as does the greatly improved uniformity of the force required. Being able to adjust the force required to click is also quite nice.
I was never quite satisfied with Apple's hinged trackpads (or the standalone bluetooth trackpad with the click mechanism in the rubber feet) as a replacement for the physical button despite the trackpad size increase it enabled, but the latest generation strikes me as almost perfect.
The public outcry is because Apple removed the ESC key for no reason (and positioned it one row above on the touch bar). And many are angry that the MacBook Pro now costs even more and has 2013-style hardware in 2016. If it were called MacBook Air and cost half as much, there would be no outcry. A Pro-level notebook that is now way too thin for high-end hardware, too thin for common connectors, and too slow for high-end application needs like 4K video cutting & processing, etc. - well, it's not a Pro notebook anymore, more of a lifestyle article for wannabes and hipsters. As Apple's other hardware - desktop/Mini/Pro/notebook - is stuck at 2013 models as well, it's scary to watch how they don't get it at all. So the touch bar isn't the problem; it's just one of many questionable decisions (the same happens at Microsoft, so many questionable decisions and a deaf ear to the community) - Steve Jobs and Bill Gates are really missed.
IMHO, Apple removing the headphone jack and legacy ports is exactly the opposite of what you'd expect if "toner heads" were running the company. If "toner heads" were really running Apple, they would be too scared to take the risk of moving forward, because they know it would hurt sales. If "toner heads" were really running Apple, they would not take the risk of creating a new interaction model with unproven reception among customers.
I love Apple for this and I am buying the new Macbook Pro tomorrow. :)
The headphone jack is not a legacy port. It is the single most impressive industrial design for a port in history. That's the reason it lasted so long and also the reason it will outlive any wireless nonsense apple brings out to increase their landfill impact.
Speaking for myself, I hate headphone cables. They always get knotted up when stored away, and they're always getting caught on things while I'm wearing the headphones.
I'm looking forwards to when wireless headphones are more ubiquitous and cheaper.
Sure, headphone cables are a mess. So use wireless ones instead? For me, not having to ever charge my headphones far outweighs that hassle, as do any issues with pairing and sound quality.
> I'm looking forwards to when wireless headphones are more ubiquitous and cheaper.
So wait for that before you remove the port. Otherwise we're just trading 3.5mm cable hassles for Lightning adapter/cable hassles.
>>> I'm looking forwards to when wireless headphones are more ubiquitous and cheaper.
> So wait for that before you remove the port.
Err, I was stating my personal opinion on wireless headphones. I'm not Apple.
Obviously people have different preferences on removing the port. I'm glad that it will help push the availability of wireless headphones, and speed up their marketplace penetration.
> Otherwise we're just trading 3.5mm cable hassles for Lightning adapter/cable hassles.
Again, for my own situation, I wouldn't have any Lightning adapter/cable hassles.
The issue is really: how much do you need to be able to send before you reach the limits of human hearing (our ability to distinguish any differences), and will wireless be able to cater to that, or a reasonable approximation of it?
> Surely you've seen a speed difference with a wired internet connection compared to wireless?
Yeah, but note also that what I was responding to was you saying wireless would never have sound quality as good. "Never" is a long time.
That's a good point. If we are talking about the quality being indistinguishable to humans, it is probably possible at some point. Although I will probably stick with wired for a long time, as I have a nice wired headset, and I don't like the thought of my headphones not being charged!
I think that source is a little bit confused on the distinction between iOS and OS X (macOS?).
iOS and OS X share the same kernel, named XNU (it's the userlands of the two that are almost completely different). That's been the case since iOS 1. Whenever either OS X or iOS gets ported to a new CPU (or CPU subfamily), the kernel gets another set of macros added to it for the purpose of identifying the new CPU. The presence of a new ARM macro in XNU doesn't mean much at all - it already has a ton of them, as well as truly ancient macros from long-forgotten systems (m68/88k comes to mind). Those macros have nothing to do with OS X or iOS individually - they're relevant primarily to the kernel itself.
Every now and then Apple releases a new product, like tvOS for the Apple TV, and people ask why Apple wrote an entirely new OS - why not base it on iOS or OSX? Of course, that's exactly what they do. There's a common core to OSX, iOS and now tvOS that is almost entirely the same code base. Which specific files are common or different for each flavour probably changes from time to time, and this may well be a case of that.
The fact is though, iOS and OSX are already as much the same code base as they can be, and as much different as they need to be. That balance may change as the OSes evolve, but I don't think there's any pressing need or benefit to converging them completely.
The necessary divergence you talk about is mainly the UX coming from different input methods for the pointer device, correct?
With the (partly regrettable) adoption of UI design patterns that are mainly meant for touch devices, such a merge appears to be closer, however. I think the main patterns missing are:
* a more powerful touch-capable tiling window manager
* a method to get "mouse hover" events working on touch devices. finger hover?
* a both touch- and precision pointer friendly implementation of the OSX menu system.
I'm not convinced that employing 3D Touch to do the typical job of mouse hover is a good idea. Displaying help text only when a lost user performs a very specific gesture? It needs to be something that every user can immediately pick up on.
Maybe a stare-o-recognizer using the camera? If the user looks puzzled at a button, display text to explain ;-).
In and of itself that doesn't prove anything. That file has had symbols for multiple architectures since forever, including CPUs like M68K, Sparc (years ago, I had hoped that Apple would buy Sun instead of Oracle - oh well) and VAX (but not Alpha).