
I recently moved and took everything with me. This included replacing z-wave light switches and outlets with normal ones. I swapped everything out before we listed the house. The reality is that most people will just be confused by this sort of stuff and you aren't going to get any extra money when you sell your house. It's a pain, but assuming you don't move very often it's worth it. Now I have all the things in my new house and it's set up just the way I like.


The not adding value to the house bit is important. Nobody really knows about or cares enough about home automation to justify an added expense on the cost of the home. Maybe this will change when the "iPhone of home automation" finally comes out. Also, Z-Wave stuff is EXPENSIVE, so if it's not going to up the value of the house, then I'm basically giving that stuff away for free.


The not adding value to the house bit is important. Nobody really knows about or cares enough about home automation to justify an added expense on the cost of the home.

I'd actually go further. I do care about home automation, in the sense that it is almost certainly a negative, and possibly a deal-breaker, when I'm deciding whether to put in an offer on a place.

If it has anything installed that has any form of off-site access or any dependency on a remote service then I am going to require the existing owner to remove it and make good at their own expense as a condition of any offer (other than possibly a suitably approved alarm system with clearly detailed monitoring arrangements and contacts).

Likewise if there is anything installed that relies on any sort of sensors or "smart" control systems, I am going to require credible written guarantees about the security, privacy, safety and longevity of that system, or far more likely it is again going to have to be removed and made good at the current owner's expense as a condition.


The interesting legacy corollary here is that it's completely bog standard for home security systems to be left in place when a house is sold.

I say this because the systems that ADT, etc sell these days are largely made up of white labeled COTS Z-Wave sensors with the security panel just being a hub and interface to the Z-Wave network.

The house I moved into this past year is covered by sensors for an old ADT system that are useless to me, and I've been slowly removing them and replacing them with equivalent Z-Wave e.g., door and motion sensors as I get to it. Whenever I move, I'll take the hub with me. Any sensors that get left behind would be almost entirely equivalent to an old security system being left in the house except for the fact that they'd be pairable with any system the new owner decides to put in place since they're actually standards based.


Just as well, if you plan to sign up with ADT, they insist upon new equipment. I just went through this. We bought our house in February. ADT was sending crap almost daily so I called to see what they were offering. They absolutely would not use the 3-year old equipment in the house. Installation would come with new equipment. I didn't care either way, but thought that was notable.

No matter; I wasn't interested, and I was happy to get rid of all their signs and stickers.


Yeah, it's the reason I just went straight to tearing them all out. There is literally nothing ADT's whitelabeled sensors do that standard Z-Wave devices can't at a third the price -- or less! -- without tying me to any specific security company. If I decide to move to any other HA system, I just replace the hub and pair my existing devices back up.

In fact, ADT's latest move seems to be piggybacking on SmartThings for HA, but... since I already have SmartThings, what do I really need ADT for?


Do you have any links to information about standards-based products? I haven't run across any casually, which is why I haven't looked.


Most whitelabel home automation stuff runs on the Z-Wave protocol.

Zigbee is also fairly popular for some applications; it's what e.g., Philips Hue runs on.

Devices using either of these will generally pair with any hub that has the appropriate radios, like SmartThings.

There's additionally a class of home automation stuff that runs over WiFi; these generally use proprietary communication protocols, and I personally avoid them for the most part.


Mostly, I don't care about home automation. I don't care so much about the previous owner removing the components, because I'm sure that I can do it when I move in. Anything that can be set up for remote access is at least being disconnected, if not removed entirely.


"if there is anything installed that relies on any sort of sensors"

So, no smoke or CO alarms in your house?


Especially things like those. If you can give me evidence they are properly set up, working, and compliant with all applicable advice and regulations, that's great, I'll be happy to have them. Otherwise, get rid of them, and I'll get my own properly fitted by a suitably qualified professional, thanks. Those things save lives, and I'm not going to cut corners to save a few bucks, only to find that your hack job installer didn't connect them to repeat alarms properly throughout the building, or you used some trendy junk that doesn't actually work reliably if the power goes out, Internet is down, or some other dumb thing that should have nothing at all to do with safety systems ever.


All smoke alarms use sensors, whether or not they are connected to the Internet.


No kidding...


There are people like me who actively avoid home automation too. Seeing a Nest at an open house is a negative.


As a home buyer, why is it a negative? It seems like it would represent such a small % of the purchase price and could be easily included in the negotiation as needing to be removed by the owner before you move in.


Yeah I agree. But I still get a negative psychological reaction since I dislike any and all smarthome products, which colors my impression of the property. Illogical I know, but it's a reality for me and other home buyers / home sellers.

It's like staging a home right? None of this furniture comes with the house but how it's staged absolutely influences your impression of the home.


I get that psychological aspect - it makes you think the current owners have poor judgment, and so what else did they mess up? That's also why you should do simple maintenance and cleaning when trying to sell a house: if a person neglected to, e.g., rake the leaves, who knows what other necessary work they didn't do.


I would consider it to devalue the home. I don't ever want that kind of thing in my house but, even if I did, I'm not going to trust anyone to properly transfer whatever cloud access they have to things in my home, so it will all have to be removed anyway.


I think a similar effect has been why power generation and efficiency systems (solar, "geothermal" heating/cooling) have been slow to catch on—for every buyer who sees them as a nice thing there's another who's turned off by them, and another who doesn't care enough to pay much extra. And even the ones who think they're good probably won't pay the pro-rated cost of putting it in, so you're still losing some money. Kinda like having a pool. Also why builders don't usually put that stuff in when it'd be easiest and cheapest, I think. Even at those lower costs they can't make the money back when they sell.

So they're a good idea if you think it's very likely you'll stay in a house 15+ years, but not so much otherwise.


Where I live, geothermal heating/cooling is listed in the ads for some houses. I recall that when we were looking, the realtor made sure we knew which houses had geothermal. We ended up with a regular air-source heat pump, but geothermal is recognized as a selling point at least, even if it doesn't add value.


In my market pools are a negative to the sale price, they are immediately removed upon closing.


This sort of thing happens with used cars. If it's all stock, the value is generally higher than if the car has been modified with various aftermarket parts. I believe the main reason is that a stock vehicle is less of an unknown when it comes to repairs and maintenance.


Stuff like engine chipping puts loads of extra stress on the whole engine/drivetrain/clutch setup that the original car was not designed for. There are really no free lunches out there.

Even aesthetic changes are a big warning: this car has been tampered with somehow, and you have no clue what was changed or why.

For example, I bought an older BMW a few years back. When I went in for a complete checkup after about 2 years, the mechanic told me the car had been lowered a bit and the brakes were some lightweight version. It was my first car and I didn't notice, and the effin' dealer didn't mention it. All good, but a lowered car has issues in places like underground parking garages, where the road is steep, and sometimes I hit the bottom of the car when the road bends too much. So because some idiot previous owner wanted to squeeze a bit of extra driveability out of a diesel car (although an amazing diesel it is), I now have these crappy consequences and have to drive carefully like it's some Ferrari. I've hit the bottom in various situations maybe 20x so far; maybe one day it will break something crucial and I'll be left in some nasty situation with a broken car.

Yeah, you want untampered, (more) predictable stock cars.


If your house uses good standard local communication protocols, like Z-Wave, ZigBee, or Insteon, you can take your cloud devices, and a new owner can just buy a new hub device with their own cloud account and connect the devices to it. Doesn't matter what the other person has in their account at that juncture.

That being said, if I was ever to leave home automation hardware behind, it'd be a "if you want to pay x on top of the already agreed upon price for the property, I'll leave it", where x is the cost for me to just buy new modules for my new place.


> you can take your cloud devices, and a new owner can just buy a new hub device with their own cloud account and connect the devices to it.

That sounds like a pain in the ass.

> if I was ever to leave home automation hardware behind, it'd be a "if you want to pay x on top of the already agreed upon price for the property, I'll leave it

I don't want to buy new crap to deal with the crap you left behind. I'd argue that you ought to pay me for dealing with some arbitrary hardware that I've now inherited. This is the equivalent of leaving some finicky boiler system whose spare parts can only be ordered from a single factory in Germany.


I don't want to buy new crap to deal with the crap you left behind. I'd argue that you ought to pay me for dealing with some arbitrary hardware that I've now inherited. This is the equivalent of leaving some finicky boiler system whose spare parts can only be ordered from a single factory in Germany.

In context they are also more or less offering to remove it for free. I expect that includes putting conventional fixtures in (depending on the terms of the sale, but why bother trying to insist that the house doesn't come with light switches).


Indeed, an option like that would be purely: If you're going to want these, I can save myself the effort of swapping them if you want to buy them off me. Otherwise, I'll either reinstall the original equipment or buy new equipment. (New dumb light switches for my old condo cost me $1.95 a pop.)

At my old condo, one thing I left as a bit of a 'gift' was an extremely solid dual-arm TV wall mount. It was cheap for me to repurchase for my new place, and it spared them the risk of screwing into the studs near, but not in, the holes where I had mounted mine, or similar concerns that might weaken their own mount. I had all the hardware and instructions, which I left with it. This might have been a less handy choice if the condo had been large enough for other TV placements to make sense or be appealing, or if the mount I had installed weren't sturdy enough to support whatever TV they might bring in.


Where I live the general rule is anything that was attached is considered part of the home you're buying/selling. If you don't want the light switches and thermostat being sold they should be removed and replaced before you show the home.


What is or is not included in the home sale should be explicitly listed in the property listing. Many houses are shown with contents that will be removed before sale, so when you buy a house, if there's something you're expecting to get as part of it, you should explicitly document it in your offer terms.


There are always going to be local norms and laws though. Where I live and recently bought a home the default is that anything bolted/screwed down is part of the home you're buying. Furniture, art, etc is not included. Most sellers will include the major kitchen appliances, washers and dryers are on a case by case basis. When I bought, the sellers wanted to keep some shelving and curtain rods and this (along with the major appliances) was explicitly covered in the contract.


>> That being said, if I was ever to leave home automation hardware behind, it'd be a "if you want to pay x on top of the already agreed upon price for the property, I'll leave it", where x is the cost for me to just buy new modules for my new place.

If you live in any major urban center that request would likely get laughed at since the value of your depreciated hardware likely wouldn't even count as a rounding error on the purchase price. Some buyers might even want a discount on your selling price because of the extra work they'd have to do to get rid of the gear.


Who said anything about the cloud? I know what you mean though; if I had to use a cloud service, I definitely wouldn't want it in my house. In my experience it's easier to set this stuff up without the crummy proprietary cloud services anyway.


But even if you do set it up without the cloud services, can you prove that to a buyer who isn't experienced with the technology?


We may be listing ours soon, and if we do I'm even swapping the LED bulbs for whatever's cheapest at the store. They're only about a year old and the house will sell for zero more dollars if I leave them up.


How much are the LED bulbs? I don't think I'd bother taking the time to save fifty bucks on a house move.


Heh, it'll be less than 1% of the total time I spend doing touch-ups and such. Maybe three minutes to save, yeah, $50-75. A tiny extra amount of effort on the actual packing and move, but the time saved buying new bulbs when the ones in the next house start burning out makes up for that. Besides, I've got most of 'em in in-socket splitters (turns one bulb socket into two) in crappy basement fixtures, which I'm pretty sure will make an inspector look twice, so probably worth taking down ahead of time anyway.


Three minutes to change all the bulbs?


A half-dozen lightbulbs, and that long for taking the new ones out of the box and moving the ladder around.


Sure, if you have nine people helping you.


At least in the UK some packs of LED bulbs go up to something like £50 for 3.


That's one way to look at it. Another way to look at it is that spending money and effort replacing good bulbs with not-so-good ones is a negative-sum action for the system of you+the new owner. Leaving them in is zero-sum. I left all the LEDs I put in my last apartment when I moved.


Is leaving lights in the house you sell an American thing? Unless it's built in lights like spots, all the bulbs and even fixtures usually move with the person where I'm from (the Netherlands).


Incandescent bulbs here are so cheap (and fragile, especially if they're used) they're definitely not worth bothering with, and they're still really common. Even CFLs are really cheap now, and don't travel as well (or last as long) as LEDs. Light fixtures would almost never go—either they're $20 pieces of crap that aren't worth unwiring (then wiring back up at the new place), or they're nice enough and suited specifically enough to the space that the new buyers definitely want them to stay. If you wanted to keep one you'd have to take it down before listing, and replace it with something else.


The fixtures are wired in, and thus legally part of the house. You need to add additional language in the paperwork if you want to take them. If you want to take one with you, your realtor will probably advise you to replace it before you list the house. Of course, as others have pointed out, fixtures are generally either generic or specific to the room they are in; in either case it doesn't make sense to take them with you even if you can.

Technically you can take the stove, fridge, washer, dryer, and window coverings, but almost nobody does. The buyer's realtor will automatically put a clause in the contract saying they stay. It saves everybody the effort and cost of moving them, and generally the rest of the room matches them and not the new house. Sometimes this isn't done, but that is rare.

Light bulbs are not in the contract, but they aren't worth the bother to remove.


I've never moved anywhere that I can remember already having a fridge, washer, or dryer. We've always taken ours with us. They seem like more personal appliances to me, anyhow. And in my current place, all the window coverings were removed before the home was shown. I plan on leaving mine in when I eventually sell. They were custom-cut anyhow; I suspect that the old ones were removed because they were damaged.


The whole "take the floor with you when you leave the apartment" thing that is common in the Netherlands is really very bizarre to Americans.


Not just to Americans. I'm an expat in the Netherlands and interact with lots of people who came here from various different countries. It's bizarre to everyone.

Seems like there are two options. (1) You can do it the Dutch way, i.e. every time someone moves they'll have to take their floors out. Those floors often won't fit into your new apartment, so you're just creating lots of waste. Not to mention all the effort associated with it. You will have to remove the floors, but also the person moving into your place will have had to remove the floors from their old place. (2) You leave the floors in when you move out. Then the new tenant/owner can decide whether to keep them or not. In the worst case, both new and old inhabitant will still decide to remove the floors. In the best case, no one will have to do anything and no waste is created.

So what could possibly be gained from the first way of doing things? Genuinely curious if someone knows the reasoning behind this.


Ripping up floors is an actual thing there?? I assumed the original poster was speaking metaphorically about how much gets taken out. I wonder how that practice got started. It's almost impossible to reuse floors.


>Ripping up floors is an actual thing there?? I assumed the original poster was speaking metaphorically about how much gets taken out.

Haha, I was not... people literally take the laminate flooring with them when they leave, if not to reuse it then out of some sense of ownership I don't fully grasp.

Apartments are rented in a few ways: fully furnished (incl. floors and all the furnishings and even kitchen stuff) which is aimed at expats who make lots of money and people who are living short term; partially furnished (maybe including floors and some appliances or fittings, for which you'll pay a regulated monthly fee), or bare, which is as it suggests without fittings and without floor coverings. The latter is cheapest and most common I think.

I've heard that sometimes you can make a deal with the leaving tenants if you'd like to keep their existing floors (for a price, of course!). People here seem rather adept and used to installing the laminate themselves, and they also seem to like the idea that you can put down whichever floor covering you like best.

It must be a good business to sell floor laminate...


In the UK part of the sale process is agreeing which "fixtures and fittings" are part of the sale. There will generally be a form in the mountains of paperwork involved in a house purchase.

Most sellers will leave anything fixed to the building like lights, sinks, toilets, doors, door handles, sockets, fitted carpets, fitted kitchen appliances, plants that are buried in the garden, fires and fireplaces, television aerials, boilers and radiators, built-in cupboards, permanently installed mirrors, and so on.

Things that aren't fixed in quite the same way have different conventions. For example paintings, light shades, freestanding lamps, rugs, curtains, plants in freestanding tubs, and freestanding appliances are often taken by the seller.

It's always possible to negotiate, of course; if the buyer and seller agree on something different, that's perfectly legal. And there are horror stories of overly trusting buyers discovering the seller has removed the turf from the lawn, and things like that.


Finland here - rented one flat, bought two more. In all cases light-bulbs were absent, as were light _fittings_.

All we had were some sockets in the ceiling into which you'll fasten your own bulb-holders/shades, and then your own bulbs.

The only lighting present by default was that above the bathroom mirror(s).


The correct answer here RE "an American thing" is "whatever is written into the contract as 'conveying' with the house."

If you say "fixtures stay in the house" then they stay. If not, there's no obligation to leave them. Typically, however, these clauses deal primarily with appliances and other larger items (TVs, hot tubs, etc.).

In my experience people don't think to bring the bulbs with them, written or otherwise.


Common enough here in the Netherlands if you're buying a house. Expensive or heirloom fixtures may have been replaced with cheap IKEA fixtures, but houses generally don't show up on the market without lighting and a working thermostat.

Perhaps you are thinking of rent?


That would tend to make the inspection difficult.


Lights in light fixtures usually stay, while lights in lamps go. I don't know anyone who removes the lights from overhead fixtures, ceiling fans, etc. In the house I just moved into, all of those lights were there, even if the light switches were not.


In Italy it’s common for people to take their entire kitchen (including cabinets and fitted appliances) when they move.


I added z-wave switches to my house, but honestly they function enough like normal switches that I would just leave them if I sold the house (but take the controller with me). I never used any cloud services though.


Yeah, they had a post form that had a field for username that was available only to the founders. So they could post content and then invent a username to make it appear that there were lots of unique users. They discuss it in this podcast. http://one.npr.org/?sharedMediaId=545635014:547386946


This is a stretch IMO. I was working there too at the time, and the focus on apps was never as strong as it should have been. Yes, they paid out lots of money to get devs to build apps, but they never really dedicated the resources to building quality apps. The prime example was Facebook. This was an app built by Microsoft with FB's blessing, but it was always far behind in terms of features and quality. Development of that app and other flagship apps was not a core focus; work was outsourced and not given enough resources. Had Microsoft put quality dev teams on building high-quality third-party apps, I think the chances of success would have greatly improved.

From my perspective, the focus was on filling the store with apps regardless of quality. This convinced the first generation of Microsoft loyalists to buy Windows Phones, but turned off many of those people. Most would not go on to buy a second WP or recommend them to friends and family. That includes many (dare I say most) Microsoft employees who were enthusiastic about the product at first but moved on to Android or iPhone when it came time to buy a second or third device.


I was going to chime in a little on it being a stretch. One of the promotions for students was near the summer of 2012 or 2013: you got paid 100 dollars for every app published on the Microsoft store, with a limit of 5 for the mobile store and 5 for the regular store, so a total limit of $1000. My school actually had a Microsoft rep run a workshop over a weekend showing students how to publish an app on the store. He gave us templates for a number of apps to "test" with. I made about $300 that weekend by publishing 3 different variants of a whack-a-mole game. The workshop I attended had about 25 students total, and we all left having published at least one app. I don't know how widespread this outreach was. I checked on my apps sometime last year and they were all still up. I ended up pulling them out of a mix of shame and embarrassment.


I was a student who had won a Lumia 800 around May of 2012. There was a promotion where anyone who submitted 4 apps to the appstore would get a phone - no 'win' involved, a guaranteed phone. It was one of the best promotions I'd ever seen, and I promptly churned out 4 soundboard apps in a week.


This is unfortunately true, but it isn't representative of the entire effort put forth to acquire apps. I was part of the Microsoft org who was doing this at the time. We were split between breadth engagements (one to many like at universities or hackathons) and depth engagements (one to one). I was working depth engagements helping established companies port existing iOS and Android apps to Windows. The amount of investment from Microsoft in those depth engagements ranged from me helping out with technical barriers for a couple days to hundreds of thousands of dollars in incentives and development effort. It was all about how desirable that name or brand was on other platforms.


Did everything they could...

Ah, yes. $100 per app. I'm sure that's what most apps cost to produce. /s

It seems like they did a few things, but never actually, you know, paid app developers to build out their ecosystem.


Hey, if you're optimizing for "number of apps on our store" I bet it worked great!

Why spend $100,000 developing one app when you could get 1,000 apps for the same price?!


Instead of paying per app, it would make more sense to let devs keep 90% of store revenue, which would incentivize the development of apps that are actually popular and make money.


Hell, let developers keep 100% of store revenue, at least until you have a customer base.

It's not much worse for MS than 90%, and it makes a much nicer marketing story if you're trying to convince developers to get on your platform.


It was a slight eye-opener for me as a student at the time, getting near graduation. The part of the workshop going through the app submission and approval process was actually really interesting, but when it became clear that the bigger goal was to boost app numbers in the store, things felt really dirty.

Either way that experience always comes up for me whenever people talk about the low quality of apps on the Windows store.


> "We have millions more apps than the competition."

> leaves out the fact that 99% of them are either web wrappers or low quality games

This is why we have confounding factors, kids.


Even with paying out this money for shovelware they never matched up to Apple's app store numbers.

I had a Surface Pro 3 at the time and there were maybe 10 apps worth using on a touchscreen. Eventually gave up and sold it to get a Mac and iPad.


They tried that too. The Verge reported in 2013 that Microsoft was paying some developers $100,000 each to port their apps to Windows Mobile. The article hints that was how Pandora & Temple Run ended up on Windows Mobile.

https://www.theverge.com/2013/6/15/4433082/microsoft-paying-...


That certainly wasn't the only program. MSFT paid the costs for my mobile dev company to port games, because we had an established brand on iOS/Android. $100 for whatever random college students come up with seems very reasonable.


That's probably where those ads on Craigslist coding gigs come from.

"I'll pay you $100 or split the equity for my cool new app idea!"


Agreed. The parent post reads like "the overwhelming numbers of our competitors beat poor Microsoft despite our talent, ability and courage!"

No, Microsoft beat Microsoft. It was their game to lose.


It was their game to lose ten years ago, but only barely, and not recently. In 2007, when the iPhone launched, Windows Mobile had about 40% of the smartphone market and RIM had about 20%. But the smartphone market was nothing compared to today; the vast majority of phones were feature phones. Nokia's array of candybar phones absolutely dominated in 2007, and the Moto Razr was still big. Then Apple unveiled the iPhone, and the guys at Android said, "Oh shit." Meanwhile, Steve Ballmer said the iPhone would never succeed. Ballmer drove MS into the ground. Everyone pivoted to the iPhone model except MS, who spit out WinMo 6.5 in 2009 and finally WinMo 7 in 2010. By 2010, the race was pretty much over. The rest of what MS did was half-assed at best.

You're 100% right, MS beat MS.


And frankly Winmo 7 was the bad move, not 6.5.

Because 7 burned the app bridge with 6.5, making it even easier for someone to justify moving to a different platform.

Never mind that at launch iphone was more fancy featurephone than smartphone.


> Never mind that at launch iphone was more fancy featurephone than smartphone.

Not really. iPhone was the first phone ever that shipped with a real, full-featured, non-crippled web browser. This was an astonishing achievement at the time, and one which made its existing competition look like "fancy featurephones," not the reverse. (Really an astonishing achievement period, considering it had 128MB of RAM).


But no 3rd party apps. That was an afterthought.


Absolutely not. It was added after launch, but it was the plan all along; you don't build that in just a year. Why waste millions of dollars on an app ecosystem before the phone itself is proven? No, you start with an amazing minimum viable product, see if it succeeds, and if so, you recoup lots of R&D money and pour it into building the app system you already planned out. Yes, Jobs talked about web apps and such, but that was just cover.

Yes, Walter Isaacson said that others tried to convince Steve about apps at launch, but from the moment he started talking about web apps on that stage in 2007, I never believed for a moment that was really the angle. I knew a couple of folks who worked on the first couple revs of iOS; installable apps were always possible, if underdeveloped, from day one.

Jobs had lots of resources at Apple in the 80s and frittered them away on the Lisa and the Apple III. He stumbled on Pixar, not knowing where it would go, and had a hell of a time figuring out how to position NeXT, but all those failures taught him that in business, as in art (and we know he felt himself an artist), making the most within the constraints of the medium is the key to success.

He came back to Apple on its deathbed. He negotiated with MS for a transfusion to stay alive, and knew that even though OS 9 sucked, they needed a splash. They had the iMac: pare down a personal computer to what was needed at the time. Monitor, modem/ethernet, CD drive. No need for a floppy (they were dying; chuck it for an external one you can charge for). No need to pack it with a super spiffy CPU or oodles of RAM; people can pay for an upgrade. Just make it slick looking and work well. Same with the iPod: pimp it out with upgrades later, after the MVP proves its worth. The G4 Cube failed and was never really iterated on.

He learned from Microsoft: create an MVP, and if it seems to catch on, iterate fast.


> But no 3rd party apps. That was an afterthought.

Barely any platform had 3rd party apps. No one had a streamlined app store, SDK and monetization process like iOS came out with in 2008.


Palm had. I was a fool not to try to develop a 3rd party app for it. I had a Palm.


http://mobilehtml5.org/ I'm interested in how you would define full-featured. Please check the Symbian & Opera columns. Also, iOS 1.0 shipped with Safari 3.0, not the Safari 3.1.1 in this test.

In iOS 2.0 they introduced a new feature that allows you to save web pictures to Photos. Full-feature redefined. :)

Edit: a full-featured television indeed, by Alan Kay's definition: https://www.fastcompany.com/40435064/what-alan-kay-thinks-ab...


I mean full-featured in the sense of end user experience. If you have any example prior to 2007 of a mobile browser rendering the full New York Times website perfectly,[1] I'm all ears. But as I remember the below link was, for good reason, the biggest "wow" moment of any demo Steve Jobs ever gave.

[1] https://youtu.be/RIRQg8AJxuw?t=41m24s


I partially agree with you, in that the rendering of the page is good-looking by circa-2007 standards on a mobile device. But rendering one page nicely does not prove it is full-featured. It has to go through some kind of benchmark, which reflects the general ability to process trillions of other pages out there. It loses to Opera Mobile or the Symbian browser on the test I just googled (not sure how accurate it is, though).

A full-fledged browser experience in 2007 to me means at least I could have mouse hover, to deal with sites not yet adapting to mobile computing (there were a lot of them). WM6 browsers did that. If it fails, I'd go and use my Palm device to VNC into my workstation -- a 2004 Sony device that will be up-to-date forever because it is a decent thin client.

I also remember opera mini being a very handy browser on lower end phones like the S40 models. Since the first iOS safari does not do javascript IIRC, it makes no difference if the rendering is done with WebKit locally, or pre-computed on a server. The only difference is that iPhone has a bigger viewport, which allows you to consider the webpage a minified version of the desktop rendering -- and you are able to freely swipe, zoom, rotate -- not relating to the functionality of the browser itself. I'm not sure if you would agree, but I think, if iPhone1 runs Opera Mobile (with beefy 128MB RAM and fancy graphics chip), it beats the built-in Safari to the ground.

Of course it will cause other troubles -- battery life, thermal management, slow startup, or even instability etc. This is, to my understanding, why Apple decided to ship a "reduced" version of Safari 3.

Edit: adding explanations.


Screw mouse hover. iOS had the mind blowing pinch to zoom feature which made full page websites actually readable on mobile. They didn't even have to wait for mobile friendly. If you tried browsing the web on any phone pre-iOS it was a shit experience fraught with frustration.


Exactly. I was referring to exactly this, that it is the awesome interaction methods that made it work, not the browser itself.


iOS Safari has always supported Javascript and was never "reduced" in any meaningful way (that's the point of Jobs' demo) except that (as now) it didn't support Flash, and deliberately ignored onmouseover, :hover and other such features that don't work well with a touchscreen interface.


I assume the iPhone was "more fancy featurephone" due to the lack of 3rd party apps?

I would have to disagree with that statement. Windows Mobile and BlackBerry allowed 3rd party apps to be installed, but they were both difficult to find and didn't usually add anything beneficial to the phones at the time. Users, for the most part, stuck to what was installed on the phone and that was it. Smartphones were defined by the fact they had an email client and a (relative to the time) high-resolution screen to read and write emails on.

It was a different market in 2007. The idea that a successful smartphone required an app ecosystem was unheard of.


I worked for a company in 2006 that was considering writing phone apps (we already had a bunch of Windows apps). At the time, each phone company wanted to "curate" the apps for their own phone stores. At least one company wanted, for each $10 app, about $11 of revenue. The result: we decided that it would be essentially impossible to ever make money from phone apps.

What the iPhone did was genius: they created demand for the phone, but would only sell through phone companies willing to let Apple control the app market. That made all the difference: all of sudden, a developer could make an app and have it show up to bazillions of people.

[disclaimer: I currently work for Microsoft, but not in the phone team. But I do have apps in the Microsoft app store!]


Don't confuse the lack of modern mobile apps with any mobile apps. There was a thriving ecosystem around mobile apps at the time. Not only Windows Mobile and Blackberry but Symbian too, which I believe was the largest, and Treo.

There were many companies living on this stuff. Mobile data was still very expensive, which didn't change for a few more years, and touchscreens were small and crappy. So the market was mostly business logic and CRM apps because they were the ones that could afford it.

That changed when mobile data and big screens became cheap enough for consumers, but I think Apple was as confused about that as everyone else given the state of early iPhones.


The problem with WP was that it was late and offered nothing very special to consumers over Android/iOS (and ya, I loved my 920). Consumers had no reason to buy it, developers had no reason to dev for it: a huge vicious circle that would have been difficult to break under the best of circumstances. The war was lost when the WinMo 7 team decided to go after BlackBerry in 2007, dismissing the iPhone as inconsequential, requiring that dev/design reset later that was just too late.


Man the more comments I read the more I begin to remember. There was ONE dev who was churning out VERY high quality apps to popular platforms. I think Snapchat or Instagram was what he got known for. Instead of MS embracing his work and helping it flourish, they let him get taken down by a C&D.

LOTS of people got heated when that happened.


Rudy Huyn


That’s the one


>Had Microsoft put quality dev teams on building high quality third-party apps I think the chances of success would have greatly improved.

For one app? For an app they would have to give away for free? For an app that would always be behind the FB built ios/android apps?


You have to consider this in the context at the time. Yes it would have been expensive, but Microsoft was investing BILLIONS into Windows Phone. Microsoft and partners spent something like $700 million just on marketing for the launch of Windows Phone 7[1,2].

To spend that kind of money on marketing and then not dedicate resources to the actual product seems foolish. And I am not saying they should have done this for only one app. I am saying they should have done this for many apps. If they had created quality versions of, say, the top 25 apps for mobile at the time they would have been in a much better position. I believe they could have made significant traction with business users. Remember, at the time Office wasn't available on other platforms and was (is) a huge draw for many people.

If they had been successful with the strategy and gained market share the partners would have wanted to take over their own apps anyway to enable monetization. But they needed users for that and to get users they needed apps. You have to jump start it somehow.

Now, would it have made any difference? Who knows. But IMO, you either need to not do it or you need to do all parts of it right. You can't go half way on the ecosystem and expect to succeed in an already challenging market.

1: https://techcrunch.com/2010/08/26/microsoft-half-billion-dol... 2: https://techcrunch.com/2012/01/04/microsoft-oems-pledging-20...


> To spend that kind of money on marketing and then not dedicate resources to the actual product seems foolish.
Sounds like a Hollywood strategy to me. Overadvertise a stinker to try to recoup your investment.


That works for movies because they're trying to maximize the number of people who are interested enough to go see it once, more or less. A successful phone ecosystem requires building something that people want to use over the medium term.


And the expense and commitment of buying a phone is much greater than the cost of taking a chance on the Bearded Lady.


But marketing was always a core M$ strength; it's been a standard criticism of the company for a very long time.


They were great at getting their OS preinstalled on nearly every PC. They were really good at backwards compatibility, and they were absolutely ruthless against their competition. But they always, always sucked at marketing.


you don't view "getting their OS preinstalled on nearly every PC" as marketing?


Nope. Windows 95 had a huge promotion, but most PCs already came with the OS installed, unless you built your own PC.


They did manage to get many of the top 50 apps to their platform, however, top 50 isn't enough. When all your friends have the latest and greatest on their iOS and Android and you have to wait a year or two for a WP port you get tired of that. Plus there are many industry-specific and workplace apps that never made it to WP. You can only face so many let downs in the app store before you give up on a platform. Nokia did make some damn good hardware though.


> the top 25 apps for mobile at the time

This list keeps changing every month. Remember Pokemon Go?


6 million daily active users remember Pokémon Go and spent $1B on it in 2016.


For a small number of core apps. If they have invested heavily and put their best engineers to work on high quality core apps like Facebook, Instagram, Twitter, Messenger etc (maybe it would be 15-20 apps 90% of people install on their phones), they would have had a much better shot at gaining momentum.

Other smaller apps would have followed and been made by independent developers but you need to cover the apps almost everybody is using and make them comparable feature and quality/performance wise to iOS and Android versions.

Number one thing most people do on their new phone is download Facebook/Messenger/Twitter. If those apps suck they will immediately have a very bad impression and will switch back to iOS or Android as soon as they get a chance.


This was the strategy that Apple followed when OS X first came out.

Third party developers were moving slowly (or not at all) so Apple started developing and giving away (or selling) apps that showed off what you could do with the new platform.

They developed Safari when Microsoft lost interest in further development of Internet Explorer. The iLife suite had iTunes, iCal, iMovie, iPhoto, iDVD, iWeb and GarageBand. The iWork suite had Numbers, Pages, and Keynote. They created (or bought) professional apps like Logic Pro, Final Cut, Shake, Motion and Aperture.

If you have a new platform and third party developers don't step up, then you need to start filling those holes yourself in a way that shows off your platform's advantages, and keep at it.


On the other hand, if all WP has is half-baked clones of better apps on Android/iOS, there's even less incentive to switch over.

If they were serious about growing the user base and building these apps internally was their only course of action (seems like it was), then it should have been taken more seriously (assuming parent is spot on here, I have no idea really).


Not just one app. They would need Facebook, Instagram, Snapchat, Twitter, and a number of other staple apps.


Usage numbers for FB are not that far behind IE. Perhaps they should have invested a proportional amount.


Instead they staffed their quality dev team on the Windows Mail app and Skype...


At least for European customers. This won’t help everyone.


How is "European customer" defined?

If someone is not European but resides in Europe, do they have this right?

If someone not from Europe travels to Europe and issues the request while inside Europe, do they have this right? (If so, would using a VPN work?)


If that's actually the case, I can see a Deletion-as-a-service business. "We thoroughly delete your account for you."

The fact that I have to come up with such mechanisms is a damning indictment of the way things are.


I really don't get why AckSyn's comment was marked dead.


There used to be a service like this for facebook and twitter, then those two services blocked them from doing so.

Which was sad, but it ultimately worked well for everyone who ended up using it before the ban. It would untag you from photos, delete your comments, wall posts, photos, information, change your name, and issue a deletion after changing your password (and emailing that to you).


I recently went to a GDPR seminar and the presenter claimed that GDPR applies to everybody physically in the EU (even tourists while they are passing through).


GDPR = https://en.wikipedia.org/wiki/General_Data_Protection_Regula...

> a regulation by which the European Parliament, the Council of the European Union and the European Commission intend to strengthen and unify data protection for all individuals within the European Union (EU). It also addresses the export of personal data outside the EU. The GDPR aims primarily to give control back to citizens and residents over their personal data and to simplify the regulatory environment for international business by unifying the regulation within the EU.

Emphasis mine.

Full text at http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:320... .

> The protection afforded by this Regulation should apply to natural persons, whatever their nationality or place of residence, in relation to the processing of their personal data. This Regulation does not cover the processing of personal data which concerns legal persons and in particular undertakings established as legal persons, including the name and the form of the legal person and the contact details of the legal person.

The specific clause concerning customers appears to be:

> In order to determine whether such a controller or processor is offering goods or services to data subjects who are in the Union, it should be ascertained whether it is apparent that the controller or processor envisages offering services to data subjects in one or more Member States in the Union. Whereas the mere accessibility of the controller's, processor's or an intermediary's website in the Union, of an email address or of other contact details, or the use of a language generally used in the third country where the controller is established, is insufficient to ascertain such intention, factors such as the use of a language or a currency generally used in one or more Member States with the possibility of ordering goods and services in that other language, or the mentioning of customers or users who are in the Union, may make it apparent that the controller envisages offering goods or services to data subjects in the Union.

Regarding non-resident visitors, I can't tell. The directive says "whatever their ... place of residence" so almost certainly yes. It also contains text like "data subjects residing on its territory" so you might have problems lodging a complaint if you aren't a resident.


The US needs to grow some backbone and pass equivalent privacy (and antitrust, while they're at it) laws. If something is a certain way in both the US and EU, most tech companies won't bother doing any differently worldwide.


It isn't a matter of backbone. Legislators are simply representing the interests of their customers, such as Experian and Facebook.


This seems incompatible with democracy.


We can blame/credit the Citizens United Supreme Court decision.


I guess the problems started much earlier, and that court decision was actually a result of such issues.


GDPR refers to any business that transacts with the European Union, so businesses outside the EU (and the UK) are included.


Bed Bath and Beyond too. We ordered some cleaning product on Amazon that was delivered by BBB recently, so I checked the price. We paid 3x on Amazon.


If someone can't be bothered to spend 10 minutes writing a cover letter for a job they want then I would consider them unqualified. I don't want to work with somebody who refuses to put in a bit of effort to get what they want.

I spend at least a few hours writing job posts. Why shouldn't I expect the same?


I don't understand this. I'm a developer; I create products. I'm not a professional writer. To me this means you value >>presentation<< more than my skills, even though I won't be a professional writer at your company...


I'm really not talking about something lengthy. But I get 100s of applications for every job post I publish. Most people who apply are frankly not qualified. If you are qualified, write a few sentences. "Hey, I saw your post for [...]. I thought this looked great for me because I have worked on [...] and I am super excited to learn more about [....]".

I am not asking for paragraphs, but show me you are interested. I honestly spend hours writing and reviewing the jobs I post and it takes probably 100+ hours total from a team to hire a person. We want to hire you, but you've got to help us. :)


I think hiring is completely broken. At least in IT. Developers don't understand HR people, and HR doesn't understand developers.

For me "Hey, I saw your post for [...]. I thought this looked great for me because I have worked on [...] and I am super excited to learn more about [....]" means absolutely nothing. Frankly, I do NOT know you or your company and in a job description there is nothing about the team, culture etc. But I have no other option just to make something up, because otherwise no one'll hire me.


Where I work, engineers hire engineers. You are right on some level that it is hard to know exactly what a company does, but for many jobs in software that isn't entirely true. Read the company's blog, look at their open source projects, use the product, and most importantly read the job posting. I really don't think you'll have to make something up if you really are a good fit for the role.


It's working really well when engineers hire engineers. I've no problem with that. They don't ask questions like "Why do you want to work for us" or "What are your strengths" or "Tell me about a situation where ...". Not every company has a blog, open source projects and many of them have no public products and are privately held. So, I can see the job description and the website. I don't know if you are an HR person or not. No offense. I'm not against them, but have very very very bad experience with them.


But you don't customize the job post to each applicant, right? So "the same" would be a developer spending a few hours writing a cover letter and resume that are sent to a bunch of potential employers without customization.


It is more common for the applicant to apply for the job so the job post has to be generic. Now if the company is recruiting someone unique they will approach them and tailor the opportunity to that person.

Also I don't think one has to spend a few hours to customize the resume. Make one for each industry you are considering and spend 5-10 minutes tweaking it for the employers you really care about.


We do for the most part. We might hire a number of people from the same position, but we typically write a post for each type of position or team.

Even at Microsoft this was often the case - of course there was a generic template, but you would put specifics to the role.


OP did spend a few hours: he wrote a script. By definition he did what a programmer should do: find a solution that is automatable, then make it work.

If that annoys some people, well, that is why you apply to more than one job.


Good point. And I would seriously consider an applicant that wrote such an application, but better than putting it in his resume would be to write a quick note. "Hey, I've been looking for a new role. As part of my process I wrote this app. See it here and let me know what you think." That would get somebody noticed.


I pretty much disregard every application that doesn't have at least a basic note explaining why you want this particular job. Maybe getting a job is a numbers game, but I don't agree that getting a good job is.


That is what I did when I was looking. Turns out it takes less than 10 seconds if you have their website: "I have always been interested in $TECH_THEY_USE and would love to work on a team that works on $WHATEVER_PRODUCT_THEY_MAKE because it sounds like a challenge." Those weren't exactly the words I used, but close.

Then you include the name of whomever is the recipient, because people love to see their own name and feel important.
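The fill-in-the-blanks approach above is easy to automate; here is a minimal sketch using Python's `string.Template` (the placeholder names and example values are illustrative, not from any real application):

```python
from string import Template

# Templated cover note: fill in the recipient's name, the company's tech
# stack, and their product before sending. All values below are made up.
note = Template(
    "Hi $recipient,\n\n"
    "I have always been interested in $tech and would love to work on a "
    "team building $product because it sounds like a challenge."
)

filled = note.substitute(
    recipient="Alex",
    tech="Rust",
    product="a build system",
)
print(filled)
```

Using `substitute` (rather than `safe_substitute`) means a missing placeholder raises a `KeyError` instead of silently sending "$recipient" to a hiring manager.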


100%. My company does too, and in many cases, we ask applicants to include a secret phrase (usually "Welcome to the World of Tomorrow") to indicate that they've actually read the job description. I think if you're going to look at it as a numbers game, you're going to be better off sending 10 quality applications than 100 shots in the dark.


Another option: why not find a startup that you can join remotely while traveling? Unless you really want to not work for 6 months, working during the week and exploring on the weekend while living anywhere might be a nice balance. You'll get experience that will help when you find what you want to do on your own. From a life standpoint, traveling for some time is a great experience.

From a career standpoint, having a 6 month gap in your resume at such a young age could be an issue. I hire a lot of engineers, and to be honest, if I saw somebody that worked at Google for 2 years and then took 6 months off to travel, I would really worry about their commitment. 2 years is about the absolute minimum you should stay at one company. Moving any more often tells me, as a hiring manager, that I risk taking a bet on you, training you, and then having you leave right when you actually start adding value. Obviously, this assumes you need to get another job if your startup doesn't work, but that is unfortunately the most likely outcome.


Interesting podcast with Instacart's founder where he talks about an investor who turned him down because the same investor lost money on webvan. https://www.npr.org/player/embed/523003162/523047374


Auth0 | Seattle, US; Buenos Aires, Argentina; Remote | Full Time | https://auth0.com/jobs

We are hiring many positions. Check out the posts on our site.

