Hacker News | lprez's comments

I would be happy if this became a trend.


So, will this be the mobile equivalent of Bootstrap?


AFAIK, this is just an attempt to put Twitter products in mobile apps. Nothing revolutionary here.


So they can arbitrarily screw them over later?


That was my initial thought, but this is a little different. It's meant to be used in an existing app, not for building an app for Twitter. At least that's what I got.


As much as I'd love that display, I cannot believe that a $2500 computer is shipping with 8GB of RAM in almost-2015.


I can't believe you can't believe it. Have you heard of Apple before? If anything I'm a bit more perturbed that they max out at 32GB of RAM. My aging Mac Pro has more than that in it.

I imagine that they preferred shipping with a default 8GB and selling a $200/600 RAM upgrade than having a higher headline price.

(Incidentally, their 1GB phones still outperform phones that are one year newer with 2GB of RAM, double the cores and nearly double the GHz, so maybe they're onto something.)


32GB is a limitation of iX Haswell, Ivy Bridge could use 64GB - Apple can't do anything about it. Let's hope Intel allows it for Broadwell-K (-Y,-U,-H variants will be restricted to at most 16GB)...


It's not a Haswell vs. Ivy Bridge distinction, it's a consumer vs. server distinction. Consumer CPUs only have dual-channel memory controllers and Intel's DDR3 controllers can't use 16GB UDIMMs or have more than 2 UDIMMs per channel. The server chips (a handful of which get some features cut off and sold under the i7 brand) have 4 memory controllers and also support RDIMMs and LRDIMMs.


>Intel's DDR3 controllers can't use 16GB UDIMMs

That is a software not a hardware limitation (in particular, I think it is in Intel's MRC). ASUS has firmware with modified MRC code that can support it.


That is in fact incorrect. Consumer desktop Sandy Bridge, Ivy Bridge and Haswell all support 32 GB of RAM via 2 DDR3 channels. Enthusiast desktop processors (Sandy-E, Ivy-E, Haswell-E) support 64 GB, because they feature 4 channel memory controllers.

Once the consumer chipsets move up to DDR4, you will have support for 64 GB of RAM on the desktop. http://en.wikipedia.org/wiki/Skylake_%28microarchitecture%29


Haswell (as opposed to the pseudo-Xeon Haswell-E) caps out at 32GB.


I don't think this is supposed to compete with Mac Pros (plus what people are already saying about the chipset memory limitation).


It seems to me that a top-of-the-line consumer machine should be able to compete with a 3 year old workstation (that hadn't received a significant update in two years -- effectively 5yo hardware).


Why aren't you comparing your old mac pro with a current mac pro?

12GB of RAM (up to 64GB) https://www.apple.com/mac-pro/specs/

The Mac Pro line has always had Xeon processors, so it seems odd that you would expect the same memory support on a desktop.


What 3 year old workstation are you talking about?


Read my original post.


Ok - so just an arbitrary old machine you picked. Somehow I thought there was more logic to the choice of benchmark that I was missing.


Thanks for the info -- didn't know Haswell has that limit. (I guess that's one of the benefits of buying pro stuff.)


What would you ever need more than 32GB of RAM for on a desktop (time travelling forwards excluded)? If you are asking yourself that question, you need to get a proper server or rendering farm.


You'd be surprised how fast you can run out of memory if you are mastering your own, non-professional 5k/4k video and applying some advanced non-linear operations such as video stabilization. It's not GPU accelerated, parallelism is limited by the nature of the algorithm which makes rendering farms useless. Do you need a power hungry and noisy server with slower single thread performance than a top end i7 just for additional RAM?

On the other hand, beyond 32GB, ECC seems to be very important so you probably won't have any choice but to buy a proper workstation unless AMD makes a miraculous processor on par with i7 (as they do support ECC in all models).


> non-linear operations such as video stabilization

What do you mean by non-linear? Are you referring to an equation containing a linear combination of the dependent variable and its derivatives, or something else?


Usually a non-linear video operation would mean one that doesn't just process each frame of the video beginning-to-end, like moving clips around.

Stabilization seems pretty linear to me, though.


Virtual machines behave best with as much RAM as a normal machine. Assuming you have three machines running (an OS X host plus Windows and Linux VMs) and you want to give them 12GB each, you are already at 30+. It might sound like a crazy setup, but sometimes a compiler is only supported on one platform, or you want a replica of your client's server in one VM, etc.


No, not really. I work in a games studio and our usage on individual workstations goes above 32GB when compiling our project. Due to technical and licencing issues it's also not something that can be relegated to a remote server farm for compilation.


If it's that gigantic, then a distributed build on local servers makes a lot of sense. Much faster than each developer doing it on their own single workstation.


We do distributed builds on local workstations across the studio with Incredibuild. That's the only way we could get away with licencing for consoles. That reduces the build times (from over 40 minutes to less than 5), but the RAM usage is still very, very high.


I'm guessing this is C++ with a lot of template abuse and header interdependencies? C++ has a powerful compiler, but it's also easy to shoot yourself in the foot with it.


Yep. Also the fact that the project cross compiles on 7 different platforms from the same code base.


If you didn't need the 5k display but you did need more memory and rendering capacity, the Mac Pro has you covered, maxing out at 64 GB of RAM, or 128 GB if you use non-Apple RAM upgrades. So there's that.


Think of it as a $2500 5k monitor shipping with a computer for free.


Unfortunately you can't hook up a powerful workstation to that monitor :/


[deleted]


TDM is likely to break with this new machine. Thunderbolt 2 only has a theoretical max throughput of 20Gbps... Driving a 5K display @ 60Hz would require 98.88% of that throughput, which is unlikely to be sustained in real-world usage.

I'd be all over this if Apple can confirm that the monitor is usable by external machines.


If this holds true you should be able to: http://support.apple.com/kb/ht3924#12


I don't think that will work here - TB2 is derived from DP1.2, which only goes up to 4K@60hz. You'll have to wait for TB3 to drive an external 5K monitor from a Mac.


Like I have posted elsewhere already: 5120x2880 = 14745600 pixels

14745600pixels x 3 colours, 1 byte each = 44236800 bytes

44236800/1024/1024 = 42.19MB each frame

42.19MB per frame x 60 frames a second = 2531.25MB/s.

TB2 has a bandwidth of 20Gbps = 2560MB/s

So yeah, theoretically TB2 has enough bandwidth. But it's a very tight fit.
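
The arithmetic above can be reproduced in a few lines (a sketch of the raw payload math only; like the parent, it treats 20Gbps as 20 gibibits per second and ignores link overhead):

```python
# Raw payload needed to drive 5120x2880 at 60 Hz, 24-bit color,
# versus Thunderbolt 2's headline 20 Gbps figure.
width, height, bytes_per_pixel, fps = 5120, 2880, 3, 60

frame_bytes = width * height * bytes_per_pixel   # 44,236,800 bytes
frame_mib = frame_bytes / 1024**2                # 42.1875 MiB per frame
stream_mib_s = frame_mib * fps                   # 2531.25 MiB/s

# Treating 20 Gbps as 20 * 1024 Mib/s, as the parent does:
tb2_mib_s = 20 * 1024 / 8                        # 2560.0 MiB/s

print(frame_mib, stream_mib_s, tb2_mib_s)
```

The stream fits under the 2560 MiB/s figure with only about 1% of headroom, which is why the parent calls it a very tight fit.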


You're ignoring signalling and overhead. The actual usable bandwidth is about 17.3Gbps, thus not enough (for 60fps uncompressed stream).

DisplayPort v1.3 can deliver 25.92Gbps of usable bandwidth, thus required for 5k [1].

[1] http://www.extremetech.com/electronics/190130-displayport-1-...


You should divide 20Gbps by 9 bits (2275 MB/s) or even by 10 (2048 MB/s), not by 8, because in addition to useful payload, there are also packet headers and control packets being transferred. So it actually doesn't.


"Packet headers and control packets" are not the reason you should divide by 10. The reason is that Thunderbolt uses 8b-10b encoding, so 1 byte of data is transferred as 10 physical bits. Therefore the maximum theoretical usable bandwidth is 20e9/10/1024/1024 = 1907 MiB/sec. Then on top of that you have to deduct the overhead from packet headers and control packets, so the real-world usable bandwidth is even less than 1907 MiB/sec...
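
The 8b-10b ceiling is easy to check numerically (a sketch; the 10-bits-per-byte factor is the only Thunderbolt-specific assumption here):

```python
# With 8b-10b line coding, every payload byte costs 10 physical bits,
# so a 20 Gbps link carries at most 2e9 payload bytes per second.
link_bps = 20e9
payload_mib_s = link_bps / 10 / 1024**2            # ~1907 MiB/s usable

# A 5120x2880, 24-bit, 60 Hz stream needs:
needed_mib_s = 5120 * 2880 * 3 * 60 / 1024**2      # 2531.25 MiB/s

print(round(payload_mib_s), needed_mib_s)          # usable < needed
```

So after line coding alone, before any packet-header overhead, the link already falls short of an uncompressed 5K@60Hz stream.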


I believe 2560 is the post-coding bandwidth. Links are usually 3.125 Gbps before coding.


Doesn't TB2 have two interleaved streams, one carrying DP1.2 and the other carrying 20Gbps PCI-E? Even if the PCI-E stream has enough bandwidth for 5K in theory, the ecosystem isn't in place to use it as a display output. That's what the DP part is for.


For other people to play with this, just google "(5120 * 2880 * 3 * 60) bytes/s in Gib/s" to get the answer (just short of TB2's 20Gib/s, assuming no overheads).


At some point you could use the iMac as a screen. Can that still be done?


I would imagine as long as the video card has similar specs as the new iMac, and the computer had Thunderbolt 2, then yes (at 5k). Otherwise, it would be scaled somehow or just not work at all.


Computer for free? With a Mac everything costs money; there is hardly any way to enjoy this piece of hardware without spending money the Mac way. I enjoy my Arch Linux system every single day, and it never cost me a cent; there are no backdoors or money-sucking things going on at all. But for those who believe the marketing slogan that everything with a Mac is better, well, grab your wallet and join the idiots.


I have 8 GB of memory on my 2013 13" MacBook Air - I run 23 separate applications simultaneously, including the full Office suite (Outlook/PowerPoint/Word/Excel), VMware Fusion with Windows XP + OpenBSD running, the Dynamips routing simulator with 10 Cisco 7200 routers, Google Earth, Aperture, Pixelmator, etc. No swapping. Everything runs fine.

OS X (in my case, Mountain Lion, so an old version) does really, really well with shared libraries. 95%+ of consumer users (not pros, whose needs are obviously more rarefied, and for whom the sky is the limit) are probably fine with 8GB of memory, and it's not something I would recommend increasing unless they plan to hold onto the computer for more than 5 years.


Yosemite beta 6, just booted with only Firefox and this HN page open, already takes 2.8GB. Yosemite is really memory hungry compared to Mavericks in my experience, and future versions are going to get hungrier. Those 8GB are going to fall short within the next 2 iterations of OS X, and after paying $2500 that feels ridiculous to me.


> Yosemite is really memory hungry compared to Mavericks in my experience.

Which numbers are you reading for this? The "memory use" total includes lots of irrelevant things like files read once during boot and optimistically cached for later, so you can expect it to be nearly 100% of installed memory at all times.

You're not in a memory pressure situation until you have an actual performance problem or the swap space starts growing.

But if you are having performance problems, I'm pretty sure someone cares deeply about them and would like to know the per-process memory use.


Mountain Lion 10.8.5 - fresh boot with only Chrome and this HN page open - Used: 2.83 GB.

Once the shared libraries are in memory, memory usage grows really, really slowly for my application set (listed above) on OS X.

I'm not saying that 8 GB will last forever - I'm just saying, as a power user who has zero need for more than 8GB right now, that 8 GB on OS X will be just fine for about 95% of average consumer use. Absolutely would not recommend upgrading an iMac's memory unless you are really certain you are going to hang onto it for more than 5 years - not the best use of $200.

I'm just talking about your average family use; for vertical niches (Video Work, Graphics Design, Heavy Industrial Programs, Databases) - obviously those people will assess their situation and make a decision that is optimized for their circumstances.


Memory does not work the way you imagine it does. OSes do not optimize for keeping memory empty.


I don't think their target audience sees it the same way.

Most importantly, technology/creative professionals that care about the ram and the apple platform will gladly pay the premium for 32gb.


I agree, the specs are fairly weak; it makes me wonder if the base model can really handle 4k video editing.

That being said - the overall price considering it includes a 5k screen is pretty damn good.


Considering Dell sells a display of similar quality, I can believe it.


At least it's user-accessible. Pop in 2x8GB for $150 and you have 24GB total, rather than paying Apple $200 to get 16.


Is it upgradeable, or are the chips soldered to the mainboard?


The 27" has always been user-upgradable; the new 21.5" design isn't.


On a Mac at least you can't just do that; you need to create a certificate for yourself and sign GDB. Took me a while too.


Just an FYI, you can use MacPorts to `sudo port install gdb`, and it installs gdb as 'ggdb' without the signing mess. To use with ddd invoke as `ddd --debugger ggdb` and it works like a charm.


thank you!


Poettering might not be the best leader or the best team player, but I don't think pointing at Linus for this is off-limits. Yes, the internet is full of anonymous idiots, and that's not Linus's fault. But as leader of the community, the tone he sets is certainly not a good example; quite the contrary, many will feel empowered to be assholes to other community members.


I'm really not happy with how eager people are to go "well, that's just the internet, it's a wider problem, not our problem". Yes it is their problem, because it affects their community and the people they work with, and even if the problem did not originate within them personally, it's up to them to take a stance against it instead of using it as cover for their own criticism of Poettering.


Do you read lkml or just the media reports of Linus' cursing?


A real direct competitor to DO and Linode is Vultr.com.

My experience with them has been positive. They have slightly higher specs and a good variety of locations.


Anecdotal web hosting recommendations are worse than useless and everyone seems to have one regardless of how qualified they are to give it.


I have used Vultr's $5/month service for at least the past 6 months. It has only restarted once unexpectedly, and the CPU performance is quite impressive for that price.


This has been significantly improved in recent versions. As a young language, Go's runtime still has a lot of room for optimization.


Any links supporting progress? It has been two years. I know the community had major optimization plans. I have not been following it of late.


I wonder how long Google will be able to keep its version scheme for the Nexus program. I guess the Nexus 6 and 8/9 will be the end of the road.


It's sad, but those Photoshop skills on the cover really ruined my will to look at the document.


I get that this is probably a WIP, but you need to get yourself a proper homepage with some explanation of the product and some screen captures. Otherwise I believe nobody is going to sign up based only on your explanation of what it does.

I'd recommend you set up a good explanatory homepage, then submit it again.


Sure, I will set up a homepage. I have just started with this and am building the MVP. I'm here to get some early feedback on how other startups are managing this.

I have just set up this small MVP for a closed community, and it solves their problem of managing contacts.

