cHalgan's comments | Hacker News

My understanding of this whole GAE pricing story is that Google decided GAE is not a strategic business, and that the business unit needs to break even or it will be canceled. Does this make sense?


If that were the case, wouldn't apps like Google Calendar have been cancelled a long time ago? I doubt the ads on Google Calendar are sufficient to sustain the number of users it has on a daily basis.


Maybe Google Apps (Calendar) is part of a strategic vision (connected with Chrome OS)?


Nope, Google doesn't work that way. Business units don't get their own P&Ls; everything is part of the overall financial strategy.


My father has a saying: running a business is a marathon. From that perspective, the first few hundred meters (cloning the initial version of a web site) mean nothing.


I'm not sure, but if you want to build a world-class team you need to pay more than market rate, plus offer an excellent team and work environment, more responsibility, more impact on the world, and equity. There is no free lunch. Really.

Did Facebook become successful because they went cheap when hiring a VP of engineering early in the game? (Answer: no. They recruited the top.)

Yes, you can get lucky and build a successful company by hiring people fresh out of college for less than market rate who dream about being rich.

The point is the following: DON'T HIRE BAD DEVELOPERS.

Unfortunately, good developers are good at math and have been around the block, so they will not accept a salary cut plus a questionable equity stake. Yes, you can get lucky, but there are so many other unknowns when you run a business that you should limit the unknowns to a minimum.


He is right. $3.45T is the total cost of the bailout: $2T in emergency Fed loans (you can hand over a shit sandwich and exchange it for real $$), $700B for TARP, $300B for the Hope Now program, and $310B for Fannie/Freddie and AIG. And yes... we also added $140B in tax breaks for banks.
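Those figures do check out arithmetically; a quick Python tally (the numbers are exactly the ones quoted above, in billions):

```python
# Sanity check of the bailout tally from the comment above.
# Figures (in billions of dollars) are as quoted by the commenter.
components = {
    "Emergency Fed loans": 2000,
    "TARP": 700,
    "Hope Now program": 300,
    "Fannie/Freddie and AIG": 310,
    "Tax breaks for banks": 140,
}

total_billions = sum(components.values())
print(f"Total: ${total_billions / 1000:.2f}T")  # -> Total: $3.45T
```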


I have some storage experience, and I can tell you that Dropbox is not a new idea at all. Even AOL had something like this. I'm also convinced that they didn't win because of simplicity. Dropbox simply delivered what they promised.

It was amazing for me to watch companies in the "cloud storage" space just not delivering. Their servers were down, their clients used 99% of the CPU, etc. It's fine for a service to have some bugs, but these companies were adding features (mobile apps, integrations with who knows what) before even making the basic stuff work.

Just look at Google Docs. Why can't they make sure documents don't intermittently disappear (you get a 404, but after a retry they are back) before they change the interface? Why? I really don't know.


We were developing data-stream processing software for companies in the financial sector. We were unable to get the performance and the controllability/measurability our customer wanted using Java, so we ended up with C/C++.


I do high-frequency trading. My feed handling and order routing logic is in C++ for the performance benefits. But my backtesting is in q/kdb+, my loading scripts are in Python, and my administrative tasks are in bash.

Some of my co-workers use Java because their models aren't as sensitive to latency as mine. My best friend does options pricing in VB/Excel. And I know tons of competitors who use R, MATLAB, OCaml, and Haskell.

There are tons of languages used in finance.


Could you please give me a specific example of how you use Q (something more towards the language side, rather than the database side)? Also: How did you learn Q? I've been playing with it, and I've read "Q for Mortals", but I'd like to do more, although not in the field of finance. I find Arthur Whitney's journey from APL to J to A+ to K/Q quite interesting, and I'm trying to figure out just how powerful Q really is. Thanks in advance.


I work on a high-frequency trading desk at a large bank. We do not use q for our trading models, but just like the writer of the parent comment, we use it for back-testing / regression-testing our models. Our quants also like writing / specifying the algo model in q, and I or other developers then implement it in the language of our trading system.

There are many resources for learning q on code.kx.com. I also went on a training course arranged by First Derivatives (they are the only vendor offering formal training in q). I would say the best way to learn it is to practice! Use code.kx.com as a reference and download an evaluation copy of the runtime. You should be able to find open-source/free editors for q (QInsightPad; there is also an Eclipse plugin). Set up your environment and try to tackle the Project Euler problem set in q.

Alternatively, get a linear algebra or machine learning textbook and attempt to solve the exercises in q.

q may seem a little terse, but it is extremely expressive and once you get the hang of the syntax and error handling, it is a joy to use.
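To illustrate the kind of bite-sized exercise meant by "tackle Project Euler", here is problem 1 in Python rather than q (so it can be tried without the kdb+ runtime); in q the same idea compresses to a one-liner over a generated list. The function name and default limit are my own:

```python
# Project Euler problem 1: sum the natural numbers below 1000
# that are multiples of 3 or 5. A typical bite-sized exercise
# for practicing a new language, as suggested above.
def euler1(limit: int = 1000) -> int:
    return sum(n for n in range(limit) if n % 3 == 0 or n % 5 == 0)

print(euler1())  # -> 233168
```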


> I find Arthur Whitney's journey from APL to J to A+ to K/Q quite interesting

Arthur Whitney didn't do APL or J; those were from Kenneth Iverson, with Roger Hui helping out on the latter. A+ was Arthur's implementation of APL, from what I understand. K is entirely ASCII (none of the special APL characters), and q added reserved words plus the integrated kdb+ database.

> How did you learn Q?

I learned q as a quant for a trading desk that used it for most tasks. I've been using it ever since because it's very expressive and has great performance.


"Arthur Whitney didn't do APL or J; those were from Kenneth Iverson, with Roger Hui helping out on the latter."

I am familiar with the history. I meant "journey" as in "progression through APL and APL-like languages". Iverson showed APL to Whitney when he was only 11 years old. Whitney created the first version of J, but then moved on, leaving it to Hui.

"Work began in the summer of 1989 when I [Ken Iverson] first discussed my desires with Arthur Whitney. He proposed the use of C for implementation, and produced (on one page and in one afternoon) a working fragment that provided only one function (+), one operator (/), one-letter names, and arrays limited to ranks 0 and 1, but did provide for boxed arrays and for the use of the copula for assigning names to any entity. I showed this fragment to others in the hope of interesting someone competent in both C and APL to take up the work, and soon recruited Roger Hui, who was attracted in part by the unusual style of C programming used by Arthur, a style that made heavy use of preprocessing facilities to permit writing further C in a distinctly APL style. Roger and I then began collaboration on the design and implementation of a dialect of APL (later named J by Roger) ..." - from Hui's "Remembering Ken Iverson", referencing Iverson's "A Personal View of APL" (http://keiapl.org/rhui)

In Appendix A, on that same page, you can find Whitney's code. It shows how differently he thinks about coding, and is likely a good example of Q's roots.


Exactly. All these claims that memory is "cheap" are misguided. The hardest thing is optimizing your program for appropriate usage of non-CPU resources (inter-processor communication, memory, disk, etc.).


C/C++ is just faster. You can tune it. You know what is happening. Another very important thing with C is that you have clear control over how many resources your program uses. And the most important resource is memory.


Memory? Really? At ~$7 per Gigabyte of RAM I doubt that.


I don't know how relevant this is to computational finance, but in scientific computing you can use up an arbitrary amount of RAM in dimensionality-cursed fields. For instance, in neutron transport theory, the governing equation is in 7 continuous variables (or 6 in steady state), so that if you want your mesh to be twice as fine, your memory usage increases by 2^7 (or 2^6).

And, just so we're clear, that's multiplicative, not additive. So you're using 2^7 times as much memory as you were using before.
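To make the multiplicative scaling concrete, here is a back-of-envelope sketch in Python; the 1 GB base size is an illustrative made-up number, not from any real transport code:

```python
# Back-of-envelope memory scaling for a d-dimensional mesh:
# halving the spacing doubles the point count in each dimension,
# so total memory grows by a factor of 2**d per refinement level.
def refined_memory_gb(base_gb: float, dims: int, refinements: int) -> float:
    return base_gb * (2 ** dims) ** refinements

# A hypothetical 1 GB steady-state problem (6 variables), one refinement:
print(refined_memory_gb(1.0, 6, 1))  # -> 64.0
# The time-dependent problem (7 variables), one refinement:
print(refined_memory_gb(1.0, 7, 1))  # -> 128.0
```

Two refinements of the 7-variable problem would already need 16,384 times the original memory, which is why a linear constant-factor reduction from the implementation language can decide whether a job fits on a machine at all.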


So memory costs grow linearly. Does your memory usage? If not, then any linear reduction in memory usage gained by switching languages can be very significant. It can also be the difference between being able to run a job on your 64GB machines and needing to petition for more. Memory only costs $7/GB so long as it fits into the rest of your hardware.


I think he was specifically referring to cache. You still take a large performance hit when you have to go to main memory, and IIRC Nehalem has a 12MB cache.


Actually, this is the end of newspapers and any form of newspaper conglomerate on the internet. Things like TechCrunch and HuffPost are transitions toward the new model.

It seems the future is that consumers will want to read content written by a particular writer, not content from some conglomerate site. That writer (blogger, scientist, professor, etc.) will be able to monetize.


You make an excellent point. I used to read Joel Spolsky and Russinovich's blog quite often. I would actually pay to do this.

The main problem (and difference) I see with blogs vs. traditional journals is editors. The majority of blogs from technologists, scientists, and the like do not have an editor at hand, which reduces the writing quality of their articles.

Now may be a good time to set up a business connecting editors with writers.


The following struck me as very strange: "Hurd borrowed heavily from the founders' playbook". Really? How? Where? Why? WTF?

Do any of the HP alumni here think that Hurd was an "HP way" CEO?


The assumption of "alumni" status amuses me, somehow. :)

But the answer is no.

