> but if each user action that triggered a one cell update becomes an event I have to keep forever, I see my database size exploding
I think there might be a balance here: you can always garbage-collect events that are already merged into an updated value of the given object. Depending on the requirements, this GC can happen e.g. days or months after that merge...
Doesn't having an "updated value" imply mutable state? If I have some, why not have it all? Having both mutable state and a complex folding operation to maintain it is going to give me multiple sources for the same information. Which one will be authoritative? I'd expect a compromise to be worse than either extreme.
I assume you have 'events' that you will 'merge' together in a single state object (in case you want to display something). So the operation is to fetch every related event, merge, display.
Now the 'folding' can be defined as snapshotting the 'merged state'. Instead of fetching 10 events, after the folding + GC you will fetch e.g. 2 plus the folded snapshot. You save some CPU and bandwidth over time, and that's it.
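The fold + snapshot + GC idea above can be sketched in a few lines. This is illustrative only; the event shape, function names, and the "keep the last N events" policy are assumptions, not any particular framework's API:

```python
def fold(snapshot, events):
    """Merge events into a snapshot; state here is a dict of cell -> value."""
    state = dict(snapshot)
    for event in events:
        state[event["cell"]] = event["value"]
    return state

def compact(snapshot, events, keep_last=2):
    """Fold all but the newest `keep_last` events into the snapshot.

    The folded events can then be garbage-collected (immediately, or
    after a retention window, per the discussion above).
    """
    old, recent = events[:-keep_last], events[-keep_last:]
    return fold(snapshot, old), recent

# Ten single-cell updates touching three cells:
events = [{"cell": f"c{i % 3}", "value": i} for i in range(10)]

snapshot, recent = compact({}, events)

# A read now folds 2 recent events into the snapshot instead of replaying 10,
# and produces exactly the same state as replaying everything:
assert fold(snapshot, recent) == fold({}, events)
```

The snapshot is derived state, so it is never a second source of truth: it can always be rebuilt by replaying events, which addresses the "which one is authoritative" concern as long as GC only removes events already covered by a snapshot.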
"Comprehensive Counter-Terrorism Act of 1991"
"Sponsor: Sen Biden, Joseph R., Jr. [DE] (introduced 1/24/1991)"
"Cosponsor: [...] Sen Reid, Harry [NV] - 1/30/1991"
"It is the sense of Congress that providers of electronic communications services and manufacturers of electronic communications service equipment shall ensure that communications systems permit the government to obtain the plain text contents of voice, data, and other communications when appropriately authorized by law."
What is specifically objectionable to you? The quote you provided seems to be the only substantive reference to electronic communication and extremely toothless/redundant. (wouldn't telcos be assumed to be compelled to release "lawful" requests by default?)
It's not about questioning the law, it's about uncovering the inner workings of the law. My previous reading pointed to this bill as one of the earliest regulations used to require that everything be obtainable in plain text. It's a nice way to mandate backdoors without naming them explicitly, especially when the product claims to be secure from snooping. And what message does it send when the law orders you to screw your customers?
This specific bill might have died (I'm sure the PATRIOT Act superseded it), but every NSA-related revelation (and the FBI ones ~3 years ago) points in the direction that something similar is still in effect.
Not entirely true. You can always offer to provide them electricity at a lower cost than they would get from the nuclear plants. There is a cross-country electricity marketplace in the EU, and probably nobody will stop you from selling the same thing to Japan.
> "Which might explain why some people describe their purchase as watery and bland."
This is true.
The other part of the explanation is that many people just don't appreciate coffee that is anything different from what they are used to. Just watch someone from the US drinking an espresso in Europe, or the other way around. People have different opinions based on their personal habits, and calling something "watered down" means nothing without context and references.
I don't think that's it. I drank a cup of civet coffee in Bali that was watery and bland. I also bought beans from a producer in Medan who went on at length about the scam that most wild civet coffee is. That one was rich and chocolatey.
I'd love to see your business plans, upcoming negotiation agendas and planned budgets. And I am not even your competitor, as I don't know what business you are in. Imagine your competitor's eagerness for such data.
And I'd have no objections to sharing such data. It's simply not the competitive advantage in most fields. Execution and effort drives success, not business plans.
I am a big fan of PostgreSQL, but I always have issues with the cluster-configuration part, e.g. failovers, automatic election of a new master, and all the crazy administration around WAL shipping. Has there been any improvement on this, either in the code base or from a third party? What do people use to handle larger PostgreSQL clusters?
As a contrast, I really like Riak's ability to just work with one node down...
The one big improvement in 9.3 is that synchronizing with a new master has become much easier. In earlier releases you had to either use file-based WAL archiving or rsync your whole data directory over; now the built-in replication protocol can handle this for you. We also have pg_basebackup now, which is all you need to set up a new slave.
But yeah, if you need automatic failover and master election, you still need third-party tools. Some have had success with pgPool as an out-of-the-box solution (I haven't; I had severe reliability issues with pgPool. You might have better luck), others roll their own scripts.
The process isn't complicated; it just requires reading a lot of man pages and thinking ahead. But once you've got the process down, Postgres itself is reliable enough that its (admittedly limited) tools just work (which is a very good thing).
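For reference, the moving parts in a 9.3-era streaming-replication setup come down to a handful of config lines. Host names and values here are illustrative, not a recommendation:

```ini
# postgresql.conf on the master
wal_level = hot_standby         # emit enough WAL for a hot standby
max_wal_senders = 3             # allow streaming-replication connections

# postgresql.conf on the standby
hot_standby = on                # allow read-only queries while replaying WAL

# recovery.conf on the standby (pg_basebackup -R can generate this)
standby_mode = 'on'
primary_conninfo = 'host=master.example port=5432 user=replicator'
recovery_target_timeline = 'latest'   # follow the new timeline after a promotion
```

The failover step itself is then `pg_ctl promote` (or touching the trigger file) on the chosen standby; it's deciding *when* to promote, and fencing the old master, that the third-party tools exist for.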
As long as Postgres doesn't do master-master replication, failover will always be a complicated topic to deal with.
I agree - one of the areas it would be great to have some development work done around is making replication administration simpler and easier.
One feature that would be really nice to have is the ability to do a manual switchover, i.e. making the existing master into a replica and an existing replica into the new master.
Another poster mentioned repmgr, which looks good but hasn't had a release in some time ( https://github.com/2ndQuadrant/repmgr/blob/master/HISTORY ), with 2 new Postgres releases since, although there does seem to be some sporadic work on a new beta.
Postgres-XC is a niche market mostly useful for things like BI setups and highly scalable transactional clusters. It's a very complex solution to very complex problems. If you need it, you know about it.
> I think its reasonable that as a US citizen you're okay with the NSA being able to break crypto.
Because the world is divided into US citizens and bad guys? C'mon, there is a non-US world out there: people who are not bad guys, but who have lost all respect for the US tech sector.