You assume the stone is released along a circular path, but that isn't quite right. When you release the stone you extend your arm forward, changing the arc of travel.
I'm a big fan of Dustin's work; I think the programming interface electric presents is very compelling, and the benefits are really impressive.
The killer feature, for me, of electric is that it makes the network _transparent_ to me, the programmer, and makes solving complex problems concise.
I see electric as a kind of emacs - an excellent operating system that lacks a decent front-end.
Reactive programming is hard, and it really ought to be way more visual, and electric should give me those tools.
I find the frontend side of electric somewhat obtuse, and it's easy to go off the rails in ways that require understanding the underlying reactive layer, https://github.com/leonoel/missionary.
Why should I target this bizarre DSL when there are millions of React, Angular, and jQuery apps out there that work? They would be too onerous to adapt to this style of programming, and it's too annoying to carry a second framework in prod (even if you bundle an electric app inside your existing one, the runtime is very big by web standards).
I would love to see a framework built on top of electric that interfaces with any existing frontend as easily as it does now for any backend (Datomic is just one target of many for the spreadsheet Dustin built). I may try to explore this idea myself.
But is the business of Hyperfiddle building instances of this new hypermedia, or the hypermedia system itself?
I don't understand exactly what Dustin is selling me, in the end.
Is he selling this particular spreadsheet as a solution, or access to the runtime (electric 3)?
How am I supposed to buy into a new programming model if I cannot use it?
I find that this makes skimming lisp code much easier, because I can usually skip reading the bindings, read the function name and the final expression, and get the gist very quickly.
You might wonder how this differs from the example you provided. The answer is that you could sneakily intersperse anything you wanted between your imperative bindings (like a conditional return statement), so I actually have to read every line of your code, whereas in a lisp `let` I know for a fact there is nothing sneaky going on.
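A quick sketch of what I mean, with hypothetical names: inside a `let`, the bindings are inert by construction, so skimming means jumping straight to the final expression.

```clojure
;; Hypothetical example: nothing between the bindings can return early
;; or branch, so the body's last form tells you what the function does.
(defn order-total [order]
  (let [items    (:items order)
        subtotal (reduce + (map :price items))
        shipping (if (> subtotal 50) 0 5)]
    (+ subtotal shipping)))

(order-total {:items [{:price 10} {:price 20}]})
;; => 35
```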
deps.edn is becoming the default choice, yes. I interpreted the parent comment as saying "you will see advice to use leiningen (even though newer solutions exist, simply because it _was_ the default choice when the articles were written)"
You still need API routes for stuff like data-heavy async dropdowns, or anything else that's hard to express as a pure URL -> HTML, but it cuts down the number of routes you need by 90% or more.
Except it wasn't $12B to farmers. Farmers have been squeezed by monopolies on both the supplier (seed, equipment, etc) side AND the buyer side (there is often only a single buyer of grain in an area, for example) to the point that most large scale American farms are struggling to make any profit, and service their sizeable debts. The $12B will immediately be transferred to these monopolies, it will not go to farmers.
Author is on the verge of having a Clojure epiphany.
> 1. You should often be using different objects in different contexts.
This is because "data" are just "facts" that your application has observed.
Different facts are relevant in different circumstances.
The User class in my application may be very similar to the User class in your application, they may even have identical "login" implementations, but neither captures the "essence" of a "User", because the set of facts one could observe about Users is unbounded, and combinatorially explosive.
This holds for subsets of facts as well.
Maybe our login method only cares about a User's email address and password, but to support all the other stuff in our app, we have to either:
1. Pass along every piece of data and behavior the entire app specifies
2. Create another data object that captures only the facts that login cares about (e.g. a LoginPayload object, a LoginUser object, a Credential object, etc.)
Option 1 is a nightmare because refactoring requires taking into consideration ALL usages of the object, regardless of whether or not the changes are relevant to the caller.
Option 2 sucks because your Object hierarchy is combinatorial on the number of distinct _callers_.
That's why it is so hard to refactor large systems programmed in this style.
> 3. The classes get huge and painful.
The author observed the combinatorial explosion of facts!
If you have a rich information landscape that is relevant to your application, you are going to have a bad time if you try modeling it with Data Objects. Full stop.
See Rich Hickey's talks, but in particular this section about the shortcomings of data objects compared to plain data structures (maps in this case).
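A minimal sketch of the maps-over-objects alternative, using hypothetical data: one plain map holds the observed facts, and each caller projects the subset it needs instead of minting a new class per caller.

```clojure
;; Hypothetical user data: one plain map of observed facts.
(def user
  {:user/email    "a@example.com"
   :user/password "hunter2"
   :user/address  "123 Main St"
   :user/history  []})

;; A "LoginPayload" is just a projection, not a new type:
(def login-payload (select-keys user [:user/email :user/password]))
;; => {:user/email "a@example.com", :user/password "hunter2"}
```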
> Option 2 sucks because your Object hierarchy is combinatorial on the number of distinct _callers_.
I kinda like that. Suppose we do something like `let mut authn = UserLoginView::build(userDataRepository); let session = authn.login(user, pwd)`. You no longer get to have one monolithic user object (you need a separate UserDataRepository and UserLoginView), but the relationship between those two objects encodes exactly what the login process does and doesn't need to know about users. No action at a distance.
I've never used clojure, but the impression I get of its "many functions operating over the same map" philosophy is that you trade away your ability to make structural guarantees about which functions depend on which fields. It's the opposite of the strong structural guarantees I love in Rust or Haskell.
> you trade away your ability to make structural guarantees about which functions depend on which fields
You might make this trade-off if you use map keys like strings or plain keywords, but not if you use namespace-qualified keywords like ::my-namespace/id in combination with something like spec.alpha or malli; then you can easily make those structural guarantees in a way that is more expressive than an ordinary type system.
Spec & Malli look cool. But my concern is more with something like this (reusing my earlier example):
let mut authn = UserLoginView::build(userDataRepository);
let session = authn.login(user, pwd);
// vs
let session = userLogin(userDataMap);
In the first case, we know that `login` only has access to the fields in `UserLoginView`. In the second case, `userLogin` has access to every field in `userDataMap`. It's not simple to know how changes to other facets of the user entity will bleed across into logins. With `UserLoginView`, the separation is explicit, and the exchange between the general pool of user info and the specific view of it required for handling authorization is wrapped up in one factory method.
In the first case, it makes sense to unit test logins using every conceivable variation of `UserLoginView`s. In the second case, your surface area is much larger: `userDataMap` is full of details that are irrelevant to logins, so in practice you only test a small subset of the possible user data variations. As the code ages and changes, it becomes harder and harder to assess at a glance whether your test data really represents all the test cases you need or not.
I worry that Clojure-style maps don't fix the problems pointed out by the article. In a codebase that passes around big dumb data objects representing important entities (incrementally processing them, updating fields, etc), the logic eventually gets tangled. Every function touches a wide assortment of fields, and your test data is full of details that are probably inconsequential but you can't tell without inspecting the functions. I don't see how Clojure solves this without its own UserLoginView-style abstraction.
To be clear, there's nothing wrong with your approach, and many people implement systems exactly the way you are describing in Clojure using Records (which are Java classes).
I prefer not to work this way though. The spec-driven alternative could be:
(require '[clojure.spec.alpha :as s])
(require '[clojure.spec.gen.alpha :as gen])

(s/def :user.login/email string?)
(s/def :user.login/password-hash string?)

(s/def :user.login/credentials
  (s/keys :req [:user.login/email ;; specs compose
                :user.login/password-hash]))

(defn login [credentials]
  ;; DIFFERENCE: runtime validation
  {:pre [(s/valid? :user.login/credentials credentials)]}
  (authenticate (:user.login/email credentials)
                (:user.login/password-hash credentials)))

(let [user-data {:user.login/email "user@example.com"
                 :user.login/password-hash "hash123"
                 :user/address "123 Main St"
                 :user/purchase-history []}]
  ;; DIFFERENCE: extra data ignored implicitly
  (login user-data))

;; Can also pass a minimal map
(login {:user.login/email "user@example.com"
        :user.login/password-hash "hash123"})

;; or you can generate the data (only possible because spec is a runtime construct)
(let [user-data (gen/generate (s/gen :user.login/credentials))]
  ;; e.g. #:user.login{:email "cWC1t3", :password-hash "Ok85cHMP5Bhrd4Lzx"}
  (login user-data))
The drawbacks of Records are the same as for Objects: Records couple data structure to behavior (they're Java classes with methods), while spec separates validation from the data.
You say "it makes sense to unit test logins using every conceivable variation of `UserLoginView`", well, with spec you can actually *do that*:
(require '[clojure.test.check.properties :as prop])
(require '[clojure.test.check.clojure-test :refer [defspec]])

(defspec login-always-returns-session 100
  (prop/for-all [creds (s/gen :user.login/credentials)]
    (let [result (login creds)]
      (s/valid? :user.session/token result))))
This is impossible with Records/Objects - you can't generate arbitrary Record instances without custom generators.
Beyond that, spec gives you:
- function instrumentation (with clojure.spec.test.alpha/instrument)
- automatic failure-case minimization (with clojure.spec.alpha/explain + explain-data)
- data normalization / coercion (with clojure.spec.alpha/conform)
- easier refactoring: you can change specs without changing data structures
- serialization for free: maps already serialize, whereas you have to implement it for Records
Plus you get to leverage the million other functions that already work on maps, because they are the fundamental data structure in Clojure.
You just don't have to create the intermediate record, let your data be data.
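For instance, with a hypothetical user map, every one of these is a stock function from clojure.core; none of them required defining a class:

```clojure
;; Hypothetical user map, reusable with the whole core library.
(def user {:user.login/email "user@example.com"
           :user.login/password-hash "hash123"
           :user/address "123 Main St"})

(keys user)                              ;; enumerate the facts
(select-keys user [:user.login/email])   ;; project a narrower "view"
(assoc user :user/address "456 Oak Ave") ;; update a fact (persistently)
(dissoc user :user/address)              ;; drop a fact
```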
I really appreciate this comment. I have some passing familiarity with Clojure, but I don't understand it (and its motivations) fully and would like to understand it better. Thanks for the YouTube link; I'd love some reading material (essay-length preferred but anything will do) also, if you have any to recommend.
The absolute best resource I've found for educating myself about this topic is John Vervaeke's free online course "Awakening from the meaning crisis". You can search it in YouTube or Spotify.
He explains in detail exactly why a "nostalgic return to religion" cannot save us from, not just nihilism, but the entire set of crises western society is undergoing.
The crisis stems not from a loss or lack of meaning; it stems from recognizing how limited the access to meaning our forms (narratives, myths, religions) actually provide. If we fully recognize the meaning load in any event, it's endlessly connected to past and future events. Any event's local load is likewise massive. The idea that we use metaphors as meaning sinks is bizarre. Metaphors are arbitrary; meaning is not, it is specific. That is the inherent problem.
The scaffolding we use for meaning (language, myth, causality, narratives) consists of Pleistocene tools that have long overstayed their welcome. Our access to meaning suffers from a total failure of imagination about the basics.
I'm not disagreeing, but what alternatives are there? And to continue with the tool metaphor: how would we know if it's a better tool? Without a vantage point from which we could compare the tools we have now with the alternative, we might just be trading one flawed tool for another. But I'm not going to throw away a flashlight because it doesn't light up the universe either. At least with a flashlight, I can see something.
If navigating by that flashlight of folk meaning eventually drives us extinct, then we had better upgrade the tool.
The problem with meaning is the problem with the words. Get rid of them and their agentic curse that lowballs meaning. There are glyphs, movies, Navajo, Hopi, Zuni, Chorti/Yucatec etc., Chinese, Japanese, Korean.
We use landfill for communication. Western languages are terrible hacks of sense, emotion, and syntax, hacked again by Gutenberg, ASCII, the web, and now AI. It's dead.
Again, I'm sympathetic... but replacing one symbolic system with another doesn't answer the question of how any symbolic system relates to its "meaning", nor does it do so in a more accurate way. We are always in the soup of language; change the seasoning, but we will always be in the soup.
Ditch them. Movies already demonstrate what Mayan/Chinese accomplishes momentarily. It's post-representation, post-symbolic, post-metaphor. verbs only, references only. Concatenation. Humans are slaves to symbols, look at computers. We have an S&M relationship to the arbitrary and Wall Street and Silicon Valley want it this way: they've made trillions off an illusion. We either shift to external references for action-syntax or enjoy the ride the dinosaurs took down. And they had a meteor. We've got a self-made mirage of the arbitrary.
This is only true if you ignore the role of America's greatest ally, in which case, the motivations become very obvious.