
Nearly all Canons have a small access port as part of the battery door, which you can route a power supply cable through, by design. Don't buy too cheap a dummy battery; the really cheap ones may have very bad voltage regulation. You can get ones designed to work from a USB power bank, or from mains.


6D2 is a nice cam, and one I happen to own. This cam is under active development. Locally I have ~FHD raw video working.


The entire copyright and patent system is built on the principle of forcing the release of IP; the release is time-delayed in exchange for the legal protections you gain if you opt in to the system. That is the encouragement!

Extending this to enable software access by 3rd parties doesn't feel controversial to me. The core intent of copyright and patent seems to be "when the time limit expires, everyone should be able to use the IP". But in practice you often can't, where hardware with software is concerned.


They were trendy at the time :D

I think possibly someone thought it sounded a bit like firmware?


Canon's legal team have never said anything about Magic Lantern in any context that I'm aware of.

The high-end cams need ML less: they have more features stock, and devs need access to the cam to make a good port. So higher-end cams tend to be less attractive to developers.


Of course they did. They explicitly said that if the ML team touched their high-end cameras, their legal team would come down on them.


No, they explicitly never said that. This is the origin:

https://www.canonrumors.com/inside-the-canon-eos-1d-c

> I was told by someone at Canon that they would “bring the might of its legal team” to anyone that attempts to modify at the software level, the features of an EOS-1 camera body

"Someone"'s name was never revealed.

ML was never mentioned directly.*

The ML team was never contacted by any Canon official, or by anyone speaking on their behalf.

* Of course it is logical to assume there is a connection to ML.


I don't believe they did. Where did they say that?


I wouldn't recommend the 600D if you want to do video. For stills it's perfectly acceptable. Auto-focus will feel slow compared to a modern cam. If you're going for an old / budget cam, try to stretch to the 650D or 700D; those are a newer generation of hardware.

200D is much newer, but less well supported by ML. I own this cam and am actively working on improving it. 200D has DPAF, which means considerably improved auto-focus, especially for video. Also it can run Doom.

Are there any ML features in particular you're interested in?


I'm interested in using all kinds of devices for streaming live video from live events (think music) via OBS with minimal effort. The current setup is a device (even an old iPhone will do) which provides WiFi connectivity; then another (or the same) device runs DroidCam, which then streams into a nearby laptop running OBS (typically capturing the audio needed), and this is then sent wherever we decide (Twitch, RTSP, etc.). We've tried this setup with as many as three DroidCam phones, and it is just fine on... a legacy MacBook Pro with Intel + 1500MB Iris card.

So ideally I'd imagine getting a second-hand 600D or 200D and having a similar setup. We did previously have a setup where a GoPro or mini-HDMI camera is captured and then processed by a Raspberry Pi 2/3/4, but this seems overkill compared to the DroidCam setup.

And, of course, the optics on the 600D/200D are expected to be much better than those on an iPhone or similar phone/mobile device.

Thanks for your kind attention.


With the 600D you are stuck at 1620x912 in video mode, embedded in a 1080i59.94 8-bit signal, with black borders around it, so you have to crop and maybe scale up. The 200D HDMI stream with ML is clean with MF, but AF will still draw a focus rectangle. At least it's true FHD via HDMI.

AF with 600D in liveview: Phase detection only. Focus hunting galore. 200D comes with usable DPAF.

I prefer 250D for streaming. Dual display support, no 30 minute limit for HDMI out (but cam display will go dark until some button action).


But there’s no way to skip HDMI, right? I’m not familiar with ML, but I’m wondering whether some wiring can be spared. Don’t these cameras expose WiFi of some sort?

IMHO even 800x600 is okay for most streaming needs. And particularly when sound quality is of primary importance.


The 600D doesn't have WiFi. You can stream via HDMI or USB. USB streaming for cams that don't support UVC (in general, every EOS before Digic X, and even some of those) is limited to 576p. Don't get your hopes up subscribing to EOS Webcam Utility Pro: Canon says it can deliver 720p and 1080p, but those are scaled up from 576p. And you need a dedicated mic; there's no audio support via USB or HDMI in early cams.


Thanks for taking time to explain. Perhaps I should also take time to share our setup in some sort of blog or gist.


I stand by my statement! Compare the length of the C standard to JS / ECMAScript, or C++! :)

Maaaaybe I'm hiding a tradeoff around complexity vs built-in features, but volunteers can work that out themselves later on.

You honestly don't need much knowledge of C to get started in some areas. The ML GUI is easy to modify if you stay within the lines. Other areas, e.g., porting a complex feature to a new camera, are much harder. But that's the life of a reverse engineer.


Conversely, the terseness of the C standard also means there are many more footguns and undefined behaviors. There are many things C is, but easy to pick up is not one of them. I loved C all the way up until I graduated uni, but it would be a very hard sell to get me to pick it for a project these days. To me, working with C is akin to working with assembly: you feel that you're doing real programming, but realistically there are better options for most scenarios these days.


I agree with some of what you're saying; some of the well known risks of working in C are because it's a small standard. But much of the undefined behaviour was deliberately made that way to support the hardware of the time - it's hard to be cross-platform on different architectures as a low-level language.

C genuinely is easy to pick up. It is harder to master. And you're right: for many domains there are better options now, so it may not be worthwhile mastering it.

Because it's an old language, what it lacks in built-in safety features, is provided by decades of very good surrounding tooling. You do of course need to learn that tooling, and choose to use it!

In the context of Magic Lantern, C is the natural fit. We are working with very tight memory limitations, due to the OS. We support single-core 200MHz targets (ARMv5; no out-of-order execution or other fancy tricks). We don't include the C stdlib; a small test binary can be < 1kB. Normal builds are around 400kB (this includes a full GUI, debug capabilities, all strings and assets, etc.).

Canon code is probably mostly C, some C++. We have to call their code directly (casting reverse engineered addresses to function pointers, basically). We don't know what safety guarantees their code makes, or what the API is. Most of our work is interacting with OS or hardware. So we wouldn't gain much by using a safe language for our half.


> C genuinely is easy to pick up.

I feel like this is a bit of an https://xkcd.com/2501/ situation.

C is considered easy to pick up for the average user posting HN comments because we have the benefit of years -- the average comp sci student, who has been exposed to Javascript and Python, who might not know what "pass by reference" even means... I'm not sure they're going to be considering C easy.


I've taught several different languages to both 1st year uni students, and new joiners to a technical company, where they had no programming background.

Honestly, C seems to be one of the easier languages to teach the basics of. It's certainly easier than Java or C++, which have many more concepts.

C has some concepts that confuse the hell out of beginners, and it will let you shoot yourself in the foot very thoroughly with them (much more than say, Java). But you don't tend to encounter them till later on.

I have never said getting good at C is easy. Just that it's easy to pick up.


C made a lot more sense to me after having done assembly (6502 in my case, but it probably doesn't matter). Things like passing a reference suddenly just made sense.


I agree. For me as a beginner, C was relatively easy to learn the basics of. Sure, I never went on to get familiar with all the details and become proficient in it, but the basic concepts really aren’t that hard to understand. There’s just not too much you need to wrap your head around.


C is taught as the introduction to programming in CS50x, Harvard's wildly popular MOOC for teaching programming to first-year college students and lifelong learners via the internet. Using the clang toolchain gives you much better error messages than old versions of gcc used to give. And I bet AI/LLM/copilot tools are pretty good at C given how much F/OSS is written in C.

Just to provide another data point here... that C is a little easier to pick up, today, than it was in the 1990s or 2000s, when all you had was the K&R C book and a Linux shell. I regularly recommend CS50x to newcomers to programming via a guide I wrote up as a GitHub gist. I took the CS50x course myself in 2020 (just to refresh my own memory of C after years of not using it that much), and it is very high quality.

See this comment for more info:

https://news.ycombinator.com/item?id=40690760


Everything is passed by reference in Python. Everything is passed by value in C.


Not quite true for Python but a close approximation.


Depends on which school you went to? The one I attended started with C and Lisp in the 2010s, then moved on to C++ and Java, with some Python.


Undefined behaviors -- yes. But being able to trigger undefined behavior is not a huge footgun by itself. Starting from good code examples means you are much less likely to trigger it.

Having a good, logical description of supported features, with a warning that if you do unsupported stuff things may break, is much more important than trying to define every possible action in a predictable way.

The latter approach often leads to an explosion of spec volume and gives way more opportunities for writing bad code: predictable in execution, but with problems in design and logic, which are harder to understand, maintain and fix. My 2c.


I stand by my statement! Compare the number of strings a violin has to the keys on a piano! :)


I know it's all at least semi-tongue-in-cheek, but IRL a piano's discrete, sequential keys are what make it almost inarguably the easiest instrument to learn.


That's exactly his point. Languages aren't easier to learn simply because their specification is short, any more than instruments are easier to play because they have fewer strings.


The analogy is completely invalid. Languages with small specifications are easier to learn.

It's sad that the dev, who has done great work, has to spend time defending the C language from critters living under a bridge when it's a fixed element that isn't going to change.


Accusing people who disagree w/ you of being trolls doesn't bolster your argument.


Speaking of weak arguments: that wasn't the basis of the accusation.


People don't argue with a carpenter over what tools were used to build a piece of furniture. It feels like a religious debate.


> Languages with small specifications are easier to learn.

Only if all other things are equal, which they never are.


It's not firmware :) We use what is probably engineering functionality, built into the OS, to load and execute a file from disk. We run as a (mostly) normal program on the cam's normal OS.

We build with: -Wall -Wextra -Werror-implicit-function-declaration -Wdouble-promotion -Winline -Wundef -Wno-unused-parameter -Wno-unused-function -Wno-format

Warnings are treated as errors for release builds.


Awesome!

Great work, and good luck!


Thanks, and for what it's worth, I didn't downvote you (account is too new to even do so :D ), and I agree with your main point - it's not that hard to avoid all compiler warnings if you do it from the start, and make sure it's highly visible.

You only add one at a time, so you only need to fix one at a time, and you understand what you're trying to do.

It is, however, a real bitch to fix all compiler warnings in decade-old code that targets a set of undocumented hardware platforms with which you are unfamiliar. Especially when you've just updated the toolchain from gcc 5 to 12.


Oh, don't worry about the downvotes. Happens every time someone starts talking about improving software Quality around here.

Unpopular topic. I talk about it anyway, as it's one of my casus belli. I can afford the dings.

BTW: I used to work for Canon's main [photography] competitor, and Magic Lantern was an example of the kind of thing I wanted them to enable, but they were not particularly open to the idea (control freaks).

Also, it's a bit "nit-picky," I know, but I feel that any software that runs on-device is "firmware," and should be held to the same standards as the OS. I know that Magic Lantern has always been good. We used to hear customers telling us how good it was, and asking us to do similar.

I think RED had something like that, as well. I wonder how that's going?


Okay, good, just making sure :) Fun to hear that at least some photo gear places are aware of ML!

I have done a stint in QA, as well as highly aggressive security testing against a big C codebase, so I too care a lot about quality. And you can do it in C, you just have to put in the effort.

I'd like to get Valgrind or ASAN working with our code, but that's quite a big task on an RTOS. It would be more practical in Qemu, but still a lot of effort. The OS has multiple allocators, and we don't include stdlib.

Re firmware / software: doesn't all software run on a device? So I suppose it depends what you mean by a device. Is a Windows exe on a desktop PC firmware? Is an app from your phone's store firmware? We support cams that are much more powerful than low-end Android devices. Here, the cam OS, which is on flash ROM, brings the hardware up, then loads our code from removable storage, which can even be a spinning-rust drive. It feels like they're firmware, and we're software, to me. It's not a clearly defined term.

The main reason I make the distinction is because we get a lot of users who think ML is like a phone rom flash, because that's what firmware is to most people. Thus they assume it's a risky process, and that the Canon menus etc will be gone. But we don't work that way.


Good point, and really just semantics. I guess you could say native mobile apps are “firmware,” using my criteria.

But I put as much effort into my mobile apps as I did into my firmware projects (it's been decades since I wrote firmware, BTW; the landscape is quite different these days. This is my first ever shipped engineering project[0]. Back then, we could still use an ICE to debug our software).

It just taught me to be very circumspect about Quality.

I do feel that any software (in any part of the stack) I write that affects moving parts, needs to be quite well-tested. I never had issues with firmware, but drivers are another matter. I've fried stuff that cost a lot.

[0] https://littlegreenviper.com/TF30194/TF30194-Manual-1987.pdf


Yes, it gets a bit blurry, especially given how fast solid-state storage is these days.

I think IoT has seen a resurgence in firmware devs... but regrettably not so much in quality. Too cheap to be worth it, I suppose. I can imagine a microwave could be quite a concerning product to design - there's some fairly obvious risks there!

Certainly, whatever you class ML as, we could damage the hardware. The shutter in particular is quite vulnerable, and Canon made an unusual design choice: the cam flashes an important ROM with settings at every power-off. Leaving these settings in an inconsistent state can prevent the cam from booting. We do try to think hard about contingencies and program defensively, at least for anything we release. I've done some very stupid tests on my own cams, and only needed to recover with UART access once ;)

I haven't used an ICE, but I have used SoftICE. Oh, and we had a breakthrough on locating JTAG pinouts very recently, so we might end up being able to do something similar.


You do need to be careful with the shutter. It is possible to do damage (and introduce dirt) with it.

We had to add software dust removal, because the shutter kicked dirt onto the sensor.

I’m assuming that, at some point, the sensor technology will progress to where mechanical shutters are no longer necessary.


No time like the present :)

It is actually easier to get started now, as I spent several months updating the dev infrastructure so it all works on modern platforms with modern tooling.

Plus Ghidra exists now, which was a massive help for us.

We didn't really go on hiatus - the prior lead dev left the project, and the target hardware changed significantly. So everything slowed down. Now we are back to a more normal speed. Of course, we still need more devs; currently we have 3.


There's a fun step you're missing - it's not firmware. We toggle on (presumably) engineering functionality already present in Canon code, which allows for loading a file from card as an ARM binary.

We're a normal program, running on their OS, DryOS, a variant of uITRON.

This has the benefit that we never flash the OS, removing a source of risk.

