
The complexity has merely moved

Posted Oct 3, 2024 16:34 UTC (Thu) by laurent.pinchart (subscriber, #71290)
In reply to: The complexity has merely moved by atnot
Parent article: Coping with complex cameras

> So it's not that cameras have gotten more complex. They're still doing exactly the same thing. It's just that where that code runs has changed and existing interfaces are not equipped to deal with that.

I think it's a bit more complicated than this. With the processing moving to the main SoC, it became possible for the main OS to have more control over that processing. I think this has led to the implementation of more complex processing that would have been more difficult (or more costly) to do if everything had remained on the camera module side. Advanced HDR processing is one such feature, for instance, and NPU-assisted algorithms such as automatic white balance are another example. Some of this would probably still have happened without the processing shifting to the main SoC, but probably at a different pace and possibly in a different direction.

In any case, the situation we're facing is that cameras now look much more complex from a Linux point of view, due to a combination of processing moving from the camera module to the main SoC, and processing itself getting more complex (regardless of whether or not the latter is a partial consequence of the former).



The complexity has merely moved

Posted Oct 3, 2024 17:37 UTC (Thu) by intelfx (subscriber, #130118) [Link] (4 responses)

> due to a combination of processing moving from the camera module to the main SoC, and processing itself getting more complex (regardless of whether or not the latter is a partial consequence of the former).

Forgive me if I'm wrong, but I always thought it was mostly the other way around, no? I.e., due to R&D advances and market pressure, image processing started to get increasingly complex (enter computational photography), and at some point during that process it became apparent that it was just all-around better to get rid of the separate processing elements in the camera unit and push things into the SoC instead.

Thus, a few years later, this trend is finally trickling down to general-purpose computers and thus Linux proper.

Or am I misunderstanding how things happened?

The complexity has merely moved

Posted Oct 3, 2024 18:49 UTC (Thu) by laurent.pinchart (subscriber, #71290) [Link] (1 response)

I think it goes both ways. I can't tell exactly when this transition started and what was the trigger, but once processing moved to the main SoC, it opened the door to more possibilities that people may not have thought of otherwise. In turn, that probably justified the transition, accelerating the move.

There are financial arguments too: in theory, at least if development costs and the economic impact on users are ignored, this architecture is supposed to be cheaper. A more detailed historical study would be interesting. Of course it won't change the situation we're facing today.

The complexity has merely moved

Posted Oct 4, 2024 19:00 UTC (Fri) by Wol (subscriber, #4433) [Link]

> I think it goes both ways. I can't tell exactly when this transition started and what was the trigger, but once processing moved to the main SoC, it opened the door to more possibilities that people may not have thought of otherwise. In turn, that probably justified the transition, accelerating the move.

This seems to be pretty common across a lot of technology. It starts with general-purpose hardware, moves to special-purpose hardware because it's faster/cheaper, the special-purpose hardware becomes obsolete/ossified, it moves back to general-purpose hardware, rinse and repeat. Just look at routers or modems ...

Cheers,
Wol

The complexity has merely moved

Posted Oct 3, 2024 19:55 UTC (Thu) by excors (subscriber, #95769) [Link] (1 response)

From what I vaguely remember, back when Android only supported Camera HAL1 (which used the old-fashioned webcam model: the application calls setParameters(), startPreview(), takePicture(), and eventually gets the JPEG in a callback and can start the process again), there was already a trend to put the ISP on the main SoC. It's almost always cheaper to have fewer chips on your PCB, plus you can use the same ISP hardware for both front and rear cameras (just reconfigure the firmware when switching between them), and you can use the same RAM for the ISP and for games (since you're not going to run both at the same time), etc., so there are significant cost and power-efficiency benefits.
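
To make that old model concrete, here is a minimal, purely illustrative Java fragment of the flow using the deprecated android.hardware.Camera API; the SurfaceHolder, the saveJpeg() helper and all error handling are assumed rather than shown, and this is not a complete app:

    // One global parameter bag, a running preview, then a single
    // takePicture() that hands back the finished JPEG in a callback.
    void captureOnce(SurfaceHolder holder) throws java.io.IOException {
        Camera camera = Camera.open();              // default (usually rear) camera
        Camera.Parameters params = camera.getParameters();
        params.setPictureFormat(ImageFormat.JPEG);
        camera.setParameters(params);               // all settings applied at once
        camera.setPreviewDisplay(holder);
        camera.startPreview();

        camera.takePicture(null, null, (jpegData, cam) -> {
            saveJpeg(jpegData);                     // hypothetical helper
            cam.startPreview();                     // required before the next shot
        });
    }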

The early SoC ISPs were quite slow and/or bad so they'd more likely be used for the front camera, while the higher-quality rear camera might have a discrete ISP, but eventually the SoCs got good enough to replace discrete ISPs on at least the lower-end phones. (And when they did have a discrete ISP it would probably be independent of the camera sensor, because the phone vendor wants the flexibility to pick the best sensor and the best ISP for their requirements, not have them tied together into a single module by the camera vendor.)

Meanwhile phone vendors added proprietary extensions to HAL1 to support more sophisticated camera features (burst shot, zero shutter lag, HDR, etc), because cameras were a big selling point and a great way to differentiate from competitors. They could implement that in their app/HAL/drivers/firmware however they wanted (often quite hackily), whether it was an integrated or discrete ISP.

Then Google developed Camera HAL2/HAL3 (based around a pipeline of synchronised frame requests/results) to support those features properly, putting much more control on the application side and standardising a lower-level interface to the hardware. I'm guessing that was harder to implement on some discrete ISPs that weren't flexible enough, whereas integrated ones typically depended more heavily on the CPU, so they were already more flexible. It also reduced the demand for fancy new features on ISP hardware since the application could now take responsibility for them (using CPU/GPU/NPU to manipulate the images efficiently enough, and using new algorithms without the many-year lag it takes to implement a new feature in dedicated hardware).
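
For contrast, a similarly illustrative fragment of the request-based model that HAL3 exposes through the android.hardware.camera2 API; cameraDevice, captureSession, jpegSurface and handler are assumed to have been set up already (openCamera() and createCaptureSession() are omitted):

    // Each frame is an explicit CaptureRequest carrying its own settings,
    // and each result comes back with full per-frame metadata.
    CaptureRequest.Builder builder =
            cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
    builder.addTarget(jpegSurface);                 // assumed output Surface
    builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_OFF);
    builder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 10_000_000L);  // 10 ms, in ns
    builder.set(CaptureRequest.SENSOR_SENSITIVITY, 400);            // ISO 400

    captureSession.capture(builder.build(), new CameraCaptureSession.CaptureCallback() {
        @Override
        public void onCaptureCompleted(CameraCaptureSession session,
                                       CaptureRequest request, TotalCaptureResult result) {
            // Per-frame metadata: the exposure time the hardware actually used.
            Long exposure = result.get(CaptureResult.SENSOR_EXPOSURE_TIME);
        }
    }, handler);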

It was a slow transition though - Android still supported HAL1 in new devices for many years after that, because phone vendors kept using cameras that wouldn't or couldn't support HAL3.

So, I don't think there's a straightforward causality or a linear progression; it's a few different things happening in parallel, spread out inconsistently across more than a decade, applying pressure towards the current architecture. And (from my perspective) that was all driven by Android, and now Linux is having to catch up so it can run on the same hardware.

The complexity has merely moved

Posted Oct 3, 2024 20:11 UTC (Thu) by laurent.pinchart (subscriber, #71290) [Link]

> And (from my perspective) that was all driven by Android, and now Linux is having to catch up so it can run on the same hardware.

It started before Android as far as I know. Back in the Nokia days, the FrankenCamera project (https://graphics.stanford.edu/papers/fcam/) developed an implementation of computational photography on a Nokia N900 phone (running Linux). A Finnish student at Stanford University participated in the project and later joined Google, where he contributed to the design of the HAL3 API based on the FCam design.

