Insights from building Ungoogled-Chromium on a modern desktop machine.

A recurring question is why Chromium is so slow to build.
Here are the results of a build of Ungoogled-Chromium on a reasonably large desktop PC.
Total build time: 2 hours and 3 minutes.

CPU: Intel Core Ultra 9 285K, 24 cores, no hyperthreading.
GPU (not relevant for this build run): Nvidia RTX PRO 4000.
Memory: 64 GB DDR5-5600, dual channel (2x 32 GB).
Storage: 1 TB M.2 Gen5 NVMe.
Chassis: Shuttle SB860R8 barebone.

OS: FreeBSD 15.1-STABLE, amd64.
Desktop environment in use during the build: KDE Plasma 6.6.4.
Filesystem: OpenZFS.

Memory allocation during the run:

ZFS ARC cache: up to 30 GB (not limited)
Active memory: up to 28 GB
Inactive memory: up to 8.5 GB
Allocated swap: up to 500 MB (increases as the run progresses)

Resident sizes reach up to 2.5 GB per compile task.

As the build consumes up to 2.5 GB of resident memory per compile task at its hungriest, and memory is also needed for the filesystem cache, a machine with only 16 GB of RAM can run about 4 or 5 compile threads with ZFS limited to a 5 GB ARC, looking at a completion time of maybe 8-10 hours. With 32 GB of RAM, maybe 10 compile jobs and a 10 GB ZFS ARC, running for around 5 hours.
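A rough way to size the job count is to divide the memory left after OS and filesystem-cache reservations by the peak per-task footprint. A minimal sketch: the 2.5 GB-per-task figure comes from the run above, while the OS reservation and the formula itself are illustrative assumptions. It is deliberately conservative, since it assumes every task hits its peak at once:

```python
# Rough build-parallelism sizing from available RAM.
# Peak resident size per compile task (GB), observed in the run above.
PER_TASK_GB = 2.5

def max_jobs(ram_gb, arc_gb, os_reserve_gb=1, cores=24):
    """Jobs that fit in RAM after reserving space for the ZFS ARC and the OS.

    Conservative: assumes every compile task peaks simultaneously.
    """
    usable = ram_gb - arc_gb - os_reserve_gb
    return max(1, min(cores, int(usable // PER_TASK_GB)))

print(max_jobs(16, arc_gb=5))    # 16 GB machine, 5 GB ARC  -> 4
print(max_jobs(32, arc_gb=10))   # 32 GB machine, 10 GB ARC -> 8
print(max_jobs(64, arc_gb=30))   # the 64 GB machine above  -> 13
```

In practice not all tasks peak together, which is why the post's 32 GB estimate of 10 jobs is a bit more optimistic than this formula.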

A machine with 16 GB and the default 2 GB of swap will run out of virtual memory unless both the number of build tasks and the
filesystem cache (ZFS ARC) usage are restricted.
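One way to apply those two restrictions on FreeBSD, as a sketch: the 5 GB / 5-job numbers match the 16 GB scenario above, the `vfs.zfs.arc.max` sysctl is the OpenZFS name on FreeBSD 13 and later, and `out/Default` is an assumed GN build directory.

```shell
# Cap the ZFS ARC at 5 GB (value in bytes; OpenZFS sysctl on FreeBSD 13+).
sysctl vfs.zfs.arc.max=5368709120
# To make the cap persistent across reboots, add to /etc/sysctl.conf instead:
#   vfs.zfs.arc.max=5368709120

# Limit the build to 5 parallel compile jobs. "out/Default" is an assumed
# build directory; adjust to your actual GN output directory.
ninja -j 5 -C out/Default chrome
```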
 
With my 12 cores and 64 GB, it takes me about 33 hours.
One hour on Redcore Linux, but that difference is due to the thousands of patches needed for FreeBSD and Ninja.
No, I'll go for the binary ...
 
The metrics presented seem within reason. Given the complexity of modern browsers, I'd probably disable all parallelism in the build and expect the machine to chug away for hours/days/weeks. Intelligent rebuilds seem to be non-existent in modern complex projects; "make clean; make all" is about all anyone ever does anymore. The reasons why open up a whole other topic of discussion.
 
I built it now in just a bit over 4 hours while also working at the computer, nothing heavy, just browsing and some text editing. It is a horrible piece of software with 2 GB of source code...
 
So the problem of building anything still comes down to "it takes too long". The simple fact of being able to build Chromium on a system while its GUI remains relatively responsive is incredible to anyone who has had to submit batch builds of Ada on a VAX and "come back tomorrow" to find out the build failed.

Parallel builds are good, but just like multiprocessing, you need to pay attention to the rendezvous and to what should happen if a task fails. A lot depends on the "makefiles" being correct.
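That rendezvous-and-failure concern can be sketched outside of make as well. A minimal illustration (file names and the compile stand-in are hypothetical): run tasks in a thread pool, collect every result at the rendezvous, and only "link" if nothing failed, which is roughly what a correct makefile enforces by refusing to link against missing objects.

```python
import concurrent.futures

def compile_unit(name):
    """Stand-in for one compile task (names are hypothetical)."""
    if name == "broken.cc":
        raise RuntimeError(f"{name}: compile error")
    return f"{name}.o"

units = ["a.cc", "b.cc", "broken.cc", "d.cc"]
objects, failures = [], []

with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(compile_unit, u): u for u in units}
    # The rendezvous: wait for every task and decide what a failure means.
    for fut in concurrent.futures.as_completed(futures):
        try:
            objects.append(fut.result())
        except RuntimeError as err:
            failures.append(str(err))

# Only "link" when every task succeeded.
if failures:
    print("build failed:", failures)
else:
    print("link:", objects)
```

Whether the pool keeps handing out work after the first failure is exactly the `make` vs. `make -k` design choice.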
 