
Re: bulk builds incompatible with changes



> Hi,
>
> If the scan phase has completed, you can:
>
> 1. interrupt whatever's running
> 2. mv bulklog bulklog.old
> 3. update tree
> 4. run bulkbuild-restart
>
> It will reuse scan results from bulklog.old if possible -- only
> rescans the packages affected by files that have changed.
>
> (Of course, if bsd.pkg.mk changed then it rescans everything, but
> that's part of why, e.g., we don't do infrastructure pullups on
> release branches.)
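
Concretely, that's something like the following (a minimal sketch
assuming a cvs checkout in /usr/pkgsrc and pbulk's tools in $PATH;
adjust paths and VCS for your layout):

    # first interrupt the running build (^C or kill the pbulk process)
    mv bulklog bulklog.old                # keep old scan results around
    (cd /usr/pkgsrc && cvs update -dP)    # update the tree
    bulkbuild-restart                     # reuses unchanged results from bulklog.old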

This is generally what I do already. Slower architectures don't run their own scans anyway (not for lack of trying).

What I'm really asking is: how do we programmatically know which packages need to be rebuilt when their dependencies are updated?

For instance, perl was just updated to 5.40.2 in 2025Q1, and most perl packages probably don't need to be rebuilt for that, but what if something like Rust were updated?

Right now, I rescan using either an amd64 or an aarch64 system, then share the meta/ files with the other architectures.
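
As a strawman for the "what needs rebuilding?" question: if you keep
the per-package summaries from two scans, a comm(1)/grep pipeline gets
you a first approximation. Here pkgs.old and pkgs.new are assumed to
be one line per package ("PKGNAME depend1 depend2 ...") flattened out
of the respective bulklog/meta directories -- that extraction step is
the part I'm hand-waving:

    sort pkgs.old > old.sorted
    sort pkgs.new > new.sorted
    # entries that are new, or whose line changed since the last scan
    comm -13 old.sorted new.sorted | awk '{print $1}' > changed
    # naive one-level reverse-dependency pass: anything listing a
    # changed package among its depends needs a rebuild too
    grep -F -f changed new.sorted | awk '{print $1}' >> changed
    sort -u changed

That's only one level of reverse dependencies (you'd loop it to a
fixed point for the full closure), and plain substring matching is too
eager, but it shows the shape of the data we already have.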

Jonathan Perkin wrote here:

https://mail-index.netbsd.org/tech-pkg/2025/03/20/msg030705.html

asking what kind of tooling improvements we'd like to see. One thing I'd find amazingly useful is a continuously updating scanning mechanism. With a pbulk scan we get a list of packages to be built; that's great for quarterly builds, but it's not practical to rescan the whole tree for every package update.

But what if we had a tool that ran whenever a package is updated and kept that list current? We could then use simple tooling to queue up builds for the various archs and have a continuously updated set of binaries tracking pkgsrc.
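
A very rough sketch of the hook I have in mind -- everything here is
made up: rebuild-closure is a hypothetical helper that would print the
updated package plus its reverse-dependency closure (derived from the
last full scan), and the queue path is arbitrary:

    #!/bin/sh
    # hypothetical commit hook: $1 is the PKGPATH that was just updated
    pkgpath="$1"
    rebuild-closure "$pkgpath" >> /bulk/queue/pending
    sort -u -o /bulk/queue/pending /bulk/queue/pending   # dedupe the queue

The per-arch builders would then just poll the queue, build whatever
is listed, and drop entries once binaries are uploaded.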

Just some thoughts...

John


