
Re: [meta-rockchip] defconfig alternatives

Yann Dirson
 

= "Hi Trevor,

On Wed, Mar 24, 2021 at 01:41, Trevor Woerner <twoerner@...> wrote:

On Tue 2021-03-23 @ 12:59:24 PM, Yann Dirson wrote:
Hi Trevor,

On Mon, Mar 22, 2021 at 16:50, Trevor Woerner <twoerner@...> wrote:
BTW, I'm also unclear on what to do next to better support those boards: with
the default kernel config only a subset of the hardware is supported, and for
state-of-the-art hw support we'll also need patches not yet in the upstream
kernel (from e.g. armbian and libreelec).

I feel it would be good to provide defconfig files for those machines, but
then there are several options to handle that. Would a minimal hw-focused
defconfig suitable for `KCONFIG_MODE = "--allnoconfig"` be a good option?
I feel exactly the same way.

By default, all arm64 kernels are configured with the single generic in-kernel
arm64 defconfig. That gives me a kernel that is over 11MB in size and includes
all sorts of drivers that are useless for the board at hand.

I've been working off-and-on on a mechanism for meta-rockchip that would allow
users to decide between the default in-kernel arm64 defconfig (which would
be selected by doing nothing) or using a leaner defconfig that I have been
tweaking specifically for each board. Currently I only have a lean defconfig
for rock-pi-4b, but it was my hope to generate defconfigs for all supported
boards.

Ideally I had wanted to leverage the linux-yocto kmeta mechanism to generate
defconfigs dynamically based on the specific machine and specific user
preferences, but that didn't go as smoothly as I was hoping, then I got
distracted by other things.

I had created a spreadsheet with a comparison between the various boards that
would have been a basis for the individual kmeta pieces. Maybe I'll find some
more time to poke at it later this week. I could also push my WIP stuff
somewhere if you'd like to take a look.

In any case, my point is, I'm very interested in something better than what
currently exists :-)
On my side I have a minimal defconfig for our own board, which is very similar
to the nanopi-m4 and could be used as a starting point for the latter.


One thing that I'd like to keep clear in meta-rockchip is to always allow the
user to choose between upstream and "extras". My feeling is: the simplest
build, if the user does nothing explicit, will always pull from pure upstream
with no out-of-tree patches or vendor pieces. But I'm not opposed to having
a mechanism whereby if the user does something explicit, they can choose to
use a vendor tree or make use of out-of-tree patches for various things.
One possibility would be using a KERNEL_CONFIG_VARIANT variable, whose values
would select consistent sets of KBUILD_DEFCONFIG + KCONFIG_MODE +
SRC_URI_append. Standard variants could include "mainline" as the default,
and maybe "customhw", which would bring in just the hw features for the board
in allnoconfig mode.

Or maybe we could try to fit such a selection mechanism in the PACKAGECONFIG
system, but I'm not sure it would really fit.
The above (if I'm reading it correctly) sounds quite similar to something I
had already started a while back. So I'll go ahead and publish this
work-in-progress. Maybe if I'm lucky it might spark some conversation with
other BSP maintainers.

https://github.com/twoerner/meta-rockchip__twoerner/tree/rockchip-kernel-config-WIP

Here is the text I've added to the README, which I think helps clarify some of
my points:

Kernel configuration:
--------------------
When it comes to configuring the kernel, allow the user to choose between:
1. using the in-kernel defconfig
2. using an in-layer defconfig + config fragments

The in-kernel defconfig is a very generic configuration meant to build a
kernel that could (theoretically) be run on a wide variety of devices of
the same architecture. I.e. a kernel built for one aarch64 machine (e.g.
the Qualcomm-based DragonBoard 410c) could be used without modification on
a completely different aarch64 machine (e.g. an Amlogic-based Odroid-C4). As
you can imagine, the in-kernel configuration generates a very large kernel.
Currently the in-kernel defconfig produces a kernel that is roughly 12MB.

The in-layer defconfig + config fragments is meant to trim down the kernel
configuration to remove all the hardware settings that aren't relevant to the
specific MACHINE being built. I.e. a kernel built for the rock-pi-4b wouldn't
include, for example, Qualcomm-specific drivers or code.

Currently, option #2 is only available for the following MACHINE(s):
- rock-pi-4b

The user indicates their intent via the RK_KERNEL_CONFIG_TYPE variable. If
the user does nothing, the default behaviour is to use the in-kernel
defconfig. If the user sets
RK_KERNEL_CONFIG_TYPE = "inlayer"
then the in-layer defconfig + config fragments will be used.

At this point I don't have everything that I'm wishing for. I had started to
try to add everything that I've wanted, but it wasn't working, so I pulled
back and only committed the parts that I was able to get working.

Right now the user can toggle between the generic in-kernel defconfig and a
leaner defconfig that I've defined, by setting the RK_KERNEL_CONFIG_TYPE
variable (in local.conf, for example). So far I've only done that for the
rock-pi-4b, but ideally I'd add others as time goes on.
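
For instance, a minimal local.conf sketch (rock-pi-4b is used here simply
because it is the only MACHINE with an in-layer defconfig so far):

MACHINE = "rock-pi-4b"
# opt out of the generic in-kernel arm64 defconfig
RK_KERNEL_CONFIG_TYPE = "inlayer"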

I think it'll always be good to allow users to choose between the in-kernel
defconfig and something custom. We'll always want to be able to say "does it
work with the in-kernel defconfig?".

But better yet, instead of one big monolithic defconfig per board, ideally the
meta-rockchip BSP layer would contain a whole bunch of little kernel config
fragments for turning on just one thing. For example, there would be a kernel
config fragment for turning on basic Rockchip support, another one to enable
the RK808 pmic, and another one for the RK805 pmic. Other config fragments
would enable various ethernet options, wifi, bluetooth, etc. One would enable
the ES8388 audio codec (found on the rock2-square) and another would enable
just the ES8316 audio codec (the one found on the rock-pi-4).

Then, various parts of the configuration would enable the relevant kernel
config fragments. Simply selecting, for example, rock-pi-e would pull in
include/rk3328.inc, which would bring in basic rockchip/rk3328 support and
some other default things. The rock-pi-e.conf would pull in the correct
networking/bt options, and select the RK805 pmic. Eventually all the little
fragments necessary to generate the whole defconfig for this board would be
pulled in.
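
As a sketch of what one such fragment and its wiring could look like (the
file names, the bbappend, and the assumption that linux-yocto is the kernel
recipe are hypothetical, and the CONFIG symbols are from memory of the
mainline Kconfig, so double-check them against the tree):

# recipes-kernel/linux/files/rk805.cfg -- hypothetical per-PMIC fragment;
# in mainline the RK805 is handled by the rk808 MFD driver
CONFIG_MFD_RK808=y
CONFIG_REGULATOR_RK808=y
CONFIG_RTC_DRV_RK808=y

# recipes-kernel/linux/linux-yocto_%.bbappend -- hypothetical wiring that
# lets the rock-pi-e machine pull the fragment in (old-style overrides,
# as used on dunfell)
FILESEXTRAPATHS_prepend := "${THISDIR}/files:"
SRC_URI_append_rock-pi-e = " file://rk805.cfg"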

That's the dream, anyway :-/
That sounds fine :)

I think we can even do something like this with just standard-looking
overrides and no specific anonymous Python. I'm thinking of something like
(including non-arm things; after all, there's no reason to reserve such a
mechanism to the arm/rk world):

# how the kernel defconfigs are named
KBUILD_DEFCONFIG_inkernel = "defconfig"
KBUILD_DEFCONFIG_inkernel_x86-64 = "x86_64_defconfig"
# how the layer defconfigs are named
KBUILD_DEFCONFIG_inlayer = "defconfig"

RK_KERNEL_CONFIG_TYPE = "inlayer"

KBUILD_DEFCONFIG = "${KBUILD_DEFCONFIG_${RK_KERNEL_CONFIG_TYPE}}"

RK_KERNEL_CONFIG_URIS_inkernel = ""
RK_KERNEL_CONFIG_URIS_inlayer = "file://defconfig file://soc.cfg file://board.cfg"

SRC_URI_append = " ${RK_KERNEL_CONFIG_URIS_${RK_KERNEL_CONFIG_TYPE}}"


Then we could have in the recipe files:
- a single defconfig for all Rockchip machines
- per-SoC fragments, e.g. rk3399/soc.cfg
- per-machine fragments, e.g. nanopi-m4/board.cfg

How does that sound?
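
In terms of layout, that could map onto the kernel recipe's file search path
roughly like this (a sketch: the linux-yocto directory name assumes that's
the recipe being appended, and the rk3399 subdirectory is only searched if
the SoC name is in MACHINEOVERRIDES, as the SoC include files typically
arrange):

recipes-kernel/linux/linux-yocto/
    defconfig            # shared by all Rockchip machines
    rk3399/soc.cfg       # per-SoC fragment
    nanopi-m4/board.cfg  # per-machine fragment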


Technically, this information could be gleaned from the device tree for this
board… :-S

Then we'll need to take a look at all the DT overlays to see how to
incorporate them as well. Most of these boards have the "Raspberry Pi" 40-pin
interface, so users will expect to be able to reconfigure the pins for the
various alternate devices.
--
Yann Dirson <yann@...>
Blade / Shadow -- http://shadow.tech


[yocto-autobuilder-helper][dunfell V2 15/15] config: build and test SDKs when using package_deb

Steve Sakoman
 

From: Ross Burton <ross@...>

Signed-off-by: Ross Burton <ross.burton@...>
Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit b38664d2db940d2ef3238fdf0f2353162e120681)
Signed-off-by: Steve Sakoman <steve@...>
---
config.json | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/config.json b/config.json
index 0a5068a..e77a8fe 100644
--- a/config.json
+++ b/config.json
@@ -559,8 +559,8 @@
"pkgman-deb-non-deb" : {
"MACHINE" : "qemux86",
"PACKAGE_CLASSES" : "package_deb",
- "BBTARGETS" : "core-image-sato core-image-sato-dev core-image-sato-sdk core-image-minimal core-image-minimal-dev",
- "SANITYTARGETS" : "core-image-minimal:do_testimage core-image-sato:do_testimage core-image-sato-sdk:do_testimage"
+ "BBTARGETS" : "core-image-sato core-image-sato-dev core-image-sato-sdk core-image-minimal core-image-minimal-dev core-image-sato:do_populate_sdk",
+ "SANITYTARGETS" : "core-image-minimal:do_testimage core-image-sato:do_testimage core-image-sato-sdk:do_testimage core-image-sato:do_testsdk"
},
"pkgman-non-rpm" : {
"MACHINE" : "qemux86",
--
2.25.1


[yocto-autobuilder-helper][dunfell V2 14/15] config.json: Split reproducibility tests into their own target

Steve Sakoman
 

From: Richard Purdie <richard.purdie@...>

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit 97e0979d6eb0300951445bb2cc5eda315681302e)
Signed-off-by: Steve Sakoman <steve@...>
---
config.json | 30 +++++++++++++++++++++++++++++-
1 file changed, 29 insertions(+), 1 deletion(-)

diff --git a/config.json b/config.json
index cedcef7..0a5068a 100644
--- a/config.json
+++ b/config.json
@@ -171,7 +171,7 @@
},
"step2" : {
"shortname" : "OE Selftest",
- "EXTRACMDS" : ["${SCRIPTSDIR}/checkvnc; OEQA_DEBUGGING_SAVED_OUTPUT=${BASE_SHAREDDIR}/pub/repro-fail/ DISPLAY=:1 oe-selftest --skip-tests distrodata.Distrodata.test_checkpkg buildoptions.SourceMirroring.test_yocto_source_mirror devtool.DevtoolAddTests.test_devtool_add_npm recipetool.RecipetoolTests.test_recipetool_create_npm -T machine -T toolchain-user -T toolchain-system -j 15"],
+ "EXTRACMDS" : ["${SCRIPTSDIR}/checkvnc; OEQA_DEBUGGING_SAVED_OUTPUT=${BASE_SHAREDDIR}/pub/repro-fail/ DISPLAY=:1 oe-selftest --skip-tests distrodata.Distrodata.test_checkpkg buildoptions.SourceMirroring.test_yocto_source_mirror devtool.DevtoolAddTests.test_devtool_add_npm recipetool.RecipetoolTests.test_recipetool_create_npm reproducible -T machine -T toolchain-user -T toolchain-system -j 15"],
"ADDLAYER" : ["${BUILDDIR}/../meta-selftest"]
},
"step3" : {
@@ -179,6 +179,16 @@
"EXTRACMDS" : ["if [ `which oe-pylint` ]; then mkdir -p ${HELPERRESULTSDIR}/${HELPERTARGET}; oe-pylint > ${HELPERRESULTSDIR}/${HELPERTARGET}/pylint.log || true; fi"]
}
},
+ "reproducible" : {
+ "MACHINE" : "qemux86-64",
+ "SDKMACHINE" : "x86_64",
+ "step1" : {
+ "shortname" : "Reproducible Selftest",
+ "EXTRACMDS" : ["${SCRIPTSDIR}/checkvnc; OEQA_DEBUGGING_SAVED_OUTPUT=${BASE_SHAREDDIR}/pub/repro-fail/ DISPLAY=:1 oe-selftest -r reproducible -j 1"],
+ "ADDLAYER" : ["${BUILDDIR}/../meta-selftest"]
+
+ }
+ },
"trigger-build" : {
"SDKMACHINE" : "x86_64",
"MACHINE" : "qemux86",
@@ -725,6 +735,24 @@
"oe-selftest-centos" : {
"TEMPLATE" : "selftest"
},
+ "reproducible" : {
+ "TEMPLATE" : "reproducible"
+ },
+ "reproducible-ubuntu" : {
+ "TEMPLATE" : "reproducible"
+ },
+ "reproducible-debian" : {
+ "TEMPLATE" : "reproducible"
+ },
+ "reproducible-fedora" : {
+ "TEMPLATE" : "reproducible"
+ },
+ "reproducible-opensuse" : {
+ "TEMPLATE" : "reproducible"
+ },
+ "reproducible-centos" : {
+ "TEMPLATE" : "reproducible"
+ },
"check-layer" : {
"NEEDREPOS" : ["poky", "meta-gplv2", "meta-mingw"],
"step1" : {
--
2.25.1


[yocto-autobuilder-helper][dunfell V2 13/15] scripts/run-config: Disable output buffering

Steve Sakoman
 

From: Richard Purdie <richard.purdie@...>

Line buffering (bufsize=1) is unavailable with binary mode so use unbuffered
mode instead. This fixes python runtime warnings.

[YOCTO #14093]

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit c21732937c89f7b13a4f8a9a02d7fcb15a4bad2d)
Signed-off-by: Steve Sakoman <steve@...>
---
scripts/run-config | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/scripts/run-config b/scripts/run-config
index aab52c1..8ed88cf 100755
--- a/scripts/run-config
+++ b/scripts/run-config
@@ -193,7 +193,7 @@ def bitbakecmd(builddir, cmd, report, stepnum, stepname, oeenv=True):

flush()

- with subprocess.Popen(cmd, shell=True, cwd=builddir + "/..", stdout=subprocess.PIPE, stderr=subprocess.STDOUT, bufsize=1) as p, open(log, 'ab') as f:
+ with subprocess.Popen(cmd, shell=True, cwd=builddir + "/..", stdout=subprocess.PIPE, stderr=subprocess.STDOUT, bufsize=0) as p, open(log, 'ab') as f:
for line in p.stdout:
writelog(line, f, sys.stdout.buffer)
sys.stdout.flush()
--
2.25.1


[yocto-autobuilder-helper][dunfell V2 12/15] config.json: Add further descriptions

Steve Sakoman
 

From: Richard Purdie <richard.purdie@...>

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit 19b7456b92c2eb7b2b27f1e378dbc793d068ee3c)
Signed-off-by: Steve Sakoman <steve@...>
---
config.json | 29 +++++++++++++++++++++++++++++
1 file changed, 29 insertions(+)

diff --git a/config.json b/config.json
index ddf36ae..cedcef7 100644
--- a/config.json
+++ b/config.json
@@ -71,6 +71,7 @@
"SANITYTARGETS" : "core-image-sato:do_testsdk core-image-minimal:do_testsdkext core-image-sato:do_testsdkext"
},
"step3" : {
+ "shortname" : "Machine oe-selftest",
"BUILDHISTORY" : false,
"EXTRACMDS" : ["${SCRIPTSDIR}/checkvnc; DISPLAY=:1 oe-selftest ${HELPERSTMACHTARGS} -j 15"],
"ADDLAYER" : ["${BUILDDIR}/../meta-selftest"]
@@ -128,6 +129,7 @@
"SANITYTARGETS" : "core-image-minimal:do_testimage core-image-sato:do_testimage core-image-sato-sdk:do_testimage core-image-sato:do_testsdk core-image-sato:do_testsdkext"
},
"step2" : {
+ "shortname" : "Machine oe-selftest",
"BUILDHISTORY" : false,
"EXTRACMDS" : ["${SCRIPTSDIR}/checkvnc; DISPLAY=:1 oe-selftest -a -t machine -j 15"]
}
@@ -164,13 +166,16 @@
"RPM_GPG_SIGN_CHUNK = '1'"
],
"step1" : {
+ "shortname" : "Bitbake Selftest",
"EXTRACMDS" : ["bitbake-selftest"]
},
"step2" : {
+ "shortname" : "OE Selftest",
"EXTRACMDS" : ["${SCRIPTSDIR}/checkvnc; OEQA_DEBUGGING_SAVED_OUTPUT=${BASE_SHAREDDIR}/pub/repro-fail/ DISPLAY=:1 oe-selftest --skip-tests distrodata.Distrodata.test_checkpkg buildoptions.SourceMirroring.test_yocto_source_mirror devtool.DevtoolAddTests.test_devtool_add_npm recipetool.RecipetoolTests.test_recipetool_create_npm -T machine -T toolchain-user -T toolchain-system -j 15"],
"ADDLAYER" : ["${BUILDDIR}/../meta-selftest"]
},
"step3" : {
+ "shortname" : "Python Linter Report",
"EXTRACMDS" : ["if [ `which oe-pylint` ]; then mkdir -p ${HELPERRESULTSDIR}/${HELPERTARGET}; oe-pylint > ${HELPERRESULTSDIR}/${HELPERTARGET}/pylint.log || true; fi"]
}
},
@@ -178,6 +183,7 @@
"SDKMACHINE" : "x86_64",
"MACHINE" : "qemux86",
"step1" : {
+ "shortname" : "Sources pre-fetching",
"BBTARGETS" : "universe -c fetch -k",
"ADDLAYER" : ["${BUILDDIR}/../meta-selftest"],
"extravars" : [
@@ -189,6 +195,7 @@
"SDKMACHINE" : "x86_64",
"MACHINE" : "qemux86",
"step1" : {
+ "shortname" : "Source Mirror Selftest",
"EXTRACMDS" : ["${SCRIPTSDIR}/checkvnc; DISPLAY=:1 oe-selftest -r buildoptions.SourceMirroring.test_yocto_source_mirror"],
"ADDLAYER" : ["${BUILDDIR}/../meta-selftest"]
}
@@ -571,6 +578,7 @@
],
"step1" : {
"MACHINE" : "qemux86",
+ "shortname" : "qemux86 wic",
"BBTARGETS" : "wic-tools core-image-sato",
"EXTRACMDS" : [
"wic create directdisk -e core-image-sato -o ${BUILDDIR}/tmp/deploy/wic_images/qemux86/directdisk/core-image-sato/",
@@ -580,6 +588,7 @@
},
"step2" : {
"MACHINE" : "genericx86",
+ "shortname" : "genericx86 wic",
"BBTARGETS" : "wic-tools core-image-sato",
"EXTRACMDS" : [
"wic create directdisk -e core-image-sato -o ${BUILDDIR}/tmp/deploy/wic_images/genericx86/directdisk/core-image-sato/",
@@ -589,6 +598,7 @@
},
"step3" : {
"MACHINE" : "qemux86-64",
+ "shortname" : "qemux86-64 wic",
"BBTARGETS" : "wic-tools core-image-sato",
"EXTRACMDS" : [
"wic create directdisk -e core-image-sato -o ${BUILDDIR}/tmp/deploy/wic_images/qemux86-64/directdisk/core-image-sato/",
@@ -598,6 +608,7 @@
},
"step4" : {
"MACHINE" : "genericx86-64",
+ "shortname" : "genericx86-64 wic",
"BBTARGETS" : "wic-tools core-image-sato",
"EXTRACMDS" : [
"wic create directdisk -e core-image-sato -o ${BUILDDIR}/tmp/deploy/wic_images/genericx86-64/directdisk/core-image-sato/",
@@ -613,14 +624,17 @@
],
"step1" : {
"SDKMACHINE" : "x86_64",
+ "shortname" : "x86_64 tools",
"BBTARGETS" : "buildtools-tarball buildtools-extended-tarball uninative-tarball"
},
"step2" : {
"SDKMACHINE" : "i686",
+ "shortname" : "i686 tools",
"BBTARGETS" : "uninative-tarball"
},
"step3" : {
"SDKMACHINE" : "aarch64",
+ "shortname" : "aarch64 tools",
"BBTARGETS" : "buildtools-tarball buildtools-extended-tarball uninative-tarball"
}
},
@@ -635,9 +649,11 @@
"SOURCE_MIRROR_URL = 'file://${BASE_SHAREDDIR}/current_sources'"
],
"step1" : {
+ "shortname" : "Universe fetch",
"BBTARGETS" : "universe -k -c fetch"
},
"step2" : {
+ "shortname" : "BA image build",
"BBTARGETS" : "build-appliance-image"
}
},
@@ -766,6 +782,7 @@
"qa-extras" : {
"MACHINE" : "qemux86-64",
"step1" : {
+ "shortname" : "Readonly rootfs",
"BBTARGETS" : "core-image-minimal",
"SANITYTARGETS" : "core-image-minimal:do_testimage",
"extravars" : [
@@ -773,6 +790,7 @@
]
},
"step2" : {
+ "shortname" : "ROOT_HOME testing",
"BBTARGETS" : "core-image-minimal",
"SANITYTARGETS" : "core-image-minimal:do_testimage",
"extravars" : [
@@ -780,6 +798,7 @@
]
},
"step3" : {
+ "shortname" : "Full eSDK type",
"SDKMACHINE" : "x86_64",
"BBTARGETS" : "core-image-minimal:do_populate_sdk_ext",
"extravars" : [
@@ -787,15 +806,18 @@
]
},
"step4" : {
+ "shortname" : "Prep locked-sigs test",
"SDKMACHINE" : "x86_64",
"BBTARGETS" : "core-image-sato core-image-sato:do_populate_sdk_ext"
},
"step5" : {
+ "shortname" : "Prep #2 locked-sigs test",
"SDKMACHINE" : "x86_64",
"BBTARGETS" : "core-image-sato -S none",
"EXTRACMDS" : ["${SCRIPTSDIR}/../janitor/clobberdir ${BUILDDIR}/../build/tmp"]
},
"step6" : {
+ "shortname" : "Test locked-sigs image",
"SDKMACHINE" : "x86_64",
"BBTARGETS" : "core-image-sato",
"extravars" : [
@@ -804,6 +826,7 @@
]
},
"step7" : {
+ "shortname" : "Test locked-sigs eSDK",
"SDKMACHINE" : "x86_64",
"BBTARGETS" : "core-image-sato:do_populate_sdk_ext",
"extravars" : [
@@ -814,6 +837,7 @@
"qa-extras2" : {
"MACHINE" : "qemux86-64",
"step1" : {
+ "shortname" : "Test logrotate",
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
@@ -830,6 +854,7 @@
]
},
"step3" : {
+ "shortname" : "Test skeletoninit",
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
@@ -839,6 +864,7 @@
"ADDLAYER" : ["${BUILDDIR}/../meta-skeleton"]
},
"step4" : {
+ "shortname" : "Systemd with sysvinit compat",
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
@@ -848,6 +874,7 @@
]
},
"step5" : {
+ "shortname" : "Sysvinit with systemd",
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
@@ -856,6 +883,7 @@
]
},
"step6" : {
+ "shortname" : "Systemd",
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
@@ -866,6 +894,7 @@
]
},
"step7" : {
+ "shortname" : "Mesa gallium-llvm",
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
--
2.25.1


[yocto-autobuilder-helper][dunfell V2 11/15] config.json: Unbreak qa-extras locked sigs test

Steve Sakoman
 

From: Richard Purdie <richard.purdie@...>

qa-extras and qa-extras2 were split incorrectly, fix this.

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit 80fe0ab06972c46c82cde29cbdfcdac6e87dde99)
Signed-off-by: Steve Sakoman <steve@...>
---
config.json | 26 +++++++++++++-------------
1 file changed, 13 insertions(+), 13 deletions(-)

diff --git a/config.json b/config.json
index 4856507..ddf36ae 100644
--- a/config.json
+++ b/config.json
@@ -802,18 +802,18 @@
"TMPDIR = '${TOPDIR}/newtmp'",
"require ../locked-sigs.inc"
]
- }
- },
- "qa-extras2" : {
- "MACHINE" : "qemux86-64",
- "step1" : {
+ },
+ "step7" : {
"SDKMACHINE" : "x86_64",
"BBTARGETS" : "core-image-sato:do_populate_sdk_ext",
"extravars" : [
"TMPDIR = '${TOPDIR}/sdktmp'"
]
- },
- "step2" : {
+ }
+ },
+ "qa-extras2" : {
+ "MACHINE" : "qemux86-64",
+ "step1" : {
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
@@ -821,7 +821,7 @@
"TEST_SUITES_append = ' logrotate'"
]
},
- "step3" : {
+ "step2" : {
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
@@ -829,7 +829,7 @@
"TEST_SUITES_append = ' pam'"
]
},
- "step4" : {
+ "step3" : {
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
@@ -838,7 +838,7 @@
],
"ADDLAYER" : ["${BUILDDIR}/../meta-skeleton"]
},
- "step5" : {
+ "step4" : {
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
@@ -847,7 +847,7 @@
"TEST_SUITES_append = ' systemd'"
]
},
- "step6" : {
+ "step5" : {
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
@@ -855,7 +855,7 @@
"VIRTUAL-RUNTIME_init_manager = 'sysvinit'"
]
},
- "step7" : {
+ "step6" : {
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
@@ -865,7 +865,7 @@
"DISTRO_FEATURES_BACKFILL_CONSIDERED = 'sysvinit'"
]
},
- "step8" : {
+ "step7" : {
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
--
2.25.1


[yocto-autobuilder-helper][dunfell V2 10/15] config.json/run-config: Add support for shortnames and descriptions

Steve Sakoman
 

From: Richard Purdie <richard.purdie@...>

Learn from the previous experiments and add meaningful shortnames and
descriptions to work around the 50 char name limit.

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit bceb63fb7952c6ed289733471a0177cfbc365a1e)
Signed-off-by: Steve Sakoman <steve@...>
---
config.json | 16 +++++++++++-----
scripts/run-config | 34 +++++++++++++++++++++++-----------
2 files changed, 34 insertions(+), 16 deletions(-)

diff --git a/config.json b/config.json
index 1fcc45d..4856507 100644
--- a/config.json
+++ b/config.json
@@ -453,7 +453,8 @@
"MACHINE" : "qemux86-64",
"SDKMACHINE" : "x86_64",
"step1" : {
- "description" : "x86_64 32bit multilib image with rpm",
+ "shortname" : "x86-64 lib32-img rpm",
+ "description" : "qemux86-64 32bit multilib image with rpm",
"BBTARGETS" : "lib32-core-image-minimal",
"SANITYTARGETS" : "lib32-core-image-minimal:do_testimage",
"extravars" : [
@@ -464,7 +465,8 @@
]
},
"step2" : {
- "description" : "x86_64 32bit multilib image with ipk",
+ "shortname" : "x86-64 lib32-img ipk",
+ "description" : "qemux86-64 32bit multilib image with ipk",
"PACKAGE_CLASSES" : "package_ipk",
"BBTARGETS" : "lib32-core-image-minimal",
"SANITYTARGETS" : "lib32-core-image-minimal:do_testimage",
@@ -476,7 +478,8 @@
]
},
"step3" : {
- "description" : "x86_64 64bit image and 32 bit multilibs with rpm",
+ "shortname" : "x86-64 lib32 rpm",
+ "description" : "qemux86-64 64bit image and 32 bit multilibs with rpm",
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
@@ -489,7 +492,8 @@
]
},
"step4" : {
- "description" : "x86_64 64bit image and 32 bit multilibs with ipk",
+ "shortname" : "x86-64 lib32 ipk",
+ "description" : "qemux86-64 64bit image and 32 bit multilibs with ipk",
"PACKAGE_CLASSES" : "package_ipk",
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
@@ -503,6 +507,7 @@
]
},
"step5" : {
+ "shortname" : "x86-64 lib64-img",
"description" : "x86 building 64bit multilib image",
"MACHINE" : "qemux86",
"SDKMACHINE" : "i686",
@@ -514,7 +519,8 @@
]
},
"step6" : {
- "description" : "mips64 image using n32 as default",
+ "shortname" : "mip64 n32",
+ "description" : "qemumips64 image using n32 as default",
"MACHINE" : "qemumips64",
"BBTARGETS" : "core-image-minimal core-image-minimal:do_populate_sdk",
"SANITYTARGETS" : "core-image-minimal:do_testimage core-image-minimal:do_testsdk",
diff --git a/scripts/run-config b/scripts/run-config
index 58ce364..aab52c1 100755
--- a/scripts/run-config
+++ b/scripts/run-config
@@ -112,13 +112,26 @@ if args.json_outputfile:
jsonconfig = []
jcfg = True

+# There is a 50 char limit on "bbname" but buildbot may append "_1", "_2" if multiple steps
+# with the same name exist in a build
def addentry(name, description, phase):
- jsonconfig.append({"name" : name, "bbname" : description[:45], "phase" : phase, "description" : description})
+ jsonconfig.append({"name" : name, "bbname" : description[:46], "phase" : phase, "description" : description})
+
+def addstepentry(name, taskdesc, shortname, description, detail, phase):
+ bbname = taskdesc
+ if shortname:
+ bbname = shortname + ": " + taskdesc
+ bbdesc = taskdesc
+ if description:
+ bbdesc = description
+ if detail:
+ bbdesc = bbdesc + ": " + detail
+ jsonconfig.append({"name" : name, "bbname" : bbname[:46], "phase" : phase, "description" : bbdesc})

if jcfg:
buildtools = utils.setup_buildtools_tarball(ourconfig, args.workername, None, checkonly=True)
if buildtools:
- addentry("buildtools", "Extract and setup buildtools tarball", "init")
+ addentry("buildtools", "Setup buildtools tarball", "init")
else:
utils.setup_buildtools_tarball(ourconfig, args.workername, args.builddir + "/../buildtools")
if args.phase == "init" and args.stepname == "buildtools":
@@ -218,15 +231,14 @@ if args.phase == "init" and args.stepname == "buildhistory-init":
sys.exit(0)

def handle_stepnum(stepnum):
+ shortdesc = utils.getconfigvar("shortname", ourconfig, args.target, stepnum) or ""
desc = utils.getconfigvar("description", ourconfig, args.target, stepnum) or ""
- if desc:
- desc = desc + ": "

# Add any layers specified
layers = utils.getconfiglist("ADDLAYER", ourconfig, args.target, stepnum)
if jcfg:
if layers:
- addentry("add-layers", "%sAdding layers %s" % (desc, str(layers)), str(stepnum))
+ addstepentry("add-layers", "Add layers", shortdesc, desc, str(layers), str(stepnum))
elif args.stepname == "add-layers":
for layer in layers:
bitbakecmd(args.builddir, "bitbake-layers add-layer %s" % layer, report, stepnum, args.stepname)
@@ -236,7 +248,7 @@ def handle_stepnum(stepnum):
# Generate the configuration files needed for this step
if utils.getconfigvar("WRITECONFIG", ourconfig, args.target, stepnum):
if jcfg:
- addentry("write-config", "%sWriting configuration files" % desc, str(stepnum))
+ addstepentry("write-config", "Write config", shortdesc, desc, None, str(stepnum))
elif args.stepname == "write-config":
runcmd([scriptsdir + "/setup-config", args.target, str(stepnum - 1), args.builddir, args.branchname, args.reponame, "-s", args.sstateprefix, "-b", args.buildappsrcrev])

@@ -244,7 +256,7 @@ def handle_stepnum(stepnum):
targets = utils.getconfigvar("BBTARGETS", ourconfig, args.target, stepnum)
if targets:
if jcfg:
- addentry("build-targets", "%sBuilding targets %s" % (desc, str(targets)), str(stepnum))
+ addstepentry("build-targets", "Build targets", shortdesc, desc, str(targets), str(stepnum))
elif args.stepname == "build-targets":
hp.printheader("Step %s/%s: Running bitbake %s" % (stepnum, maxsteps, targets))
bitbakecmd(args.builddir, "bitbake %s -k" % targets, report, stepnum, args.stepname)
@@ -253,7 +265,7 @@ def handle_stepnum(stepnum):
sanitytargets = utils.getconfigvar("SANITYTARGETS", ourconfig, args.target, stepnum)
if sanitytargets:
if jcfg:
- addentry("test-targets", "%sRunning OEQA test targets %s" % (desc, str(sanitytargets)), str(stepnum))
+ addstepentry("test-targets", "QA targets", shortdesc, desc, str(sanitytargets), str(stepnum))
elif args.stepname == "test-targets":
hp.printheader("Step %s/%s: Running bitbake %s" % (stepnum, maxsteps, sanitytargets))
bitbakecmd(args.builddir, "%s/checkvnc; DISPLAY=:1 bitbake %s -k" % (scriptsdir, sanitytargets), report, stepnum, args.stepname)
@@ -262,7 +274,7 @@ def handle_stepnum(stepnum):
cmds = utils.getconfiglist("EXTRACMDS", ourconfig, args.target, stepnum)
if jcfg:
if cmds:
- addentry("cmds", "%sRunning bitbake environment commands %s" % (desc, str(cmds)), str(stepnum))
+ addstepentry("cmds", "Run cmds", shortdesc, desc, str(cmds), str(stepnum))
elif args.stepname == "cmds":
for cmd in cmds:
hp.printheader("Step %s/%s: Running command %s" % (stepnum, maxsteps, cmd))
@@ -271,7 +283,7 @@ def handle_stepnum(stepnum):
cmds = utils.getconfiglist("EXTRAPLAINCMDS", ourconfig, args.target, stepnum)
if jcfg:
if cmds:
- addentry("plain-cmds", "%sRunning commands %s" % (desc, str(cmds)), str(stepnum))
+ addstepentry("plain-cmds", "Run cmds", shortdesc, desc, str(cmds), str(stepnum))
elif args.stepname == "plain-cmds":
for cmd in cmds:
hp.printheader("Step %s/%s: Running 'plain' command %s" % (stepnum, maxsteps, cmd))
@@ -279,7 +291,7 @@ def handle_stepnum(stepnum):

if jcfg:
if layers:
- addentry("remove-layers", "%sRemoving layers %s" % (desc, str(layers)), str(stepnum))
+ addstepentry("remove-layers", "Remove layers", shortdesc, desc, str(layers), str(stepnum))
elif args.stepname == "remove-layers":
# Remove any layers we added in a reverse order
for layer in reversed(layers):
--
2.25.1


[yocto-autobuilder-helper][dunfell V2 09/15] scripts/shared-repo-unpack: Add flush call to update the output more regularly before buildtools

Steve Sakoman
 

From: Richard Purdie <richard.purdie@...>

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit 6e2825564c0b7b69f56e6e589ec15a1cebdb26d1)
Signed-off-by: Steve Sakoman <steve@...>
---
scripts/shared-repo-unpack | 1 +
1 file changed, 1 insertion(+)

diff --git a/scripts/shared-repo-unpack b/scripts/shared-repo-unpack
index 7dc250c..f08efa8 100755
--- a/scripts/shared-repo-unpack
+++ b/scripts/shared-repo-unpack
@@ -60,6 +60,7 @@ for repo in sorted(repos.keys()):
utils.fetchgitrepo(targetsubdir, repo, repos[repo], stashdir)
if args.publish_dir:
utils.publishrepo(targetsubdir, repo, args.publish_dir)
+ utils.flush()

utils.setup_buildtools_tarball(ourconfig, args.workername, args.abworkdir + "/buildtools")

--
2.25.1


[yocto-autobuilder-helper][dunfell V2 08/15] scripts/run-config: Remove redundant boilerplate json

Steve Sakoman
 

From: Richard Purdie <richard.purdie@...>

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit 2bb48042438f3154bbfa6fbc7f2c7556bfa7c762)
Signed-off-by: Steve Sakoman <steve@...>
---
scripts/run-config | 31 +++++++++++++++++--------------
1 file changed, 17 insertions(+), 14 deletions(-)

diff --git a/scripts/run-config b/scripts/run-config
index 89506f5..58ce364 100755
--- a/scripts/run-config
+++ b/scripts/run-config
@@ -112,10 +112,13 @@ if args.json_outputfile:
jsonconfig = []
jcfg = True

+def addentry(name, description, phase):
+ jsonconfig.append({"name" : name, "bbname" : description[:45], "phase" : phase, "description" : description})
+
if jcfg:
buildtools = utils.setup_buildtools_tarball(ourconfig, args.workername, None, checkonly=True)
if buildtools:
- jsonconfig.append({"name" : "buildtools", "bbname" : "Extract and setup buildtools tarball", "phase" : "init"})
+ addentry("buildtools", "Extract and setup buildtools tarball", "init")
else:
utils.setup_buildtools_tarball(ourconfig, args.workername, args.builddir + "/../buildtools")
if args.phase == "init" and args.stepname == "buildtools":
@@ -208,14 +211,14 @@ def runcmd(cmd, *args, **kwargs):
bh_path, remoterepo, remotebranch, baseremotebranch = utils.getbuildhistoryconfig(ourconfig, args.builddir, args.target, args.reponame, args.branchname, 1)
if bh_path:
if jcfg:
- jsonconfig.append({"name" : "buildhistory-init", "bbname" : "Initialize buildhistory", "phase" : "init"})
+ addentry("buildhistory-init", "Initialize buildhistory", "init")
if args.phase == "init" and args.stepname == "buildhistory-init":
if bh_path:
runcmd([os.path.join(scriptsdir, "buildhistory-init"), bh_path, remoterepo, remotebranch, baseremotebranch])
sys.exit(0)

def handle_stepnum(stepnum):
- desc = utils.getconfigvar("description", ourconfig, args.target, stepnum)
+ desc = utils.getconfigvar("description", ourconfig, args.target, stepnum) or ""
if desc:
desc = desc + ": "

@@ -223,7 +226,7 @@ def handle_stepnum(stepnum):
layers = utils.getconfiglist("ADDLAYER", ourconfig, args.target, stepnum)
if jcfg:
if layers:
- jsonconfig.append({"name" : "add-layers", "bbname" : "%sAdding layers %s" % (desc, str(layers)), "phase" : str(stepnum)})
+ addentry("add-layers", "%sAdding layers %s" % (desc, str(layers)), str(stepnum))
elif args.stepname == "add-layers":
for layer in layers:
bitbakecmd(args.builddir, "bitbake-layers add-layer %s" % layer, report, stepnum, args.stepname)
@@ -233,7 +236,7 @@ def handle_stepnum(stepnum):
# Generate the configuration files needed for this step
if utils.getconfigvar("WRITECONFIG", ourconfig, args.target, stepnum):
if jcfg:
- jsonconfig.append({"name" : "write-config", "bbname" : "%sWriting configuration files" % desc, "phase" : str(stepnum)})
+ addentry("write-config", "%sWriting configuration files" % desc, str(stepnum))
elif args.stepname == "write-config":
runcmd([scriptsdir + "/setup-config", args.target, str(stepnum - 1), args.builddir, args.branchname, args.reponame, "-s", args.sstateprefix, "-b", args.buildappsrcrev])

@@ -241,7 +244,7 @@ def handle_stepnum(stepnum):
targets = utils.getconfigvar("BBTARGETS", ourconfig, args.target, stepnum)
if targets:
if jcfg:
- jsonconfig.append({"name" : "build-targets", "bbname" : "%sBuilding targets %s" % (desc, str(targets)), "phase" : str(stepnum)})
+ addentry("build-targets", "%sBuilding targets %s" % (desc, str(targets)), str(stepnum))
elif args.stepname == "build-targets":
hp.printheader("Step %s/%s: Running bitbake %s" % (stepnum, maxsteps, targets))
bitbakecmd(args.builddir, "bitbake %s -k" % targets, report, stepnum, args.stepname)
@@ -250,7 +253,7 @@ def handle_stepnum(stepnum):
sanitytargets = utils.getconfigvar("SANITYTARGETS", ourconfig, args.target, stepnum)
if sanitytargets:
if jcfg:
- jsonconfig.append({"name" : "test-targets", "bbname" : "%sRunning OEQA test targets %s" % (desc, str(sanitytargets)), "phase" : str(stepnum)})
+ addentry("test-targets", "%sRunning OEQA test targets %s" % (desc, str(sanitytargets)), str(stepnum))
elif args.stepname == "test-targets":
hp.printheader("Step %s/%s: Running bitbake %s" % (stepnum, maxsteps, sanitytargets))
bitbakecmd(args.builddir, "%s/checkvnc; DISPLAY=:1 bitbake %s -k" % (scriptsdir, sanitytargets), report, stepnum, args.stepname)
@@ -259,7 +262,7 @@ def handle_stepnum(stepnum):
cmds = utils.getconfiglist("EXTRACMDS", ourconfig, args.target, stepnum)
if jcfg:
if cmds:
- jsonconfig.append({"name" : "cmds", "bbname" : "%sRunning bitbake environment commands %s" % (desc, str(cmds)), "phase" : str(stepnum)})
+ addentry("cmds", "%sRunning bitbake environment commands %s" % (desc, str(cmds)), str(stepnum))
elif args.stepname == "cmds":
for cmd in cmds:
hp.printheader("Step %s/%s: Running command %s" % (stepnum, maxsteps, cmd))
@@ -268,7 +271,7 @@ def handle_stepnum(stepnum):
cmds = utils.getconfiglist("EXTRAPLAINCMDS", ourconfig, args.target, stepnum)
if jcfg:
if cmds:
- jsonconfig.append({"name" : "plain-cmds", "bbname" : "%sRunning commands %s" % (desc, str(cmds)), "phase" : str(stepnum)})
+ addentry("plain-cmds", "%sRunning commands %s" % (desc, str(cmds)), str(stepnum))
elif args.stepname == "plain-cmds":
for cmd in cmds:
hp.printheader("Step %s/%s: Running 'plain' command %s" % (stepnum, maxsteps, cmd))
@@ -276,7 +279,7 @@ def handle_stepnum(stepnum):

if jcfg:
if layers:
- jsonconfig.append({"name" : "remove-layers", "bbname" : "%sRemoving layers %s" % (desc, str(layers)), "phase" : str(stepnum)})
+ addentry("remove-layers", "%sRemoving layers %s" % (desc, str(layers)), str(stepnum))
elif args.stepname == "remove-layers":
# Remove any layers we added in a reverse order
for layer in reversed(layers):
@@ -299,7 +302,7 @@ else:


if jcfg:
- jsonconfig.append({"name" : "publish", "bbname" : "Publishing artefacts", "phase" : "finish"})
+ addentry("publish", "Publishing artefacts", "finish")
elif args.phase == "finish" and args.stepname == "publish":
if args.publish_dir:
hp.printheader("Running publish artefacts")
@@ -307,7 +310,7 @@ elif args.phase == "finish" and args.stepname == "publish":
sys.exit(0)

if jcfg:
- jsonconfig.append({"name" : "collect-results", "bbname" : "Collecting result files", "phase" : "finish"})
+ addentry("collect-results", "Collecting result files", "finish")
elif args.phase == "finish" and args.stepname == "collect-results":
if args.results_dir:
hp.printheader("Running results collection")
@@ -315,7 +318,7 @@ elif args.phase == "finish" and args.stepname == "collect-results":
sys.exit(0)

if jcfg:
- jsonconfig.append({"name" : "send-errors", "bbname" : "Sending error reports", "phase" : "finish"})
+ addentry("send-errors", "Sending error reports", "finish")
elif args.phase == "finish" and args.stepname == "send-errors":
if args.build_url and utils.getconfigvar("SENDERRORS", ourconfig, args.target, stepnum):
hp.printheader("Sending any error reports")
@@ -323,7 +326,7 @@ elif args.phase == "finish" and args.stepname == "send-errors":
sys.exit(0)

if jcfg:
- jsonconfig.append({"name" : "builddir-cleanup", "bbname" : "Cleaning up build directory", "phase" : "finish"})
+ addentry("builddir-cleanup", "Cleaning up build directory", "finish")
elif args.phase == "finish" and args.stepname == "builddir-cleanup":
if args.builddir and os.path.exists(args.builddir):
runcmd(["mv", args.builddir, args.builddir + "-renamed"])
--
2.25.1


[yocto-autobuilder-helper][dunfell V2 07/15] config.json/run-config: Add human readable descriptions of steps

Steve Sakoman
 

From: Richard Purdie <richard.purdie@...>

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit ce63e8f024834a670cea660c61be632191aed9b9)
Signed-off-by: Steve Sakoman <steve@...>
---
config.json | 6 ++++++
scripts/run-config | 30 +++++++++++++++++-------------
2 files changed, 23 insertions(+), 13 deletions(-)

diff --git a/config.json b/config.json
index b67ef03..1fcc45d 100644
--- a/config.json
+++ b/config.json
@@ -453,6 +453,7 @@
"MACHINE" : "qemux86-64",
"SDKMACHINE" : "x86_64",
"step1" : {
+ "description" : "x86_64 32bit multilib image with rpm",
"BBTARGETS" : "lib32-core-image-minimal",
"SANITYTARGETS" : "lib32-core-image-minimal:do_testimage",
"extravars" : [
@@ -463,6 +464,7 @@
]
},
"step2" : {
+ "description" : "x86_64 32bit multilib image with ipk",
"PACKAGE_CLASSES" : "package_ipk",
"BBTARGETS" : "lib32-core-image-minimal",
"SANITYTARGETS" : "lib32-core-image-minimal:do_testimage",
@@ -474,6 +476,7 @@
]
},
"step3" : {
+ "description" : "x86_64 64bit image and 32 bit multilibs with rpm",
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
"extravars" : [
@@ -486,6 +489,7 @@
]
},
"step4" : {
+ "description" : "x86_64 64bit image and 32 bit multilibs with ipk",
"PACKAGE_CLASSES" : "package_ipk",
"BBTARGETS" : "core-image-sato",
"SANITYTARGETS" : "core-image-sato:do_testimage",
@@ -499,6 +503,7 @@
]
},
"step5" : {
+ "description" : "x86 building 64bit multilib image",
"MACHINE" : "qemux86",
"SDKMACHINE" : "i686",
"BBTARGETS" : "lib64-core-image-sato lib64-core-image-sato-sdk",
@@ -509,6 +514,7 @@
]
},
"step6" : {
+ "description" : "mips64 image using n32 as default",
"MACHINE" : "qemumips64",
"BBTARGETS" : "core-image-minimal core-image-minimal:do_populate_sdk",
"SANITYTARGETS" : "core-image-minimal:do_testimage core-image-minimal:do_testsdk",
diff --git a/scripts/run-config b/scripts/run-config
index 05c0579..89506f5 100755
--- a/scripts/run-config
+++ b/scripts/run-config
@@ -115,7 +115,7 @@ if args.json_outputfile:
if jcfg:
buildtools = utils.setup_buildtools_tarball(ourconfig, args.workername, None, checkonly=True)
if buildtools:
- jsonconfig.append({"name" : "buildtools", "description" : "Extract and setup buildtools tarball", "phase" : "init"})
+ jsonconfig.append({"name" : "buildtools", "bbname" : "Extract and setup buildtools tarball", "phase" : "init"})
else:
utils.setup_buildtools_tarball(ourconfig, args.workername, args.builddir + "/../buildtools")
if args.phase == "init" and args.stepname == "buildtools":
@@ -208,18 +208,22 @@ def runcmd(cmd, *args, **kwargs):
bh_path, remoterepo, remotebranch, baseremotebranch = utils.getbuildhistoryconfig(ourconfig, args.builddir, args.target, args.reponame, args.branchname, 1)
if bh_path:
if jcfg:
- jsonconfig.append({"name" : "buildhistory-init", "description" : "Initialize buildhistory", "phase" : "init"})
+ jsonconfig.append({"name" : "buildhistory-init", "bbname" : "Initialize buildhistory", "phase" : "init"})
if args.phase == "init" and args.stepname == "buildhistory-init":
if bh_path:
runcmd([os.path.join(scriptsdir, "buildhistory-init"), bh_path, remoterepo, remotebranch, baseremotebranch])
sys.exit(0)

def handle_stepnum(stepnum):
+ desc = utils.getconfigvar("description", ourconfig, args.target, stepnum)
+ if desc:
+ desc = desc + ": "
+
# Add any layers specified
layers = utils.getconfiglist("ADDLAYER", ourconfig, args.target, stepnum)
if jcfg:
if layers:
- jsonconfig.append({"name" : "add-layers", "description" : "Adding layers %s" % str(layers), "phase" : str(stepnum)})
+ jsonconfig.append({"name" : "add-layers", "bbname" : "%sAdding layers %s" % (desc, str(layers)), "phase" : str(stepnum)})
elif args.stepname == "add-layers":
for layer in layers:
bitbakecmd(args.builddir, "bitbake-layers add-layer %s" % layer, report, stepnum, args.stepname)
@@ -229,7 +233,7 @@ def handle_stepnum(stepnum):
# Generate the configuration files needed for this step
if utils.getconfigvar("WRITECONFIG", ourconfig, args.target, stepnum):
if jcfg:
- jsonconfig.append({"name" : "write-config", "description" : "Writing configuration files", "phase" : str(stepnum)})
+ jsonconfig.append({"name" : "write-config", "bbname" : "%sWriting configuration files" % desc, "phase" : str(stepnum)})
elif args.stepname == "write-config":
runcmd([scriptsdir + "/setup-config", args.target, str(stepnum - 1), args.builddir, args.branchname, args.reponame, "-s", args.sstateprefix, "-b", args.buildappsrcrev])

@@ -237,7 +241,7 @@ def handle_stepnum(stepnum):
targets = utils.getconfigvar("BBTARGETS", ourconfig, args.target, stepnum)
if targets:
if jcfg:
- jsonconfig.append({"name" : "build-targets", "description" : "Building targets %s" % str(targets), "phase" : str(stepnum)})
+ jsonconfig.append({"name" : "build-targets", "bbname" : "%sBuilding targets %s" % (desc, str(targets)), "phase" : str(stepnum)})
elif args.stepname == "build-targets":
hp.printheader("Step %s/%s: Running bitbake %s" % (stepnum, maxsteps, targets))
bitbakecmd(args.builddir, "bitbake %s -k" % targets, report, stepnum, args.stepname)
@@ -246,7 +250,7 @@ def handle_stepnum(stepnum):
sanitytargets = utils.getconfigvar("SANITYTARGETS", ourconfig, args.target, stepnum)
if sanitytargets:
if jcfg:
- jsonconfig.append({"name" : "test-targets", "description" : "Running OEQA test targets %s" % str(sanitytargets), "phase" : str(stepnum)})
+ jsonconfig.append({"name" : "test-targets", "bbname" : "%sRunning OEQA test targets %s" % (desc, str(sanitytargets)), "phase" : str(stepnum)})
elif args.stepname == "test-targets":
hp.printheader("Step %s/%s: Running bitbake %s" % (stepnum, maxsteps, sanitytargets))
bitbakecmd(args.builddir, "%s/checkvnc; DISPLAY=:1 bitbake %s -k" % (scriptsdir, sanitytargets), report, stepnum, args.stepname)
@@ -255,7 +259,7 @@ def handle_stepnum(stepnum):
cmds = utils.getconfiglist("EXTRACMDS", ourconfig, args.target, stepnum)
if jcfg:
if cmds:
- jsonconfig.append({"name" : "cmds", "description" : "Running bitbake environment commands %s" % str(cmds), "phase" : str(stepnum)})
+ jsonconfig.append({"name" : "cmds", "bbname" : "%sRunning bitbake environment commands %s" % (desc, str(cmds)), "phase" : str(stepnum)})
elif args.stepname == "cmds":
for cmd in cmds:
hp.printheader("Step %s/%s: Running command %s" % (stepnum, maxsteps, cmd))
@@ -264,7 +268,7 @@ def handle_stepnum(stepnum):
cmds = utils.getconfiglist("EXTRAPLAINCMDS", ourconfig, args.target, stepnum)
if jcfg:
if cmds:
- jsonconfig.append({"name" : "plain-cmds", "description" : "Running commands %s" % str(cmds), "phase" : str(stepnum)})
+ jsonconfig.append({"name" : "plain-cmds", "bbname" : "%sRunning commands %s" % (desc, str(cmds)), "phase" : str(stepnum)})
elif args.stepname == "plain-cmds":
for cmd in cmds:
hp.printheader("Step %s/%s: Running 'plain' command %s" % (stepnum, maxsteps, cmd))
@@ -272,7 +276,7 @@ def handle_stepnum(stepnum):

if jcfg:
if layers:
- jsonconfig.append({"name" : "remove-layers", "description" : "Removing layers %s" % str(layers), "phase" : str(stepnum)})
+ jsonconfig.append({"name" : "remove-layers", "bbname" : "%sRemoving layers %s" % (desc, str(layers)), "phase" : str(stepnum)})
elif args.stepname == "remove-layers":
# Remove any layers we added in a reverse order
for layer in reversed(layers):
@@ -295,7 +299,7 @@ else:


if jcfg:
- jsonconfig.append({"name" : "publish", "description" : "Publishing artefacts", "phase" : "finish"})
+ jsonconfig.append({"name" : "publish", "bbname" : "Publishing artefacts", "phase" : "finish"})
elif args.phase == "finish" and args.stepname == "publish":
if args.publish_dir:
hp.printheader("Running publish artefacts")
@@ -303,7 +307,7 @@ elif args.phase == "finish" and args.stepname == "publish":
sys.exit(0)

if jcfg:
- jsonconfig.append({"name" : "collect-results", "description" : "Collecting result files", "phase" : "finish"})
+ jsonconfig.append({"name" : "collect-results", "bbname" : "Collecting result files", "phase" : "finish"})
elif args.phase == "finish" and args.stepname == "collect-results":
if args.results_dir:
hp.printheader("Running results collection")
@@ -311,7 +315,7 @@ elif args.phase == "finish" and args.stepname == "collect-results":
sys.exit(0)

if jcfg:
- jsonconfig.append({"name" : "send-errors", "description" : "Sending error reports", "phase" : "finish"})
+ jsonconfig.append({"name" : "send-errors", "bbname" : "Sending error reports", "phase" : "finish"})
elif args.phase == "finish" and args.stepname == "send-errors":
if args.build_url and utils.getconfigvar("SENDERRORS", ourconfig, args.target, stepnum):
hp.printheader("Sending any error reports")
@@ -319,7 +323,7 @@ elif args.phase == "finish" and args.stepname == "send-errors":
sys.exit(0)

if jcfg:
- jsonconfig.append({"name" : "builddir-cleanup", "description" : "Cleaning up build directory", "phase" : "finish"})
+ jsonconfig.append({"name" : "builddir-cleanup", "bbname" : "Cleaning up build directory", "phase" : "finish"})
elif args.phase == "finish" and args.stepname == "builddir-cleanup":
if args.builddir and os.path.exists(args.builddir):
runcmd(["mv", args.builddir, args.builddir + "-renamed"])
--
2.25.1


[yocto-autobuilder-helper][dunfell V2 06/15] scripts/run-config: Ensure logging to both logfile and stdout

Steve Sakoman
 

From: Richard Purdie <richard.purdie@...>

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit b1bc4d64c2d0a7e61aea154635996b6b4a4d04c2)
Signed-off-by: Steve Sakoman <steve@...>
---
scripts/run-config | 27 ++++++++++++++++-----------
1 file changed, 16 insertions(+), 11 deletions(-)

diff --git a/scripts/run-config b/scripts/run-config
index 25a4296..05c0579 100755
--- a/scripts/run-config
+++ b/scripts/run-config
@@ -153,6 +153,10 @@ def bitbakecmd(builddir, cmd, report, stepnum, stepname, oeenv=True):
except FileNotFoundError:
numreports = 0

+ def writelog(msg, a, b):
+ a.write(msg)
+ b.write(msg)
+
if oeenv:
cmd = ". ./oe-init-build-env; %s" % cmd

@@ -160,21 +164,22 @@ def bitbakecmd(builddir, cmd, report, stepnum, stepname, oeenv=True):
print("Would run '%s'" % cmd)
return

- print("Running '%s' with output to %s" % (cmd, log))
- flush()
+ with open(log, "a") as outf:
+ writelog("Running '%s' with output to %s\n" % (cmd, log), outf, sys.stdout)

- autoconf = builddir + "/conf/auto.conf"
- if os.path.exists(autoconf):
- with open(autoconf, "r") as inf, open(log, "a") as outf:
- outf.write("auto.conf settings:\n")
- for line in inf.readlines():
- outf.write(line)
- outf.write("\n")
+ autoconf = builddir + "/conf/auto.conf"
+ if os.path.exists(autoconf):
+ with open(autoconf, "r") as inf, open(log, "a") as outf:
+ writelog("auto.conf settings:\n", outf, sys.stdout)
+ for line in inf.readlines():
+ writelog(line, outf, sys.stdout)
+ writelog("\n", outf, sys.stdout)
+
+ flush()

with subprocess.Popen(cmd, shell=True, cwd=builddir + "/..", stdout=subprocess.PIPE, stderr=subprocess.STDOUT, bufsize=1) as p, open(log, 'ab') as f:
for line in p.stdout:
- sys.stdout.buffer.write(line)
- f.write(line)
+ writelog(line, f, sys.stdout.buffer)
sys.stdout.flush()
f.flush()
ret = p.wait()
--
2.25.1


[yocto-autobuilder-helper][dunfell V2 05/15] scripts/run-config: Improve logfile naming

Steve Sakoman
 

From: Richard Purdie <richard.purdie@...>

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit 4a4c888f6618c3a7273c6dfe30b640e75e2b0de8)
Signed-off-by: Steve Sakoman <steve@...>
---
scripts/run-config | 28 ++++++++++++----------------
1 file changed, 12 insertions(+), 16 deletions(-)

diff --git a/scripts/run-config b/scripts/run-config
index 116dd49..25a4296 100755
--- a/scripts/run-config
+++ b/scripts/run-config
@@ -131,11 +131,8 @@ def flush():
sys.stdout.flush()
sys.stderr.flush()

-lognum = 0
-def logname(path, stepnum, logsuffix):
- global lognum
- lognum += 1
- return path + "/command.log.%s%s" % (stepnum, logsuffix)
+def logname(path, stepnum, stepname):
+ return path + "/command-%s-%s.log" % (stepnum, stepname)

utils.mkdir(args.builddir)

@@ -146,10 +143,10 @@ utils.mkdir(errordir)

errorlogs = set()

-def bitbakecmd(builddir, cmd, report, stepnum, logsuffix, oeenv=True):
+def bitbakecmd(builddir, cmd, report, stepnum, stepname, oeenv=True):
global finalret
flush()
- log = logname(builddir, stepnum, logsuffix)
+ log = logname(builddir, stepnum, stepname)
errordir = utils.errorreportdir(builddir)
try:
numreports = len(os.listdir(errordir))
@@ -176,10 +173,9 @@ def bitbakecmd(builddir, cmd, report, stepnum, logsuffix, oeenv=True):

with subprocess.Popen(cmd, shell=True, cwd=builddir + "/..", stdout=subprocess.PIPE, stderr=subprocess.STDOUT, bufsize=1) as p, open(log, 'ab') as f:
for line in p.stdout:
- if not args.quietlogging:
- sys.stdout.buffer.write(line)
- sys.stdout.flush()
+ sys.stdout.buffer.write(line)
f.write(line)
+ sys.stdout.flush()
f.flush()
ret = p.wait()
if ret:
@@ -221,7 +217,7 @@ def handle_stepnum(stepnum):
jsonconfig.append({"name" : "add-layers", "description" : "Adding layers %s" % str(layers), "phase" : str(stepnum)})
elif args.stepname == "add-layers":
for layer in layers:
- bitbakecmd(args.builddir, "bitbake-layers add-layer %s" % layer, report, stepnum, 'a')
+ bitbakecmd(args.builddir, "bitbake-layers add-layer %s" % layer, report, stepnum, args.stepname)

flush()

@@ -239,7 +235,7 @@ def handle_stepnum(stepnum):
jsonconfig.append({"name" : "build-targets", "description" : "Building targets %s" % str(targets), "phase" : str(stepnum)})
elif args.stepname == "build-targets":
hp.printheader("Step %s/%s: Running bitbake %s" % (stepnum, maxsteps, targets))
- bitbakecmd(args.builddir, "bitbake %s -k" % targets, report, stepnum, 'b')
+ bitbakecmd(args.builddir, "bitbake %s -k" % targets, report, stepnum, args.stepname)

# Execute the sanity targets for this configuration
sanitytargets = utils.getconfigvar("SANITYTARGETS", ourconfig, args.target, stepnum)
@@ -248,7 +244,7 @@ def handle_stepnum(stepnum):
jsonconfig.append({"name" : "test-targets", "description" : "Running OEQA test targets %s" % str(sanitytargets), "phase" : str(stepnum)})
elif args.stepname == "test-targets":
hp.printheader("Step %s/%s: Running bitbake %s" % (stepnum, maxsteps, sanitytargets))
- bitbakecmd(args.builddir, "%s/checkvnc; DISPLAY=:1 bitbake %s -k" % (scriptsdir, sanitytargets), report, stepnum, 'c')
+ bitbakecmd(args.builddir, "%s/checkvnc; DISPLAY=:1 bitbake %s -k" % (scriptsdir, sanitytargets), report, stepnum, args.stepname)

# Run any extra commands specified
cmds = utils.getconfiglist("EXTRACMDS", ourconfig, args.target, stepnum)
@@ -258,7 +254,7 @@ def handle_stepnum(stepnum):
elif args.stepname == "cmds":
for cmd in cmds:
hp.printheader("Step %s/%s: Running command %s" % (stepnum, maxsteps, cmd))
- bitbakecmd(args.builddir, cmd, report, stepnum, 'd')
+ bitbakecmd(args.builddir, cmd, report, stepnum, args.stepname)

cmds = utils.getconfiglist("EXTRAPLAINCMDS", ourconfig, args.target, stepnum)
if jcfg:
@@ -267,7 +263,7 @@ def handle_stepnum(stepnum):
elif args.stepname == "plain-cmds":
for cmd in cmds:
hp.printheader("Step %s/%s: Running 'plain' command %s" % (stepnum, maxsteps, cmd))
- bitbakecmd(args.builddir, cmd, report, stepnum, 'd', oeenv=False)
+ bitbakecmd(args.builddir, cmd, report, stepnum, args.stepname, oeenv=False)

if jcfg:
if layers:
@@ -275,7 +271,7 @@ def handle_stepnum(stepnum):
elif args.stepname == "remove-layers":
# Remove any layers we added in a reverse order
for layer in reversed(layers):
- bitbakecmd(args.builddir, "bitbake-layers remove-layer %s" % layer, report, stepnum, 'a')
+ bitbakecmd(args.builddir, "bitbake-layers remove-layer %s" % layer, report, stepnum, args.stepname)

if not jcfg:
sys.exit(finalret)
--
2.25.1


[yocto-autobuilder-helper][dunfell V2 04/15] run-config: Adapt to two pass execution

Steve Sakoman
 

From: Richard Purdie <richard.purdie@...>

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit 23d65680f8019bccc3fce20381dfcf49f265f601)
Signed-off-by: Steve Sakoman <steve@...>
---
scripts/run-config | 171 +++++++++++++++++++++++++++++++++------------
scripts/utils.py | 5 +-
2 files changed, 130 insertions(+), 46 deletions(-)

diff --git a/scripts/run-config b/scripts/run-config
index 0b663df..116dd49 100755
--- a/scripts/run-config
+++ b/scripts/run-config
@@ -52,6 +52,19 @@ parser.add_argument('--workername',
action='store',
default=None,
help="the name of the worker the build is running on")
+parser.add_argument('-j', '--json-outputfile',
+ action='store',
+ default="",
+ help="the file to store json information about the build in")
+parser.add_argument('--stepname',
+ action='store',
+ default=None,
+ help="the name of the step to run")
+parser.add_argument('--phase',
+ action='store',
+ default=None,
+ help="the phase of the step to run")
+


args = parser.parse_args()
@@ -94,7 +107,19 @@ if args.target in ourconfig['overrides']:

hp.printheader("Target task %s has %d steps" % (args.target, maxsteps))

-utils.setup_buildtools_tarball(ourconfig, args.workername, args.builddir + "/../buildtools")
+jcfg = False
+if args.json_outputfile:
+ jsonconfig = []
+ jcfg = True
+
+if jcfg:
+ buildtools = utils.setup_buildtools_tarball(ourconfig, args.workername, None, checkonly=True)
+ if buildtools:
+ jsonconfig.append({"name" : "buildtools", "description" : "Extract and setup buildtools tarball", "phase" : "init"})
+else:
+ utils.setup_buildtools_tarball(ourconfig, args.workername, args.builddir + "/../buildtools")
+ if args.phase == "init" and args.stepname == "buildtools":
+ sys.exit(0)

logconfig = args.builddir + "/../bitbake/contrib/autobuilderlog.json"
print("Using BB_LOGCONFIG=%s" % logconfig)
@@ -181,70 +206,126 @@ def runcmd(cmd, *args, **kwargs):

bh_path, remoterepo, remotebranch, baseremotebranch = utils.getbuildhistoryconfig(ourconfig, args.builddir, args.target, args.reponame, args.branchname, 1)
if bh_path:
- runcmd([os.path.join(scriptsdir, "buildhistory-init"), bh_path, remoterepo, remotebranch, baseremotebranch])
-
-for stepnum in range(1, maxsteps + 1):
+ if jcfg:
+ jsonconfig.append({"name" : "buildhistory-init", "description" : "Initialize buildhistory", "phase" : "init"})
+if args.phase == "init" and args.stepname == "buildhistory-init":
+ if bh_path:
+ runcmd([os.path.join(scriptsdir, "buildhistory-init"), bh_path, remoterepo, remotebranch, baseremotebranch])
+ sys.exit(0)
+
+def handle_stepnum(stepnum):
# Add any layers specified
layers = utils.getconfiglist("ADDLAYER", ourconfig, args.target, stepnum)
- for layer in layers:
- bitbakecmd(args.builddir, "bitbake-layers add-layer %s" % layer, report, stepnum, 'a')
+ if jcfg:
+ if layers:
+ jsonconfig.append({"name" : "add-layers", "description" : "Adding layers %s" % str(layers), "phase" : str(stepnum)})
+ elif args.stepname == "add-layers":
+ for layer in layers:
+ bitbakecmd(args.builddir, "bitbake-layers add-layer %s" % layer, report, stepnum, 'a')

flush()
+
# Generate the configuration files needed for this step
if utils.getconfigvar("WRITECONFIG", ourconfig, args.target, stepnum):
- runcmd([scriptsdir + "/setup-config", args.target, str(stepnum - 1), args.builddir, args.branchname, args.reponame, "-s", args.sstateprefix, "-b", args.buildappsrcrev])
+ if jcfg:
+ jsonconfig.append({"name" : "write-config", "description" : "Writing configuration files", "phase" : str(stepnum)})
+ elif args.stepname == "write-config":
+ runcmd([scriptsdir + "/setup-config", args.target, str(stepnum - 1), args.builddir, args.branchname, args.reponame, "-s", args.sstateprefix, "-b", args.buildappsrcrev])

# Execute the targets for this configuration
targets = utils.getconfigvar("BBTARGETS", ourconfig, args.target, stepnum)
if targets:
- hp.printheader("Step %s/%s: Running bitbake %s" % (stepnum, maxsteps, targets))
- bitbakecmd(args.builddir, "bitbake %s -k" % targets, report, stepnum, 'b')
+ if jcfg:
+ jsonconfig.append({"name" : "build-targets", "description" : "Building targets %s" % str(targets), "phase" : str(stepnum)})
+ elif args.stepname == "build-targets":
+ hp.printheader("Step %s/%s: Running bitbake %s" % (stepnum, maxsteps, targets))
+ bitbakecmd(args.builddir, "bitbake %s -k" % targets, report, stepnum, 'b')

# Execute the sanity targets for this configuration
sanitytargets = utils.getconfigvar("SANITYTARGETS", ourconfig, args.target, stepnum)
if sanitytargets:
- hp.printheader("Step %s/%s: Running bitbake %s" % (stepnum, maxsteps, sanitytargets))
- bitbakecmd(args.builddir, "%s/checkvnc; DISPLAY=:1 bitbake %s -k" % (scriptsdir, sanitytargets), report, stepnum, 'c')
+ if jcfg:
+ jsonconfig.append({"name" : "test-targets", "description" : "Running OEQA test targets %s" % str(sanitytargets), "phase" : str(stepnum)})
+ elif args.stepname == "test-targets":
+ hp.printheader("Step %s/%s: Running bitbake %s" % (stepnum, maxsteps, sanitytargets))
+ bitbakecmd(args.builddir, "%s/checkvnc; DISPLAY=:1 bitbake %s -k" % (scriptsdir, sanitytargets), report, stepnum, 'c')

# Run any extra commands specified
cmds = utils.getconfiglist("EXTRACMDS", ourconfig, args.target, stepnum)
- for cmd in cmds:
- hp.printheader("Step %s/%s: Running command %s" % (stepnum, maxsteps, cmd))
- bitbakecmd(args.builddir, cmd, report, stepnum, 'd')
+ if jcfg:
+ if cmds:
+ jsonconfig.append({"name" : "cmds", "description" : "Running bitbake environment commands %s" % str(cmds), "phase" : str(stepnum)})
+ elif args.stepname == "cmds":
+ for cmd in cmds:
+ hp.printheader("Step %s/%s: Running command %s" % (stepnum, maxsteps, cmd))
+ bitbakecmd(args.builddir, cmd, report, stepnum, 'd')
+
cmds = utils.getconfiglist("EXTRAPLAINCMDS", ourconfig, args.target, stepnum)
- for cmd in cmds:
- hp.printheader("Step %s/%s: Running 'plain' command %s" % (stepnum, maxsteps, cmd))
- bitbakecmd(args.builddir, cmd, report, stepnum, 'd', oeenv=False)
-
- # Remove any layers we added in a reverse order
- for layer in reversed(layers):
- bitbakecmd(args.builddir, "bitbake-layers remove-layer %s" % layer, report, stepnum, 'a')
-
-if args.publish_dir:
- hp.printheader("Running publish artefacts")
- runcmd([scriptsdir + "/publish-artefacts", args.builddir, args.publish_dir, args.target])
-
-if args.results_dir:
- hp.printheader("Running results collection")
- runcmd([scriptsdir + "/collect-results", args.builddir, args.results_dir, args.target])
-
-if args.build_url and utils.getconfigvar("SENDERRORS", ourconfig, args.target, stepnum):
- hp.printheader("Sending any error reports")
- runcmd([scriptsdir + "/upload-error-reports", args.builddir, args.build_url])
-
-if args.builddir and os.path.exists(args.builddir):
- # Clean up our build directory if things were successful and we're not publishing anything
- # (keep published builds around for longer just in case we need them)
- if not finalret and not args.publish_dir:
- runcmd([scriptsdir + "/../janitor/clobberdir", args.builddir])
- else:
- # Rename any completed build directory so that other builds can't reference paths within it
+ if jcfg:
+ if cmds:
+ jsonconfig.append({"name" : "plain-cmds", "description" : "Running commands %s" % str(cmds), "phase" : str(stepnum)})
+ elif args.stepname == "plain-cmds":
+ for cmd in cmds:
+ hp.printheader("Step %s/%s: Running 'plain' command %s" % (stepnum, maxsteps, cmd))
+ bitbakecmd(args.builddir, cmd, report, stepnum, 'd', oeenv=False)
+
+ if jcfg:
+ if layers:
+ jsonconfig.append({"name" : "remove-layers", "description" : "Removing layers %s" % str(layers), "phase" : str(stepnum)})
+ elif args.stepname == "remove-layers":
+ # Remove any layers we added in a reverse order
+ for layer in reversed(layers):
+ bitbakecmd(args.builddir, "bitbake-layers remove-layer %s" % layer, report, stepnum, 'a')
+
+ if not jcfg:
+ sys.exit(finalret)
+
+if jcfg:
+ for stepnum in range(1, maxsteps + 1):
+ handle_stepnum(stepnum)
+else:
+ try:
+ stepnum = int(args.phase)
+ except ValueError:
+ stepnum = None
+
+ if stepnum is not None:
+ handle_stepnum(stepnum)
+
+
+if jcfg:
+ jsonconfig.append({"name" : "publish", "description" : "Publishing artefacts", "phase" : "finish"})
+elif args.phase == "finish" and args.stepname == "publish":
+ if args.publish_dir:
+ hp.printheader("Running publish artefacts")
+ runcmd([scriptsdir + "/publish-artefacts", args.builddir, args.publish_dir, args.target])
+ sys.exit(0)
+
+if jcfg:
+ jsonconfig.append({"name" : "collect-results", "description" : "Collecting result files", "phase" : "finish"})
+elif args.phase == "finish" and args.stepname == "collect-results":
+ if args.results_dir:
+ hp.printheader("Running results collection")
+ runcmd([scriptsdir + "/collect-results", args.builddir, args.results_dir, args.target])
+ sys.exit(0)
+
+if jcfg:
+ jsonconfig.append({"name" : "send-errors", "description" : "Sending error reports", "phase" : "finish"})
+elif args.phase == "finish" and args.stepname == "send-errors":
+ if args.build_url and utils.getconfigvar("SENDERRORS", ourconfig, args.target, stepnum):
+ hp.printheader("Sending any error reports")
+ runcmd([scriptsdir + "/upload-error-reports", args.builddir, args.build_url])
+ sys.exit(0)
+
+if jcfg:
+ jsonconfig.append({"name" : "builddir-cleanup", "description" : "Cleaning up build directory", "phase" : "finish"})
+elif args.phase == "finish" and args.stepname == "builddir-cleanup":
+ if args.builddir and os.path.exists(args.builddir):
runcmd(["mv", args.builddir, args.builddir + "-renamed"])

-if finalret:
- hp.printheader("There were %s failures" % finalret)
- hp.printheader("Failures in logfiles: %s" % " ".join(errorlogs))
- sys.exit(1)
+if args.json_outputfile:
+ with open(args.json_outputfile, "w") as f:
+ json.dump(jsonconfig, f, indent=4, sort_keys=True)

sys.exit(0)

diff --git a/scripts/utils.py b/scripts/utils.py
index c7eb6c7..bf1d989 100644
--- a/scripts/utils.py
+++ b/scripts/utils.py
@@ -415,7 +415,7 @@ def enable_buildtools_tarball(btdir):
if line in os.environ:
del os.environ[line]

-def setup_buildtools_tarball(ourconfig, workername, btdir):
+def setup_buildtools_tarball(ourconfig, workername, btdir, checkonly=False):
bttarball = None
if "buildtools" in ourconfig and workername:
btcfg = getconfig("buildtools", ourconfig)
@@ -424,6 +424,9 @@ def setup_buildtools_tarball(ourconfig, workername, btdir):
bttarball = btcfg[entry]
break

+ if checkonly:
+ return bttarball
+
btenv = None
if bttarball:
sha256 = None
--
2.25.1
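
In outline, the two-pass flow this patch introduces works as follows: a first invocation with --json-outputfile writes a JSON list of step descriptions (each entry carries a "name", a human-readable "description" and a "phase" of "init", a step number or "finish") without executing anything, and the build then re-invokes run-config once per advertised step, selecting it with --stepname and --phase. The sketch below is a hypothetical driver showing that flow; the option names come from the patch itself, but the positional arguments, the relative script path and the steps.json filename are simplifying assumptions made purely for illustration.

import json
import subprocess

def run_two_pass(target, builddir, jsonfile="steps.json"):
    # Pass 1: ask run-config to describe its steps without executing anything.
    # (Positional arguments are assumed/simplified for this sketch.)
    subprocess.check_call(["scripts/run-config", target, builddir,
                           "--json-outputfile", jsonfile])
    with open(jsonfile) as f:
        # Each entry looks roughly like:
        # {"name": "build-targets", "description": "Building targets ...", "phase": "1"}
        steps = json.load(f)
    # Pass 2: execute each advertised step individually by name and phase.
    for step in steps:
        subprocess.check_call(["scripts/run-config", target, builddir,
                               "--stepname", step["name"],
                               "--phase", step["phase"]])

The checkonly=True parameter added to setup_buildtools_tarball supports the same split: the JSON pass only needs to know whether a buildtools step will be present, without actually extracting the tarball.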


[yocto-autobuilder-helper][dunfell V2 03/15] scripts/run-config: If target is present default to 1 step

Steve Sakoman
 

From: Richard Purdie <richard.purdie@...>

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit e183db413f3b67e0d45a2a9a697aa36b6c90601f)
Signed-off-by: Steve Sakoman <steve@...>
---
scripts/run-config | 1 +
1 file changed, 1 insertion(+)

diff --git a/scripts/run-config b/scripts/run-config
index e600bf9..0b663df 100755
--- a/scripts/run-config
+++ b/scripts/run-config
@@ -84,6 +84,7 @@ elif args.build_type == "full":
maxsteps = 0
stepnum = 0
if args.target in ourconfig['overrides']:
+ maxsteps = 1
for v in ourconfig['overrides'][args.target]:
if v.startswith("step"):
n = int(v[4:])
--
2.25.1


[yocto-autobuilder-helper][dunfell V2 02/15] scripts/run-config: Ensure stepnum has a value when there are no steps

Steve Sakoman
 

From: Richard Purdie <richard.purdie@...>

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit afb6c5a88773d4da4a8dfd88f19654ca585efc95)
Signed-off-by: Steve Sakoman <steve@...>
---
scripts/run-config | 1 +
1 file changed, 1 insertion(+)

diff --git a/scripts/run-config b/scripts/run-config
index ff56fbe..e600bf9 100755
--- a/scripts/run-config
+++ b/scripts/run-config
@@ -82,6 +82,7 @@ elif args.build_type == "full":

# Find out the number of steps this target has
maxsteps = 0
+stepnum = 0
if args.target in ourconfig['overrides']:
for v in ourconfig['overrides'][args.target]:
if v.startswith("step"):
--
2.25.1


[yocto-autobuilder-helper][dunfell V2 01/15] scripts/run-config: Don't execute steps that don't exist!

Steve Sakoman
 

From: Richard Purdie <richard.purdie@...>

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit 290e1bc2ee18d5fa88aca84125fb6691db3db5f9)
Signed-off-by: Steve Sakoman <steve@...>
---
scripts/run-config | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/scripts/run-config b/scripts/run-config
index ce6072c..ff56fbe 100755
--- a/scripts/run-config
+++ b/scripts/run-config
@@ -81,7 +81,7 @@ elif args.build_type == "full":
ourconfig["HELPERSTMACHTARGS"] = "-a -t machine -t toolchain-user"

# Find out the number of steps this target has
-maxsteps = 1
+maxsteps = 0
if args.target in ourconfig['overrides']:
for v in ourconfig['overrides'][args.target]:
if v.startswith("step"):
--
2.25.1
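
Patches 01-03 above each change a single line of the step-counting logic in run-config: with patch 01 an unknown target yields zero steps, patch 02 ensures stepnum is initialised even when no steps run, and patch 03 restores a default of one step for any target that is actually present in the overrides. A rough sketch of the combined result, assuming the loop keeps the highest "stepN" key it sees (the hunks cut off before that part, so the max() handling here is an assumption):

def count_steps(ourconfig, target):
    maxsteps = 0                 # patch 01: unknown targets have no steps to execute
    if target in ourconfig['overrides']:
        maxsteps = 1             # patch 03: a present target defaults to one step
        for key in ourconfig['overrides'][target]:
            if key.startswith("step"):
                # assumed: keep the highest stepN seen
                maxsteps = max(maxsteps, int(key[4:]))
    return maxsteps

# e.g. count_steps({'overrides': {'a-quick': {'step1': {}, 'step3': {}}}}, 'a-quick') == 3
#      count_steps({'overrides': {}}, 'unknown-target') == 0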


[yocto-autobuilder-helper][dunfell V2 00/15] Patch review V2

Steve Sakoman
 

Please review this next set of patches for dunfell.

V2 removes:

scripts: Add runqemu-renice.c for renicing runqemu
scripts/generate-testresult-index: Update after 'posttrigger' renaming
broke the index generation
scripts/generate-testresult-index: Ensure backwards compatibility with
older layout
scripts/generate-testresult-index.py: Ensure we're not always
rerunning resulttool
scripts/generate-testresult-index: Improve index to list test reports,
ptest and buildperf separately
scripts/generate-testresult-index: Reorder buildhistory to improve
display
scripts/generate-testresult-index.py: Use bulma css to improve the
look of the index
config.json: Use buildtools tarball on debian9

The following changes since commit ef52b284e8cbe90c18fdab6a0d6fa8095a2c4ed9:

send-qa-email: Save the QA email in case it doesn't reach the mailing lists. (2021-02-23 10:24:14 +0000)

are available in the Git repository at:

git://git.yoctoproject.org/yocto-autobuilder-helper contrib/sakoman
http://git.yoctoproject.org/cgit.cgi/yocto-autobuilder-helper/log/?h=contrib/sakoman

Richard Purdie (14):
scripts/run-config: Don't execute steps that don't exist!
scripts/run-config: Ensure stepnum has a value when there are no steps
scripts/run-config: If target is present default to 1 step
run-config: Adapt to two pass execution
scripts/run-config: Improve logfile naming
scripts/run-config: Ensure logging to both logfile and stdout
config.json/run-config: Add human readable descriptions of steps
scripts/run-config: Remove redundant boilerplate json
scripts/shared-repo-unpack: Add flush call to update the output more
regularly before buildtools
config.json/run-config: Add support for shortnames and descriptions
config.json: Unbreak qa-extras locked sigs test
config.json: Add further descriptions
scripts/run-config: Disable output buffering
config.json: Split reproducibility tests into their own target

Ross Burton (1):
config: build and test SDKs when using package_deb

config.json | 101 +++++++++++++---
scripts/run-config | 235 ++++++++++++++++++++++++++-----------
scripts/shared-repo-unpack | 1 +
scripts/utils.py | 5 +-
4 files changed, 259 insertions(+), 83 deletions(-)

--
2.25.1


Re: [yocto-autobuilder-helper][dunfell 01/23] scripts: Add runqemu-renice.c for renicing runqemu

Steve Sakoman
 

On Thu, Mar 25, 2021 at 4:20 AM Richard Purdie
<richard.purdie@...> wrote:

On Wed, 2021-03-24 at 14:39 -1000, Steve Sakoman wrote:
From: Richard Purdie <richard.purdie@...>

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit 838be1a00c0383b63d1ab60aa991919404b82655)
Signed-off-by: Steve Sakoman <steve@...>
---
scripts/runqemu-renice.c | 44 ++++++++++++++++++++++++++++++++++++++++
1 file changed, 44 insertions(+)
create mode 100644 scripts/runqemu-renice.c
This is another one which is only relevant to master; it isn't used from the
older branches.
OK, removed!

Steve

[Basically the autobuilder has some code which it pulls from the helper
but uses it from the master branch; it isn't release specific. Offhand,
this is the janitor, the index code and renice; there may be other bits I forget.]

Cheers,

Richard


Re: [yocto-autobuilder-helper][dunfell 01/23] scripts: Add runqemu-renice.c for renicing runqemu

Richard Purdie
 

On Wed, 2021-03-24 at 14:39 -1000, Steve Sakoman wrote:
From: Richard Purdie <richard.purdie@...>

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit 838be1a00c0383b63d1ab60aa991919404b82655)
Signed-off-by: Steve Sakoman <steve@...>
---
 scripts/runqemu-renice.c | 44 ++++++++++++++++++++++++++++++++++++++++
 1 file changed, 44 insertions(+)
 create mode 100644 scripts/runqemu-renice.c
This is another one which is only relevant to master; it isn't used from the
older branches.

[Basically the autobuilder has some code which it pulls from the helper
but uses it from the master branch; it isn't release specific. Offhand,
this is the janitor, the index code and renice; there may be other bits I forget.]

Cheers,

Richard


Re: [yocto-autobuilder-helper][dunfell 07/23] scripts/generate-testresult-index.py: Use bulma css to improve the look of the index

Steve Sakoman
 

On Thu, Mar 25, 2021 at 4:04 AM Richard Purdie
<richard.purdie@...> wrote:

On Wed, 2021-03-24 at 14:39 -1000, Steve Sakoman wrote:
From: Richard Purdie <richard.purdie@...>

Signed-off-by: Richard Purdie <richard.purdie@...>
(cherry picked from commit 4b8eab92ee1f68ec8cd680c62e40b17006fa6efc)
Signed-off-by: Steve Sakoman <steve@...>
---
scripts/generate-testresult-index.py | 34 ++++++++++++++++++++--------
1 file changed, 24 insertions(+), 10 deletions(-)
I'm torn on this and the other generate-testresult-index patches. These
are only ever run from the master branch so in the context of dunfell,
they don't matter. Equally, they therefore don't hurt anything...
I wasn't really sure whether these ran in dunfell; thanks for
confirming that they don't.

I'll drop the generate-testresult-index patches from the pull request too.

Steve
