[PATCH yocto-autobuilder-helper] run-docs-build: build from tags dynamically instead of static list
All new releases are Sphinx-ready, so we exclude the old tags and build all the rest.
Signed-off-by: Michael Halstead <mhalstead@...>
---
 scripts/run-docs-build | 14 ++++++++++----
 1 file changed, 10 insertions(+), 4 deletions(-)
diff --git a/scripts/run-docs-build b/scripts/run-docs-build
index 910f03d..13df34a 100755
--- a/scripts/run-docs-build
+++ b/scripts/run-docs-build
@@ -7,6 +7,7 @@ ypdocs=$2/documentation/
 bbdocs=$3/doc/
 docs_buildtools=/srv/autobuilder/autobuilder.yoctoproject.org/pub/buildtools/x86_64-buildtools-docs-nativesdk-standalone-3.2+snapshot-20201105.sh
 outputdir=$builddir/output
+excluded_tags="yocto-3.1.4 yocto-3.1.3 yocto-3.1.2 yocto-3.1.1 yocto-3.1 yocto-3.0.1 yocto-3.0 yocto-2.6.4 yocto-2.6.3 yocto-2.7.1 yocto-2.6.2 yocto-2.7 yocto-2.6.1 yocto-2.6 yocto-2.5.2 yocto-2.5.1 yocto-2.4.4 yocto-2.4.3 yocto-2.5 yocto-2.3.4 yocto-1.0.2 yocto-1.1.2 yocto-1.2.2 yocto-1.2.1 yocto-1.3 yocto-1.3.1 yocto-1.3.2 yocto-1.4.3 yocto-1.4.2 yocto-1.4.1 yocto-1.4 yocto-2.1.3 yocto-2.4.2 yocto-2.1.1 yocto-2.1.2 yocto-2.0.3 yocto-1.8.2 yocto-2.2.3 yocto-2.4.1 yocto-2.3.3 yocto-2.3.2 yocto-2.4 yocto-2.2.2 yocto-2.3.1 yocto-2.3 yocto-2.2.1 yocto-2.0.2 yocto-2.2 yocto-2.1 yocto-2.0.1 yocto-2.0 yocto-1.8.1 yocto-1.7.3 yocto-1.6.3 yocto-1.7.2 yocto-1.8 yocto-1.5.1"
 
 cd $builddir
@@ -77,13 +78,18 @@ for branch in dunfell gatesgarth hardknott; do
 done
 # Yocto Project releases/tags
-for tag in 3.1.5 3.1.6 3.2 3.2.1 3.2.2 3.2.3; do
+cd $ypdocs
+for tag in $(git tag -l |grep 'yocto-' |sort); do
+    if [[ $excluded_tags =~ $tag ]]; then
+        continue
+    fi
     cd $ypdocs
-    git checkout yocto-$tag
+    git checkout $tag
     make clean
     make publish
-    mkdir $outputdir/$tag
-    cp -r ./_build/final/* $outputdir/$tag
+    version=$(echo $tag | cut -c7-)
+    mkdir $outputdir/$version
+    cp -r ./_build/final/* $outputdir/$version
 done
 
 # Update switchers.js with the copy from master
 cd $ypdocs
-- 
2.30.2
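The filtering and version extraction in the new loop can be sketched stand-alone (the tag names below are illustrative, not the real tag list):

```shell
#!/bin/bash
# Stand-alone sketch of the filtering in run-docs-build.
# Note: [[ $excluded_tags =~ $tag ]] treats $tag as a regex matched
# anywhere inside the exclusion list, which for yocto-* tag names
# behaves like a substring test.
excluded_tags="yocto-3.1.4 yocto-3.1.3 yocto-2.6.4"
for tag in yocto-3.1.4 yocto-3.1.5 yocto-3.2.3; do
    if [[ $excluded_tags =~ $tag ]]; then
        continue                      # excluded pre-Sphinx tag, skip it
    fi
    version=$(echo $tag | cut -c7-)   # strip the 6-char "yocto-" prefix
    echo "$version"
done
```

Running the sketch prints 3.1.5 and 3.2.3; the excluded 3.1.4 is skipped.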
|
|
Re: #yocto #sdk -XILINX/vivado dependencies
#yocto
#sdk

Khem Raj
there is KERNEL_MODULE_AUTOLOAD which could be used to load modules on boot, don't know if that suffices to what you need but worth looking into.
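For reference, KERNEL_MODULE_AUTOLOAD takes a space-separated list of kernel module names to load at boot. A minimal sketch, assuming a hypothetical module name (replace it with whatever module your FPGA driver recipe actually installs):

```
# In the machine .conf or local.conf; "xilinx_example" is a placeholder
# for the real kernel module name provided by your driver recipe
KERNEL_MODULE_AUTOLOAD += "xilinx_example"
```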
On Tue, Apr 6, 2021 at 12:47 PM Monsees, Steven C (US) via lists.yoctoproject.org <steven.monsees=baesystems.com@...> wrote:
Working with zeus, aarch64, with Xilinx vivado dependencies…
Kernel image and bootapp build and run correctly; I need to be able to build the EXT SDK...
How do I incorporate the dependencies of the low-level Xilinx FPGA support (i.e. “vivado”) into the Ext SDK build env?
Is there a way to build in support so that the “module load” command would be usable from the “.conf” or the env-setup script (i.e. “module load vivado…”)?
Thanks,
Steve
|
|
#yocto #sdk -XILINX/vivado dependencies
#yocto
#sdk
Working with zeus, aarch64, with Xilinx vivado dependencies…
Kernel image and bootapp build and run correctly; I need to be able to build the EXT SDK...
How do I incorporate the dependencies of the low-level Xilinx FPGA support (i.e. “vivado”) into the Ext SDK build env?
Is there a way to build in support so that the “module load” command would be usable from the “.conf” or the env-setup script (i.e. “module load vivado…”)?
Thanks,
Steve
|
|
[meta-zephyr][PATCH 1/1] zephyr-flash-pyocd.bbclass: Fix problems with flashing particular boards
From: Zbigniew Bodek <zbigniew.bodek@...>
By default, pyocd uses a generic target type called "cortex_m", which can connect and debug but not flash the memory. Normally pyocd would warn us about using the default target instead of the proper one, but this message wasn't displayed.
Despite the missing target type, the flashing process succeeded, but the results were undefined. On Nitrogen it sometimes worked (especially for small images) and sometimes the programmed device crashed miserably.
Fix the flashing operation by providing a pyocd target type obtained from the optional PYOCD_TARGET variable declared for each machine (chip family).
Signed-off-by: Zbigniew Bodek <zbigniew.bodek@...>
---
 classes/zephyr-flash-pyocd.bbclass | 10 +++++++++-
 conf/machine/include/nrf52832.inc  |  3 +++
 2 files changed, 12 insertions(+), 1 deletion(-)
diff --git a/classes/zephyr-flash-pyocd.bbclass b/classes/zephyr-flash-pyocd.bbclass
index a873be4..6517945 100644
--- a/classes/zephyr-flash-pyocd.bbclass
+++ b/classes/zephyr-flash-pyocd.bbclass
@@ -27,6 +27,9 @@ python do_flash_usb() {
     if not ids:
         bb.fatal("No probe requested for programming. Make sure PYOCD_FLASH_IDS is set.")
 
+    # Fetch target type to pass to the ConnectHelper
+    target = d.getVar('PYOCD_TARGET')
+
     # Program each ID
     for id in ids:
         bb.plain(f"Attempting to flash {os.path.basename(image)} to board {d.getVar('BOARD')} [{id}]")
@@ -35,7 +38,12 @@
         now = 0
         step = 3
         while True:
-            session = ConnectHelper.session_with_chosen_probe(blocking=False, return_first=True, unique_id=id)
+            if target is not None:
+                session = ConnectHelper.session_with_chosen_probe(blocking=False, return_first=True, unique_id=id, target_override=target)
+            else:
+                bb.warn(f"Target type not provided. Flashing may fail or result in an undefined behavior.")
+                session = ConnectHelper.session_with_chosen_probe(blocking=False, return_first=True, unique_id=id)
+
             if session:
                 break
             if now >= timeout:
diff --git a/conf/machine/include/nrf52832.inc b/conf/machine/include/nrf52832.inc
index 73e628a..e938aa6 100644
--- a/conf/machine/include/nrf52832.inc
+++ b/conf/machine/include/nrf52832.inc
@@ -8,3 +8,6 @@ require conf/machine/include/tune-cortexm4.inc
 
 MACHINEOVERRIDES =. "nordic:"
 TUNE_FEATURES = "armv7m cortexm4"
+
+# Target type for this machine used by Pyocd
+PYOCD_TARGET = "nrf52"
-- 
2.25.1
|
|
[meta-zephyr][PATCH 0/1] Fix flashing with pyocd
From: Wojciech Zmuda <wojciech.zmuda@...>
I'm submitting this patch with the consent of the original author, Zbigniew Bodek <zbigniew.bodek@...>.
The flashing process turned out to exhibit undefined behavior, resulting in garbage data being flashed to the connected board. It rarely occurred with small images (e.g. the zephyr-hello-world or zephyr-philosophers samples) but became visible with custom applications.
The exact mechanism is described in the commit log of the patch.
Zbigniew Bodek (1): zephyr-flash-pyocd.bbclass: Fix problems with flashing particular boards
 classes/zephyr-flash-pyocd.bbclass | 10 +++++++++-
 conf/machine/include/nrf52832.inc  |  3 +++
 2 files changed, 12 insertions(+), 1 deletion(-)
-- 
2.25.1
|
|
Re: [PATCH yocto-autobuilder-helper] run-docs-build: add 3.2.3 release to docs build

Nicolas Dechesne
On Tue, Apr 6, 2021 at 7:06 PM Michael Halstead <mhalstead@...> wrote:
Signed-off-by: Michael Halstead <mhalstead@...>
Thanks for the patch.
---
scripts/run-docs-build | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/scripts/run-docs-build b/scripts/run-docs-build
index 4af23fd..910f03d 100755
--- a/scripts/run-docs-build
+++ b/scripts/run-docs-build
@@ -77,7 +77,7 @@ for branch in dunfell gatesgarth hardknott; do
done
# Yocto Project releases/tags
-for tag in 3.1.5 3.1.6 3.2 3.2.1 3.2.2; do
+for tag in 3.1.5 3.1.6 3.2 3.2.1 3.2.2 3.2.3; do
cd $ypdocs
git checkout yocto-$tag
make clean
--
2.30.2
|
|
[PATCH yocto-autobuilder-helper] run-docs-build: add 3.2.3 release to docs build
Signed-off-by: Michael Halstead <mhalstead@...>
---
 scripts/run-docs-build | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/scripts/run-docs-build b/scripts/run-docs-build
index 4af23fd..910f03d 100755
--- a/scripts/run-docs-build
+++ b/scripts/run-docs-build
@@ -77,7 +77,7 @@ for branch in dunfell gatesgarth hardknott; do
 done
 
 # Yocto Project releases/tags
-for tag in 3.1.5 3.1.6 3.2 3.2.1 3.2.2; do
+for tag in 3.1.5 3.1.6 3.2 3.2.1 3.2.2 3.2.3; do
     cd $ypdocs
     git checkout yocto-$tag
     make clean
-- 
2.30.2
|
|
Re: [OE-core] WiFi P2P support

Khem Raj
On 4/6/21 12:33 AM, JH wrote:
Hi, I am building a WiFi wlan Linux image on Zeus using Linux WiFi linux-firmware-sd8801 and connman. I need to add WiFi P2P support; is it sufficient to create wpa-supplicant_2.9.bbappend to enable CONFIG_P2P=y (it is disabled in the wpa-supplicant_2.9 defconfig)? Do I need to make other changes to the OE configuration?
I don't think there is a canned recipe for P2P, so you will discover it along the way. I would suggest looking at https://w1.fi/cgit/hostap/tree/wpa_supplicant/README-P2P. Once you get it working, it would be valuable if you could share what steps you needed to make it work end to end.
Thank you very much. Kind regards, jupier
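This is untested, but one hedged way to flip the option is a bbappend that appends to the configuration the recipe installs; the oe-core recipe copies its bundled defconfig into wpa_supplicant/.config during do_configure, so the path below is an assumption to verify against the zeus recipe before relying on it:

```
# wpa-supplicant_%.bbappend -- hedged sketch, the .config path is an assumption
do_configure_append() {
    # Enable P2P on top of the recipe's default configuration
    echo "CONFIG_P2P=y" >> ${B}/wpa_supplicant/.config
}
```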
|
|
Yocto Project Status WW14'21
Current Dev Position: YP 3.3 M4 (Feature Freeze)
Next Deadline: 5th April 2021 YP 3.3 M4 build
Next Team Meetings:
Key Status/Updates:
- We are now close to building YP 3.3 rc1 for the final release. That build may happen later today if there are no more urgent pending changes. The biggest blockers on release may be the changelog/release notes and migration guide, as those have not been started yet.
- YP 3.2.3 was released.
- We did merge a change to oeqa/runqemu to allow the qemu images to be run from a tmpfs. This should mean writes aren’t held up in the IO queue, which we believe may have been a cause of the intermittent failures. Unfortunately we can’t use cgroups, as the versions on most of the autobuilder workers aren’t recent enough for IO priorities, even ignoring the permission issues, so this tmpfs solution seemed like the best option and was not very invasive to implement. It does mean we now see the “qemu didn’t start in 120s” error much more regularly instead if the copy into tmpfs is locked.
- There are a number of patches queued in master-next but most are version upgrades which will wait for 3.4 to open for development.
- Intermittent autobuilder issues continue to occur and are now at a record high level. You can see the list of failures we’re continuing to see by searching for the “AB-INT” tag in bugzilla: https://bugzilla.yoctoproject.org/buglist.cgi?quicksearch=AB-INT
We are working to identify the load pattern on the infrastructure that seems to trigger these.
Ways to contribute:
YP 3.3 Milestone Dates:
- YP 3.3 M4 build date 2021/04/05
- YP 3.3 M4 Release date 2021/04/30
Planned upcoming dot releases:
- YP 3.2.3 has been released.
- YP 3.1.7 build date 2021/03/29
- YP 3.1.7 release date 2021/04/09
- YP 3.2.4 build date 2021/05/03
- YP 3.2.4 release date 2021/05/14
- YP 3.1.8 build date 2021/05/17
- YP 3.1.8 release date 2021/05/28
Tracking Metrics:
The Yocto Project’s technical governance is through its Technical Steering Committee; more information is available at: https://wiki.yoctoproject.org/wiki/TSC
The status reports are now stored on the wiki at: https://wiki.yoctoproject.org/wiki/Weekly_Status
[If anyone has suggestions for other information you’d like to see on this weekly status update, let us know!]
Thanks,
Stephen K. Jolley, Yocto Project Program Manager
Cell: (208) 244-4460 / Email: sjolley.yp.pm@...
|
|
[yocto-autobuilder2][PATCH v2] README-Guide.md: Add multi-node content, extra config info
The instructions in README-Guide.md are a good starting point, but there
are some additional guidelines in this patch for setting up worker nodes
which may be useful to others who want to run their own Autobuilder
instance. Specifically, it adds:

- Section 1.3 on adding additional worker nodes to a cluster
- Section 1.4 on setting up an NFS share for the controller and workers
  to reference
- A link to the Yocto Manual where the requirements to support running
  builds on Ubuntu/Debian systems is listed
- A note to make sure that any new users (pokybuild3) created for the
  Autobuilder have LANG set in their bash profile

Signed-off-by: Trevor Gamblin <trevor.gamblin@...>
---
 README-Guide.md | 108 +++++++++++++++++++++++++++++++++++++++++++++++-
 1 file changed, 106 insertions(+), 2 deletions(-)

diff --git a/README-Guide.md b/README-Guide.md
index 21dd7c1..832996f 100644
--- a/README-Guide.md
+++ b/README-Guide.md
@@ -6,7 +6,8 @@ This guide will walk through how to install a stand-alone autobuilder controller
 
 The final outputs of this section are a controller and worker installed in the same server, ready for trimming back to an individual organization's needs.
 
- > NOTE: The guide assumes that your host OS has the packages installed to support BitBake for the release(s) you are targeting. Please refer to the Yocto manual for those packages.
+ > NOTE: The guide assumes that your host OS has the packages installed to support BitBake for the release(s) you are targeting. Please refer to the Yocto manual for those packages:
+ https://docs.yoctoproject.org/ref-manual/system-requirements.html#ubuntu-and-debian
 
 The latest version of BuildBot is written in Python 3, so installation via pip3:
 
@@ -43,6 +44,14 @@ yocto-controller/yoctoabb
 yocto-worker
 ```
 
+Before proceeding, make sure that the following is added to the
+pokybuild3 user's exports (e.g. in .bashrc), or builds will fail after
+being triggered:
+
+```
+export LANG=en_US.UTF-8
+```
+
 Next, we need to update the `yocto-controller/yoctoabb/master.cfg` towards the bottom where the `title`, `titleURL`, and `buildbotURL` are all set. This is also where you would specify a different password for binding workers to the master.
 
 Then, we need to update the `yocto-controller/yoctoabb/config.py` to include our worker. In that file, find the line where `workers` is set and add: ["example-worker"]. _NOTE:_ if your worker's name is different, use that here. Section 3.1 discusses how to further refine this list of workers.
@@ -61,7 +70,17 @@ Set `BASE_HOMEDIR` should be your build user's home directory. (There are she
 
 > NOTE: The way the build step is written, the worker will pull a fresh copy of the helper from the server. Therefore these configuration files must be committed to the `yocto-autobuilder-helper` repo location you have specified in `yoctoabb/config.py` because the worker is given a build step that pulls from that repo (see `yoctoabb/builders.py`).
 
-Finally, as root, add the `yocto-*.service` files to `/lib/systemd/system` (See Appendix A). Run: `systemctl daemon-reload`. You should now be able to successfully start these services (e.g., `sudo systemctl start yocto-*`). The controller may take up to 15 seconds to start.
+Finally, it is time to start the Autobuilder. There are two ways to do this:
+
+1. As the pokybuild3 user, run the following:
+
+```
+yocto-autobuilder-helper/janitor/ab-janitor&
+buildbot start yocto-controller
+buildbot-worker start yocto-worker
+```
+
+2. As root, add the `yocto-*.service` files to `/lib/systemd/system` (See Appendix A). Run: `systemctl daemon-reload`. You should now be able to successfully start these services (e.g., `sudo systemctl start yocto-*`). The controller may take up to 15 seconds to start.
 
 ### 1.1) Configuring the Worker's Hash Equivalency Server
@@ -112,6 +131,91 @@ sudo /home/pokybuild3/yocto-worker/qemuarm/build/scripts/runqemu-gen-tapdevs \
 
 In the above command, we assume the a build named qemuarm failed. The value of 8 is the number of tap interfaces to create on the worker.
 
+### 1.3) Adding Dedicated Worker Nodes
+
+Running both the controller and the worker together on a single machine
+can quickly result in long build times and an unresponsive web UI,
+especially if you plan on running any of the more comprehensive builders
+(e.g. a-full). Additional workers can be added to the cluster by
+following the steps in Section 1, except that the yocto-controller steps
+do not need to be repeated. For example, to add a new worker
+"ala-blade51" to an Autobuilder cluster with a yocto-controller at the
+IP address 147.11.105.72:
+
+1. On the yocto-controller host, add the name of the new worker to a worker
+list (or create a new one) e.g. 'workers_wrlx = ["ala-blade51"]' and
+make sure that it is added to the "workers" list.
+
+2. On the new worker node:
+
+```
+useradd -m --system pokybuild3
+cd /home/pokybuild3
+mkdir -p git/trash
+buildbot-worker create-worker -r --umask=0o22 yocto-worker 147.11.105.72 ala-blade51 pass
+chown -R pokybuild3:pokybuild3 /home/pokybuild3
+```
+
+ > Note 1: The URL/IP given to the create-worker command must match the
+host running the yocto-controller.
+
+ > Note 2: The "pass" argument given to the create-worker command must
+match the common "worker_pass" variable set in yocto-controller/yoctoabb/config.py.
+
+3. Once you have finished with configuration, you can run the following
+and the worker should successfully join the cluster and become available
+to use with the builders, where "yocto-worker/" is the directory created
+in step 2:
+
+```
+buildbot-worker start yocto-worker/
+```
+
+
+### 1.4) Configuring NFS for the Autobuilder Cluster
+
+The Yocto Autobuilder relies on NFS to distribute a common sstate cache
+and other outputs between nodes. A similar configuration can be
+deployed by performing the steps given below, which were written for
+Ubuntu 18.04.In order for both the controller and worker nodes to be able
+to access the NFS share without issue, the "pokybuild3" user on all
+systems must have the same UID/GID, or sufficient permissions must be
+granted on the /srv/autobuilder path (or wherever you modified the config
+files to point to). The following instructions assume a controller node
+at 147.11.105.72 and a single worker node at 147.11.105.71, but
+additional worker nodes can be added as needed (see the previous
+section).
+
+1. On the NFS host:
+
+```
+sudo apt install -y nfs-kernel-server
+sudo mkdir -p /srv/autobuilder/autobuilder.yoctoproject.org/
+sudo chown -R pokybuild3:pokybuild3 /srv/autobuilder/autobuilder.yoctoproject.org
+```
+2. Add the following to /etc/exports, replacing the path and IP fields
+   as necessary for each client node:
+```
+/srv/autobuilder/autobuilder.yoctoproject.org/ 147.11.105.71(rw,sync,no_subtree_check)
+```
+
+3. Run
+```
+sudo systemctl restart nfs-kernel-server
+```
+
+4. Adjust the firewall (if required). Example:
+```
+sudo ufw allow from 147.11.105.71 to any port nfs
+```
+
+5. On the client node(s):
+```
+sudo apt-get install nfs-common
+sudo mount 147.11.105.72:/srv/autobuilder/autobuilder.yoctoproject.org/ /srv/autobuilder/autobuilder.yoctoproject.org/
+```
+
 ## 2) Basics
 
 This section is an overview of operation and a few basic configuration file relationships. See Section 3 for more detailed instructions.
-- 
2.30.2
|
|
Re: [yocto-autobuilder2][PATCH] README-Guide.md: Add multi-node content, extra config info
On 2021-04-06 9:44 a.m., Trevor Gamblin wrote:
The instructions in README-Guide.md are a good starting point, but there
are some additional guidelines in this patch for setting up worker nodes
which may be useful to others who want to run their own Autobuilder
instance. Specifically, it adds:
Sending a v2 - adding an extra line or two about starting the
worker(s).
- Section 1.3 on adding additional worker nodes to a cluster
- Section 1.4 on setting up an NFS share for the controller and workers
to reference
- A link to the Yocto Manual where the requirements to support running
builds on Ubuntu/Debian systems is listed
- A note to make sure that any new users (pokybuild3) created for the
Autobuilder have LANG set in their bash profile
Signed-off-by: Trevor Gamblin <trevor.gamblin@...>
---
README-Guide.md | 86 ++++++++++++++++++++++++++++++++++++++++++++++++-
1 file changed, 85 insertions(+), 1 deletion(-)
diff --git a/README-Guide.md b/README-Guide.md
index 21dd7c1..d976fdd 100644
--- a/README-Guide.md
+++ b/README-Guide.md
@@ -6,7 +6,8 @@ This guide will walk through how to install a stand-alone autobuilder controller
The final outputs of this section are a controller and worker installed in the same server, ready for trimming back to an individual organization's needs.
- > NOTE: The guide assumes that your host OS has the packages installed to support BitBake for the release(s) you are targeting. Please refer to the Yocto manual for those packages.
+ > NOTE: The guide assumes that your host OS has the packages installed to support BitBake for the release(s) you are targeting. Please refer to the Yocto manual for those packages:
+ https://docs.yoctoproject.org/ref-manual/system-requirements.html#ubuntu-and-debian
The latest version of BuildBot is written in Python 3, so installation via pip3:
@@ -43,6 +44,14 @@ yocto-controller/yoctoabb
yocto-worker
```
+Before proceeding, make sure that the following is added to the
+pokybuild3 user's exports (e.g. in .bashrc), or builds will fail after
+being triggered:
+
+```
+export LANG=en_US.UTF-8
+```
+
Next, we need to update the `yocto-controller/yoctoabb/master.cfg` towards the bottom where the `title`, `titleURL`, and `buildbotURL` are all set. This is also where you would specify a different password for binding workers to the master.
Then, we need to update the `yocto-controller/yoctoabb/config.py` to include our worker. In that file, find the line where `workers` is set and add: ["example-worker"]. _NOTE:_ if your worker's name is different, use that here. Section 3.1 discusses how to further refine this list of workers.
@@ -112,6 +121,81 @@ sudo /home/pokybuild3/yocto-worker/qemuarm/build/scripts/runqemu-gen-tapdevs \
In the above command, we assume the a build named qemuarm failed. The value of 8 is the number of tap interfaces to create on the worker.
+### 1.3) Adding Dedicated Worker Nodes
+
+Running both the controller and the worker together on a single machine
+can quickly result in long build times and an unresponsive web UI,
+especially if you plan on running any of the more comprehensive builders
+(e.g. a-full). Additional workers can be added to the cluster by
+following the steps in Section 1, except that the yocto-controller steps
+do not need to be repeated. For example, to add a new worker
+"ala-blade51" to an Autobuilder cluster with a yocto-controller at the
+IP address 147.11.105.72:
+
+1. On the yocto-controller host, add the name of the new worker to a worker
+list (or create a new one) e.g. 'workers_wrlx = ["ala-blade51"]' and
+make sure that it is added to the "workers" list.
+
+2. On the new worker node:
+
+```
+useradd -m --system pokybuild3
+cd /home/pokybuild3
+mkdir -p git/trash
+buildbot-worker create-worker -r --umask=0o22 yocto-worker 147.11.105.72 ala-blade51 pass
+chown -R pokybuild3:pokybuild3 /home/pokybuild3
+```
+
+ > Note 1: The URL/IP given to the create-worker command must match the
+host running the yocto-controller.
+
+ > Note 2: The "pass" argument given to the create-worker command must
+match the common "worker_pass" variable set in yocto-controller/yoctoabb/config.py.
+
+
+### 1.4) Configuring NFS for the Autobuilder Cluster
+
+The Yocto Autobuilder relies on NFS to distribute a common sstate cache
+and other outputs between nodes. A similar configuration can be
+deployed by performing the steps given below, which were written for
+Ubuntu 18.04.In order for both the controller and worker nodes to be able
+to access the NFS share without issue, the "pokybuild3" user on all
+systems must have the same UID/GID, or sufficient permissions must be
+granted on the /srv/autobuilder path (or wherever you modified the config
+files to point to). The following instructions assume a controller node
+at 147.11.105.72 and a single worker node at 147.11.105.71, but
+additional worker nodes can be added as needed (see the previous
+section).
+
+1. On the NFS host:
+
+```
+sudo apt install -y nfs-kernel-server
+sudo mkdir -p /srv/autobuilder/autobuilder.yoctoproject.org/
+sudo chown -R pokybuild3:pokybuild3 /srv/autobuilder/autobuilder.yoctoproject.org
+```
+2. Add the following to /etc/exports, replacing the path and IP fields
+ as necessary for each client node:
+```
+/srv/autobuilder/autobuilder.yoctoproject.org/ 147.11.105.71(rw,sync,no_subtree_check)
+```
+
+3. Run
+```
+sudo systemctl restart nfs-kernel-server
+```
+
+4. Adjust the firewall (if required). Example:
+```
+sudo ufw allow from 147.11.105.71 to any port nfs
+```
+
+5. On the client node(s):
+```
+sudo apt-get install nfs-common
+sudo mount 147.11.105.72:/srv/autobuilder/autobuilder.yoctoproject.org/ /srv/autobuilder/autobuilder.yoctoproject.org/
+```
+
## 2) Basics
This section is an overview of operation and a few basic configuration file relationships. See Section 3 for more detailed instructions.
|
|
[yocto-autobuilder2][PATCH] README-Guide.md: Add multi-node content, extra config info
The instructions in README-Guide.md are a good starting point, but there
are some additional guidelines in this patch for setting up worker nodes
which may be useful to others who want to run their own Autobuilder
instance. Specifically, it adds:

- Section 1.3 on adding additional worker nodes to a cluster
- Section 1.4 on setting up an NFS share for the controller and workers
  to reference
- A link to the Yocto Manual where the requirements to support running
  builds on Ubuntu/Debian systems is listed
- A note to make sure that any new users (pokybuild3) created for the
  Autobuilder have LANG set in their bash profile

Signed-off-by: Trevor Gamblin <trevor.gamblin@...>
---
 README-Guide.md | 86 ++++++++++++++++++++++++++++++++++++++++++++++++-
 1 file changed, 85 insertions(+), 1 deletion(-)

diff --git a/README-Guide.md b/README-Guide.md
index 21dd7c1..d976fdd 100644
--- a/README-Guide.md
+++ b/README-Guide.md
@@ -6,7 +6,8 @@ This guide will walk through how to install a stand-alone autobuilder controller
 
 The final outputs of this section are a controller and worker installed in the same server, ready for trimming back to an individual organization's needs.
 
- > NOTE: The guide assumes that your host OS has the packages installed to support BitBake for the release(s) you are targeting. Please refer to the Yocto manual for those packages.
+ > NOTE: The guide assumes that your host OS has the packages installed to support BitBake for the release(s) you are targeting. Please refer to the Yocto manual for those packages:
+ https://docs.yoctoproject.org/ref-manual/system-requirements.html#ubuntu-and-debian
 
 The latest version of BuildBot is written in Python 3, so installation via pip3:
 
@@ -43,6 +44,14 @@ yocto-controller/yoctoabb
 yocto-worker
 ```
 
+Before proceeding, make sure that the following is added to the
+pokybuild3 user's exports (e.g. in .bashrc), or builds will fail after
+being triggered:
+
+```
+export LANG=en_US.UTF-8
+```
+
 Next, we need to update the `yocto-controller/yoctoabb/master.cfg` towards the bottom where the `title`, `titleURL`, and `buildbotURL` are all set. This is also where you would specify a different password for binding workers to the master.
 
 Then, we need to update the `yocto-controller/yoctoabb/config.py` to include our worker. In that file, find the line where `workers` is set and add: ["example-worker"]. _NOTE:_ if your worker's name is different, use that here. Section 3.1 discusses how to further refine this list of workers.
@@ -112,6 +121,81 @@ sudo /home/pokybuild3/yocto-worker/qemuarm/build/scripts/runqemu-gen-tapdevs \
 
 In the above command, we assume the a build named qemuarm failed. The value of 8 is the number of tap interfaces to create on the worker.
 
+### 1.3) Adding Dedicated Worker Nodes
+
+Running both the controller and the worker together on a single machine
+can quickly result in long build times and an unresponsive web UI,
+especially if you plan on running any of the more comprehensive builders
+(e.g. a-full). Additional workers can be added to the cluster by
+following the steps in Section 1, except that the yocto-controller steps
+do not need to be repeated. For example, to add a new worker
+"ala-blade51" to an Autobuilder cluster with a yocto-controller at the
+IP address 147.11.105.72:
+
+1. On the yocto-controller host, add the name of the new worker to a worker
+list (or create a new one) e.g. 'workers_wrlx = ["ala-blade51"]' and
+make sure that it is added to the "workers" list.
+
+2. On the new worker node:
+
+```
+useradd -m --system pokybuild3
+cd /home/pokybuild3
+mkdir -p git/trash
+buildbot-worker create-worker -r --umask=0o22 yocto-worker 147.11.105.72 ala-blade51 pass
+chown -R pokybuild3:pokybuild3 /home/pokybuild3
+```
+
+ > Note 1: The URL/IP given to the create-worker command must match the
+host running the yocto-controller.
+
+ > Note 2: The "pass" argument given to the create-worker command must
+match the common "worker_pass" variable set in yocto-controller/yoctoabb/config.py.
+
+
+### 1.4) Configuring NFS for the Autobuilder Cluster
+
+The Yocto Autobuilder relies on NFS to distribute a common sstate cache
+and other outputs between nodes. A similar configuration can be
+deployed by performing the steps given below, which were written for
+Ubuntu 18.04.In order for both the controller and worker nodes to be able
+to access the NFS share without issue, the "pokybuild3" user on all
+systems must have the same UID/GID, or sufficient permissions must be
+granted on the /srv/autobuilder path (or wherever you modified the config
+files to point to). The following instructions assume a controller node
+at 147.11.105.72 and a single worker node at 147.11.105.71, but
+additional worker nodes can be added as needed (see the previous
+section).
+
+1. On the NFS host:
+
+```
+sudo apt install -y nfs-kernel-server
+sudo mkdir -p /srv/autobuilder/autobuilder.yoctoproject.org/
+sudo chown -R pokybuild3:pokybuild3 /srv/autobuilder/autobuilder.yoctoproject.org
+```
+2. Add the following to /etc/exports, replacing the path and IP fields
+   as necessary for each client node:
+```
+/srv/autobuilder/autobuilder.yoctoproject.org/ 147.11.105.71(rw,sync,no_subtree_check)
+```
+
+3. Run
+```
+sudo systemctl restart nfs-kernel-server
+```
+
+4. Adjust the firewall (if required). Example:
+```
+sudo ufw allow from 147.11.105.71 to any port nfs
+```
+
+5. On the client node(s):
+```
+sudo apt-get install nfs-common
+sudo mount 147.11.105.72:/srv/autobuilder/autobuilder.yoctoproject.org/ /srv/autobuilder/autobuilder.yoctoproject.org/
+```
+
 ## 2) Basics
 
 This section is an overview of operation and a few basic configuration file relationships. See Section 3 for more detailed instructions.
-- 
2.30.2
|
|
Re: [yocto-autobuilder2][RFC][PATCH] README-Guide.md: Add multi-node content, extra config info
On 2021-04-05 2:55 p.m., Michael Halstead wrote:
Signed-off-by: Trevor Gamblin <trevor.gamblin@...>
---
 README-Guide.md | 94 +++++++++++++++++++++++++++++++++++++++++++++++++
1 file changed, 94 insertions(+)
diff --git a/README-Guide.md b/README-Guide.md
index 21dd7c1..8558c48 100644
--- a/README-Guide.md
+++ b/README-Guide.md
@@ -43,6 +43,16 @@ yocto-controller/yoctoabb
yocto-worker
```
+Before proceeding, make sure that the following is added to the
+pokybuild3 user's exports (e.g. in .bashrc), or builds will fail after
+being triggered:
+
+```
+export LC_ALL=en_US.UTF-8
+export LANG=en_US.UTF-8
+export LANGUAGE=en_US.UTF-8
+```
On the AB at typhoon.yocto.io only LANG=en_US.UTF-8 is set. I don't know why LC_ALL or LANGUAGE need to be set on your cluster for builds to succeed.
You're right, I don't need the others. Fixing this for the next revision.
+
Next, we need to update the `yocto-controller/yoctoabb/master.cfg` towards the bottom where the `title`, `titleURL`, and `buildbotURL` are all set. This is also where you would specify a different password for binding workers to the master.
Then, we need to update the `yocto-controller/yoctoabb/config.py` to include our worker. In that file, find the line where `workers` is set and add: ["example-worker"]. _NOTE:_ if your worker's name is different, use that here. Section 3.1 discusses how to further refine this list of workers.
@@ -112,6 +122,90 @@ sudo /home/pokybuild3/yocto-worker/qemuarm/build/scripts/runqemu-gen-tapdevs \
In the above command, we assume the a build named qemuarm failed. The value of 8 is the number of tap interfaces to create on the worker.
+### 1.3) Adding Dedicated Worker Nodes
+
+Running both the controller and the worker together on a single machine
+can quickly result in long build times and an unresponsive web UI,
+especially if you plan on running any of the more comprehensive builders
+(e.g. a-full). Additional workers can be added to the cluster by
+following the steps given above, except that the yocto-controller steps
+do not need to be repeated. For example, to add a new worker
+"ala-blade51" to an Autobuilder cluster with a yocto-controller at the
+IP address 147.11.105.72:
+
+1. On the yocto-controller host, add the name of the new worker to a worker
+list (or create a new one) e.g. 'workers_wrlx = ["ala-blade51"]' and
+make sure that it is added to the "workers" list.
+
+2. On the new worker node:
+
+```
+sudo apt-get install gawk wget git-core diffstat unzip texinfo \
+gcc-multilib build-essential chrpath socat cpio python python3 \
+python3-pip python3-pexpect xz-utils debianutils iputils-ping \
+libsdl1.2-dev xterm
The beginning of README-Guide.md mentions that the user should
reference the Yocto Manual for the required packages, so maybe copying
the list here is inconsistent. I'll put the link near the top of the
doc and we can look at a better way to do this if/when a new version
of this guide makes it into the Manual.
+
+sudo pip3 install buildbot buildbot-www buildbot-waterfall-view \
+buildbot-console-view buildbot-grid-view buildbot-worker
+
+useradd -m --system pokybuild3
+cd /home/pokybuild3
+mkdir -p git/trash
+buildbot-worker create-worker -r --umask=0o22 yocto-worker 147.11.105.72 ala-blade51 pass
+chown -R pokybuild3:pokybuild3 /home/pokybuild3
+```
+
+ > Note 1: The URL/IP given to the create-worker command must match the
+host running the yocto-controller.
+
+ > Note 2: The "pass" argument given to the create-worker command must
+match the common "worker_pass" variable set in yocto-controller/yoctoabb/config.py.
+
+
+### 1.4) Configuring NFS for the Autobuilder Cluster
+
+The Yocto Autobuilder relies on NFS to distribute a common sstate cache
+and other outputs between nodes. A similar configuration can be
+deployed by performing the steps given below, which were written for
+Ubuntu 18.04. In order for both the controller and worker nodes to be able
+to access the NFS share without issue, the "pokybuild3" user on all
+systems must have the same UID/GID, or sufficient permissions must be
+granted on the /srv/autobuilder path (or wherever you modified the config
+files to point to). The following instructions assume a controller node
+at 147.11.105.72 and a single worker node at 147.11.105.71, but
+additional worker nodes can be added as needed (see the previous
+section).
+
+1. On the NFS host:
+
+```
+sudo apt install -y nfs-kernel-server
+sudo mkdir -p /srv/autobuilder/autobuilder.yoctoproject.org/pub/sstate
+sudo chown -R pokybuild3:pokybuild3 /srv
Let's only chown the directories we intend to export. Other data may
be present in /srv and leaving its owner intact is desirable.

Good point. Fixing this for the next patch.

Thanks again for your review!

- Trevor
+```
+2. Add the following to /etc/exports, replacing the path and IP fields
+   as necessary for each client node:
+```
+/srv/autobuilder/autobuilder.yoctoproject.org/pub/sstate 147.11.105.71(rw,sync,no_subtree_check)
+```
+
+3. Run
+```
+sudo systemctl restart nfs-kernel-server
+```
+
+4. Adjust the firewall (if required). Example:
+```
+sudo ufw allow from 147.11.105.71 to any port nfs
+```
+
+5. On the client node(s):
+```
+sudo mkdir -p /srv/autobuilder/autobuilder.yoctoproject.org/pub/sstate
+sudo chown -R pokybuild3:pokybuild3 /srv/autobuilder/
+sudo mount 147.11.105.72:/srv/autobuilder/autobuilder.yoctoproject.org/pub/sstate /srv/autobuilder/autobuilder.yoctoproject.org/pub/sstate
+```
+
## 2) Basics
This section is an overview of operation and a few basic configuration
file relationships. See Section 3 for more detailed instructions.
--
2.30.2
--
Michael Halstead
Linux Foundation / Yocto Project
Systems Operations Engineer
is there a *compelling* use case for "FILESEXTRAPATHS_append"?
over the years, i've always been uncomfortable with the admittedly small number of examples in various layers i've found that append to FILESEXTRAPATHS rather than prepend, so i'm curious if there is an actual, persuasive reason to ever do that.
philosophically, when one uses FILESEXTRAPATHS_prepend, one is effectively saying, "here, use my content. nobody else's. mine. it's right here. you can see it." and there should be no ambiguity, of course.
when you append, however, you're allowing for some hitherto unknown earlier layer to jump in and override some of your content. effectively, it seems (and i'm willing to be corrected) that you're saying, "i have some content that *could* be used, but if some earlier layer has the same content, we'll use that instead."
that seems to open up a huge can of non-determinism, if an earlier layer suddenly introduces (or, conversely, removes) content of the same name without you noticing, and weird things start to happen.
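to make the two cases concrete (the layer and file names here are invented for illustration):

```
# in a hypothetical meta-mine/recipes-bsp/foo/foo_%.bbappend

# prepend: ${THISDIR}/files is searched before any other layer's extra
# paths, so e.g. a defconfig placed there wins
FILESEXTRAPATHS_prepend := "${THISDIR}/files:"

# append: ${THISDIR}/files is searched after every previously-registered
# extra path, so the same defconfig is only picked up if no earlier
# layer supplies one
FILESEXTRAPATHS_append := ":${THISDIR}/files"
```

either way, the extra paths still land ahead of the recipe's own default FILESPATH directories; the difference is purely the ordering between layers.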
so the question is, is there a really, well-defined use case for appending to this variable?
rday
|
EXT SDK, aarch64, zeus…
How do I best modify the EXT SDK build env so as to reference the “vivado” env dependencies ?
Is there a way to build in support so that the “module load” command would be usable from the “.conf” files or the env-setup script (i.e. “module load vivado…”)?
Thanks,
Steve
From: Monsees, Steven C (US)
Sent: Thursday, April 1, 2021 6:52 AM
To: 'Khem Raj' <raj.khem@...>
Cc: yocto@...
Subject: RE: [yocto] #yocto #sdk
Thanks for your patience…
I was able to follow your advice and I am now able to build and install the Extended SDK for our Intel platform…
Can you tell me, when building for Arm/Xilinx based platforms, how you
incorporate the dependency of the low level Xilinx FPGA support on
“Vivado” into the Ext SDK build env ?
Thanks,
Steve
From: Khem Raj <raj.khem@...>
Sent: Thursday, March 25, 2021 5:16 PM
To: Monsees, Steven C (US) <steven.monsees@...>
Cc: yocto@...
Subject: Re: [yocto] #yocto #sdk
I've been looking at this but still find it odd that they are all "virtual:native"/ "poky/meta"/“do_populate_sysroot” related...
It is a "minimum" plus "toolset" build... and it builds clean, yet fails on the install...
The error: "> ERROR: Task quilt-native.do_fetch attempted to execute unexpectedly"
How do you determine unexpected execution ?
Any suggestions on how I should approach this ?
Perhaps get into the install env and do a signatures check for this task
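a sketch of what that signature check could look like, run from the installed eSDK directory after sourcing its environment-setup script (the recipe and task names are taken from the error above; exact output will vary):

```
# compare the two most recent signature data files for the task that
# ran unexpectedly (bitbake-diffsigs ships with bitbake)
bitbake-diffsigs -t quilt-native do_fetch

# alternatively, dump signature data and report why the expected
# sstate/setscene objects were not found
bitbake quilt-native -S printdiff
```

the diff between the two hashes usually points at the variable or dependency whose signature changed between the build that produced the sstate and the eSDK install.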
Thanks,
Steve
-----Original Message-----
From: Monsees, Steven C (US)
Sent: Wednesday, March 24, 2021 2:43 PM
To: 'Khem Raj' <raj.khem@...>
Cc: yocto@...
Subject: RE: [yocto] #yocto #sdk
The output you see is from setting:
SDK_EXT_TYPE = "minimal"
SDK_INCLUDE_TOOLCHAIN = "1"
When building minimal only, there are no errors/warnings (and no tools...)
-----Original Message-----
From: Khem Raj <raj.khem@...>
Sent: Wednesday, March 24, 2021 2:35 PM
To: Monsees, Steven C (US) <steven.monsees@...>
Cc: yocto@...
Subject: Re: [yocto] #yocto #sdk
I think there still are signature differences. Perhaps try to add incrementally on top of the minimal sdk and see where it breaks.
On 3/24/21 9:18 AM, Monsees, Steven C (US) via lists.yoctoproject.org wrote:
> I corrected for the sig warnings, but still have an issue with the
> extended SDK installing correctly
>
> (though I think I am close…)
>
> *Note: The only issue now appears to be around the “…/poky/meta”
> layer… and all with regards to “do_populate_sysroot” task…*
>
> I am building my kernel clean, and update the MIRRORS after…
>
> The unihash & taskhash values are identical with respect to each
> component below…
>
> I am building “uninative” support into the EXT SDK only…
>
> *None of the poky/meta references below are being modified by
> bbappends… should be a straight build*…
>
> The EXT SDK local.conf appears to be setup correctly for my build env…
>
> Am I missing something, a required variable setting, an additional
> support component ? *- seems odd it is all centered around the one
> unmodified layer…*
>
> I am able to build and install the “minimum” EXT SDK correctly, but I
> of course need the toolset…
>
> I would appreciate any advice on how I might resolve this issue.
>
> Install Output:
>
> 10:50 smonsees@yix490016
> /disk0/scratch/smonsees/yocto/workspace_3/builds2/sbcb-default/tmp/dep
> loy/sdk>ls
>
> limws-glibc-x86_64-aiox_orange-corei7-64-toolchain-ext-3.0.4.host.mani
> fest
>
>
limws-glibc-x86_64-aiox_orange-corei7-64-toolchain-ext-3.0.4.sh
>
>
limws-glibc-x86_64-aiox_orange-corei7-64-toolchain-ext-3.0.4.target.ma
> nifest
>
> limws-glibc-x86_64-aiox_orange-corei7-64-toolchain-ext-3.0.4.testdata.
> json
>
> x86_64-buildtools-nativesdk-standalone-3.0.4.host.manifest
>
>
x86_64-buildtools-nativesdk-standalone-3.0.4.sh
>
> x86_64-buildtools-nativesdk-standalone-3.0.4.target.manifest
>
> x86_64-buildtools-nativesdk-standalone-3.0.4.testdata.json
>
> 10:50 smonsees@yix490016
> /disk0/scratch/smonsees/yocto/workspace_3/builds2/sbcb-default/tmp/dep
> loy/sdk>
> ./limws-glibc-x86_64-aiox_orange-corei7-64-toolchain-ext-3.0.4.sh
>
> LIMWS (BAE LIMWS base distro) Extensible SDK installer version 3.0.4
>
> ====================================================================
>
> Enter target directory for SDK (default: ~/limws_sdk):
> /disk0/scratch/smonsees/sbcbSDK_EXT
>
> You are about to install the SDK to
> "/disk0/scratch/smonsees/sbcbSDK_EXT". Proceed [Y/n]? Y
>
> Extracting SDK...............done
>
> Setting it up...
>
> Extracting buildtools...
>
> Preparing build system...
>
> Parsing recipes: 100%
> |#####################################################################
> |########################|
> Time: 0:01:33
>
> Initialising tasks: 100%
> |#####################################################################
> |#####################|
> Time: 0:00:00
>
> Checking sstate mirror object availability: 100%
> |##################################################################|
> Time: 0:00:00
>
> ERROR: Task quilt-native.do_fetch attempted to execute unexpectedly
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/
> gcc/libgcc_9.2.bb:do_populate_sysroot,
> unihash
> d5a9dff48660903403f33fe67d6d43e03c97c03232c6d8f0ed71f99a94670bce,
> taskhash
> d5a9dff48660903403f33fe67d6d43e03c97c03232c6d8f0ed71f99a94670bce
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-support/gmp/gmp_6.1.2.bb:do_populate_sysroot,
> unihash
> cde9ef4fc769ee9a2733a1023534c15bfe199009270bcebb6c24c638729194dc,
> taskhash
> cde9ef4fc769ee9a2733a1023534c15bfe199009270bcebb6c24c638729194dc
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/
> libtool/libtool-native_2.4.6.bb:do_populate_sysroot,
> unihash
> a1def57d3e655defdf1f85eec749be672ffe52a0a3c247585da9d6c57617cca2,
> taskhash
> a1def57d3e655defdf1f85eec749be672ffe52a0a3c247585da9d6c57617cca2
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/
> gcc/gcc-cross_9.2.bb:do_populate_sysroot,
> unihash
> 5f0f3533314c754b184e6f63f11ef2b570c7a5d47bc18fee2b4217aa294f08eb,
> taskhash
> 5f0f3533314c754b184e6f63f11ef2b570c7a5d47bc18fee2b4217aa294f08eb
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-connectivity/openssl/openssl_1.1.1g.bb:do_populate_sysroot,
> unihash
> d5e6bedb0cfb876a2925ea2e7f3bd00b090326b1cebf1182a6322974a6f055a3,
> taskhash
> d5e6bedb0cfb876a2925ea2e7f3bd00b090326b1cebf1182a6322974a6f055a3
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/python/python3_3.7.8.bb:do_populate_sysroot,
> unihash
> 8ee0c0eafd3b1c3f774a26f59659fc0c563816b6badfa57d9fa9097a182b1de5,
> taskhash
> 8ee0c0eafd3b1c3f774a26f59659fc0c563816b6badfa57d9fa9097a182b1de5
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-gnome/gtk-doc/gtk-doc_1.31.bb:do_populate_sysroot,
> unihash
> fbc7421c8a324ed0cbca81f98430f509ce4cf6593b0961cad8109d467df9e35e,
> taskhash
> fbc7421c8a324ed0cbca81f98430f509ce4cf6593b0961cad8109d467df9e35e
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/meta
> /meta-extsdk-toolchain.bb:do_populate_sysroot,
> unihash
> b9d46f79061ad82c4630a3db00aefe484f743a84a526e8afb24d953d04752276,
> taskhash
> b9d46f79061ad82c4630a3db00aefe484f743a84a526e8afb24d953d04752276
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-support/attr/attr_2.4.47.bb:do_populate_sysroot,
> unihash
> 3a6c84cf03e3103e46c02b01aed446fc31617f348b40d9e51b5b2ee8c2f3d0ee,
> taskhash
> 3a6c84cf03e3103e46c02b01aed446fc31617f348b40d9e51b5b2ee8c2f3d0ee
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-support/libmpc/libmpc_1.1.0.bb:do_populate_sysroot,
> unihash
> 39109487309272ea510afb753a0dd84775625c73f7a261b9d0078fe0ea718f17,
> taskhash
> 39109487309272ea510afb753a0dd84775625c73f7a261b9d0078fe0ea718f17
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-core/util-linux/util-linux_2.34.bb:do_populate_sysroot,
> unihash
> 51964ba6ff2cd62ad6d9077e9fddfe53be566eb23beca10e9c882a1eee20aa5d,
> taskhash
> 51964ba6ff2cd62ad6d9077e9fddfe53be566eb23beca10e9c882a1eee20aa5d
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-multimedia/libpng/libpng_1.6.37.bb:do_populate_sysroot,
> unihash
> 6d92093db77054a96cd23e00ca2bf3468a9ae8ebddc191a59e1a0136778d6be1,
> taskhash
> 6d92093db77054a96cd23e00ca2bf3468a9ae8ebddc191a59e1a0136778d6be1
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/
> gcc/gcc-cross_9.2.bb:do_gcc_stash_builddir,
> unihash
> 62ba54c4db5ba11db400ba0277892d92f665f35b5c334c17f8e6ad9ded9c16b1,
> taskhash
> 62ba54c4db5ba11db400ba0277892d92f665f35b5c334c17f8e6ad9ded9c16b1
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-extended/xz/xz_5.2.4.bb:do_populate_sysroot,
> unihash
> 01723d04843fdbeec3fabd109c34281bd49c0979e09c722b2c189335cb6c957a,
> taskhash
> 01723d04843fdbeec3fabd109c34281bd49c0979e09c722b2c189335cb6c957a
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/
> m4/m4-native_1.4.18.bb:do_populate_sysroot,
> unihash
> 19b266239a8f93f5273ac6213d0f58a73bfc1ecbe84c5cfd273f5351b0740ca1,
> taskhash
> 19b266239a8f93f5273ac6213d0f58a73bfc1ecbe84c5cfd273f5351b0740ca1
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-graphics/xorg-lib/pixman_0.38.4.bb:do_populate_sysroot,
> unihash
> 66cca6669fc3fdc571970b1ccabb7a8b334139013df8b71c8b033d15705ec5a7,
> taskhash
> 66cca6669fc3fdc571970b1ccabb7a8b334139013df8b71c8b033d15705ec5a7
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/unfs3/unfs3_git.bb:do_populate_sysroot,
> unihash
> 46e3dd7e07935b77a618c4587f5bc8dbaaff1ba030e779683e2bf2679f57c8fb,
> taskhash
> 46e3dd7e07935b77a618c4587f5bc8dbaaff1ba030e779683e2bf2679f57c8fb
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/
> gcc/gcc-runtime_9.2.bb:do_populate_sysroot,
> unihash
> 7200138112d31332099cf647ee83441c6739d6f276f2ba859bd440b7a4eed9fb,
> taskhash
> 7200138112d31332099cf647ee83441c6739d6f276f2ba859bd440b7a4eed9fb
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/meson/meson_0.51.2.bb:do_populate_sysroot,
> unihash
> ac801ce28f4bf45c7c08e2721a765872a1bd6561f783c570ed47dad7e9642901,
> taskhash
> ac801ce28f4bf45c7c08e2721a765872a1bd6561f783c570ed47dad7e9642901
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-core/expat/expat_2.2.8.bb:do_populate_sysroot,
> unihash
> c47a5a2b37341edbfeab516b931c8f0015b52d6159f251e70f57e086a6502fe1,
> taskhash
> c47a5a2b37341edbfeab516b931c8f0015b52d6159f251e70f57e086a6502fe1
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/bison/bison_3.4.1.bb:do_populate_sysroot,
> unihash
> f8fb4d2026cb4192c03bc75c357f9890dcb4f7593d23407f9a60c32d383d7c57,
> taskhash
> f8fb4d2026cb4192c03bc75c357f9890dcb4f7593d23407f9a60c32d383d7c57
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-kernel/dtc/dtc_1.5.1.bb:do_populate_sysroot,
> unihash
> 8ee1e9314ae7a6235f2ec876f7d30336d6e65d7879ac17cd1044ac3f20f969ec,
> taskhash
> 8ee1e9314ae7a6235f2ec876f7d30336d6e65d7879ac17cd1044ac3f20f969ec
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/autoconf-archive/autoconf-archive_2019.01.06.bb:do_popu
> late_sysroot,
> unihash
> 7aaaf6c0cf3a9c104029683b93a62b965e91827c487ee707a23c84560aea1d3e,
> taskhash
> 7aaaf6c0cf3a9c104029683b93a62b965e91827c487ee707a23c84560aea1d3e
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-extended/bzip2/bzip2_1.0.8.bb:do_populate_sysroot,
> unihash
> 66c8139add58f12cae0334108b226f4f91f1fdb34fd34822c9ff9612d6c11b64,
> taskhash
> 66c8139add58f12cae0334108b226f4f91f1fdb34fd34822c9ff9612d6c11b64
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-graphics/xorg-util/util-macros_1.19.2.bb:do_populate_sysroot,
> unihash
> 070d343bb7de5e6402f4190283e6d40ca33031eac71601d7ab92a92ef0e175d0,
> taskhash
> 070d343bb7de5e6402f4190283e6d40ca33031eac71601d7ab92a92ef0e175d0
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/python/python3-setuptools_41.2.0.bb:do_populate_sysroot
> ,
> unihash
> e8771b3e23f0d5c3e799b093dd9657a2fd863abf459fa500399930111a8fd388,
> taskhash
> e8771b3e23f0d5c3e799b093dd9657a2fd863abf459fa500399930111a8fd388
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/
> qemu/qemu-system-native_4.1.0.bb:do_populate_sysroot,
> unihash
> 33ac287a8d8aded61eb77dd21cb3c54986126430c78a243f706a5917ef0a0183,
> taskhash
> 33ac287a8d8aded61eb77dd21cb3c54986126430c78a243f706a5917ef0a0183
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-support/mpfr/mpfr_4.0.2.bb:do_populate_sysroot,
> unihash
> 25d61942ed599e037b2e75a5b722ce5ff251005c2a4ee23e9faef34c9e54777b,
> taskhash
> 25d61942ed599e037b2e75a5b722ce5ff251005c2a4ee23e9faef34c9e54777b
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-support/re2c/re2c_1.0.1.bb:do_populate_sysroot,
> unihash
> 6ebe8680a921a8927ef6cd0061b2b50667bb787be010c8ee4ca6ccc3593024b7,
> taskhash
> 6ebe8680a921a8927ef6cd0061b2b50667bb787be010c8ee4ca6ccc3593024b7
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/pseudo/pseudo_git.bb:do_populate_sysroot,
> unihash
> 28e64747a95953ec8626d3027958e12d1fd854a7615bc69cf5adbbc3d49c323a,
> taskhash
> 28e64747a95953ec8626d3027958e12d1fd854a7615bc69cf5adbbc3d49c323a
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-extended/libtirpc/libtirpc_1.1.4.bb:do_populate_sysroot,
> unihash
> 147f1ca7d20e89f2786b48fcda4ebaf36c1c3d941b53b0b8b56c42beb9220c1d,
> taskhash
> 147f1ca7d20e89f2786b48fcda4ebaf36c1c3d941b53b0b8b56c42beb9220c1d
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/
> qemu/qemu-native_4.1.0.bb:do_populate_sysroot,
> unihash
> 00651d4d53b4b7b10e44770326d5f0a1f5482c1262671621523ba12c21508977,
> taskhash
> 00651d4d53b4b7b10e44770326d5f0a1f5482c1262671621523ba12c21508977
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-core/gettext/gettext_0.19.8.1.bb:do_populate_sysroot,
> unihash
> bf9b767f8e30be92fa06079f2e7350aa304648b0d113829d315e6cb64bad0565,
> taskhash
> bf9b767f8e30be92fa06079f2e7350aa304648b0d113829d315e6cb64bad0565
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/glib
> c/glibc_2.30.bb:do_stash_locale,
> unihash
> d64e054d019028151912ffface31585789df48f4de7e3a66b201cd614c2f4aca,
> taskhash
> d64e054d019028151912ffface31585789df48f4de7e3a66b201cd614c2f4aca
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/ninja/ninja_1.9.0.bb:do_populate_sysroot,
> unihash
> ab3ecdf2561adc51338d36576f60eab1e05fc09ed69bb6444075d7adbeb57b9e,
> taskhash
> ab3ecdf2561adc51338d36576f60eab1e05fc09ed69bb6444075d7adbeb57b9e
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-support/sqlite/sqlite3_3.29.0.bb:do_populate_sysroot,
> unihash
> c1a988a16d4368098e178f7fe5f0e2e5f8adf4fa485a7b79c4c093a38005264e,
> taskhash
> c1a988a16d4368098e178f7fe5f0e2e5f8adf4fa485a7b79c4c093a38005264e
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/automake/automake_1.16.1.bb:do_populate_sysroot,
> unihash
> ad223f3318940531fa279bd74480cd6410abc46644f8fe98f7399a71cfe09179,
> taskhash
> ad223f3318940531fa279bd74480cd6410abc46644f8fe98f7399a71cfe09179
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/patch/patch_2.7.6.bb:do_populate_sysroot,
> unihash
> be5aa9a356c12c9b4220c3d3d6dfe16c737e9be88e7d331c0511b275e4d603c4,
> taskhash
> be5aa9a356c12c9b4220c3d3d6dfe16c737e9be88e7d331c0511b275e4d603c4
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/flex/flex_2.6.0.bb:do_populate_sysroot,
> unihash
> 9c37027658f2832321efe3657d91f29d1bf286ad1fda0c9916b256adfa246455,
> taskhash
> 9c37027658f2832321efe3657d91f29d1bf286ad1fda0c9916b256adfa246455
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-core/readline/readline_8.0.bb:do_populate_sysroot,
> unihash
> 3d909d0d6de7cf72b631aa1805efc1147459bef5bddca5f60ff07022ba777e0e,
> taskhash
> 3d909d0d6de7cf72b631aa1805efc1147459bef5bddca5f60ff07022ba777e0e
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-extended/libnsl/libnsl2_git.bb:do_populate_sysroot,
> unihash
> 19357ca137093c4e1e063d14a0d3844f889dce933a4eebdc34acf0c321d707ec,
> taskhash
> 19357ca137093c4e1e063d14a0d3844f889dce933a4eebdc34acf0c321d707ec
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/glib
> c/glibc_2.30.bb:do_populate_sysroot,
> unihash
> df6ecc8017c1a3fa278fc743c85fa6049465da674f169777b9a544eb423b84b5,
> taskhash
> df6ecc8017c1a3fa278fc743c85fa6049465da674f169777b9a544eb423b84b5
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-support/gdbm/gdbm_1.18.1.bb:do_populate_sysroot,
> unihash
> 8b0d7a859afc0cc39a32d26b8d5c79b5c1b8970a8e5d566098ff59fc916335f5,
> taskhash
> 8b0d7a859afc0cc39a32d26b8d5c79b5c1b8970a8e5d566098ff59fc916335f5
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-support/libcap-ng/libcap-ng_0.7.9.bb:do_populate_sysroot,
> unihash
> 784e3c4b04d227379d94e85251233a568fb9e9f841d737584882d0da0b009d5c,
> taskhash
> 784e3c4b04d227379d94e85251233a568fb9e9f841d737584882d0da0b009d5c
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-core/zlib/zlib_1.2.11.bb:do_populate_sysroot,
> unihash
> 770d0b4be83a17d65464ade3adc3c6be443a9f8fffbe53d303c5765674a274d7,
> taskhash
> 770d0b4be83a17d65464ade3adc3c6be443a9f8fffbe53d303c5765674a274d7
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-extended/unzip/unzip_6.0.bb:do_populate_sysroot,
> unihash
> 82d365cde8a3375461fb47f650aa3fd7c8aa029b0cd2f23ccd38b6f73a9902d9,
> taskhash
> 82d365cde8a3375461fb47f650aa3fd7c8aa029b0cd2f23ccd38b6f73a9902d9
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/pkgconfig/pkgconfig_git.bb:do_populate_sysroot,
> unihash
> de3b4482bf2a0878b99c904fecac19e917d374838da4c9df62929bb14d1282d1,
> taskhash
> de3b4482bf2a0878b99c904fecac19e917d374838da4c9df62929bb14d1282d1
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/
> binutils/binutils-cross_2.32.bb:do_populate_sysroot,
> unihash
> 50ce76092848b0214480dd7a4f0fcc7e5927f4f8071601bc094847d20d2c879d,
> taskhash
> 50ce76092848b0214480dd7a4f0fcc7e5927f4f8071601bc094847d20d2c879d
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/gnu-config/gnu-config_git.bb:do_populate_sysroot,
> unihash
> 90db72e6ab74de51a86e0b14980b2c204076fc3ef8297a374b660d8645853cac,
> taskhash
> 90db72e6ab74de51a86e0b14980b2c204076fc3ef8297a374b660d8645853cac
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-kernel/li
> nux-libc-headers/linux-libc-headers_5.2.bb:do_populate_sysroot,
> unihash
> 7b6f6e59c3431987b308c78d6f72e5aefae1b9afbf158a47540f0db5e04ebdb0,
> taskhash
> 7b6f6e59c3431987b308c78d6f72e5aefae1b9afbf158a47540f0db5e04ebdb0
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/
> gdb/gdb-cross_8.3.1.bb:do_populate_sysroot,
> unihash
> c623832386a7201b2a59b170e7c9015edfffbfb21dbec6ab44e81662d1d7c504,
> taskhash
> c623832386a7201b2a59b170e7c9015edfffbfb21dbec6ab44e81662d1d7c504
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/
> quilt/quilt-native_0.66.bb:do_populate_sysroot,
> unihash
> 23290d029e88d49579ce286326ba82d42ad77874a2cd0e05e71166b964190822,
> taskhash
> 23290d029e88d49579ce286326ba82d42ad77874a2cd0e05e71166b964190822
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-support/libffi/libffi_3.3~rc0.bb:do_populate_sysroot,
> unihash
> 5be2fdefd4b14100290247d24d2df8da234ea32cb91e4508ffd793aabc06d30e,
> taskhash
> 5be2fdefd4b14100290247d24d2df8da234ea32cb91e4508ffd793aabc06d30e
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/make/make_4.2.1.bb:do_populate_sysroot,
> unihash
> 7a82e867fd7be399f5d92200e43de6e7d9d42ad98e5f771a6e54a0975053ae2e,
> taskhash
> 7a82e867fd7be399f5d92200e43de6e7d9d42ad98e5f771a6e54a0975053ae2e
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-extended/
> texinfo-dummy-native/texinfo-dummy-native.bb:do_populate_sysroot,
> unihash
> 2d20a98fe86b071366643317507293df9594c15528ef49f3fbeeffe4af532501,
> taskhash
> 2d20a98fe86b071366643317507293df9594c15528ef49f3fbeeffe4af532501
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/gett
> ext/gettext-minimal-native_0.19.8.1.bb:do_populate_sysroot,
> unihash
> d579308c5efa4cef283785d540731bf0f02dffeef6ea677b0fa7cec6332e7902,
> taskhash
> d579308c5efa4cef283785d540731bf0f02dffeef6ea677b0fa7cec6332e7902
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-core/glib-2.0/glib-2.0_2.60.7.bb:do_populate_sysroot,
> unihash
> b7ff5dcd7278fab62aa716be6cf652bcc1d463d884738fb3232297fe6f81880a,
> taskhash
> b7ff5dcd7278fab62aa716be6cf652bcc1d463d884738fb3232297fe6f81880a
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-extended/gperf/gperf_3.1.bb:do_populate_sysroot,
> unihash
> 6765ae416e5360039914d6216c0d02541c5afc070545804303d75d1016b7b460,
> taskhash
> 6765ae416e5360039914d6216c0d02541c5afc070545804303d75d1016b7b460
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-core/ncurses/ncurses_6.1+20190803.bb:do_populate_sysroot,
> unihash
> f468831b3be537588a35b7fdf2e1a46dc52d1737fbf168c0e83ff0f162a99cf9,
> taskhash
> f468831b3be537588a35b7fdf2e1a46dc52d1737fbf168c0e83ff0f162a99cf9
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-multimedia/alsa/alsa-lib_1.1.9.bb:do_populate_sysroot,
> unihash
> 39d5b05d5ec0e2b2abbb710c7c31f17d3047a255f5a11deb121d7323e06fb900,
> taskhash
> 39d5b05d5ec0e2b2abbb710c7c31f17d3047a255f5a11deb121d7323e06fb900
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-support/libpcre/libpcre_8.43.bb:do_populate_sysroot,
> unihash
> 3eed4e011c853b98bf31e1c1b2eee2073aeb4ef0546c9bd230f2bfcc3ac05088,
> taskhash
> 3eed4e011c853b98bf31e1c1b2eee2073aeb4ef0546c9bd230f2bfcc3ac05088
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/autoconf/autoconf_2.69.bb:do_populate_sysroot,
> unihash
> 373490cc20455b0913b69b35ab9cc61340356d7b27f7ecb6cf51a3ad9459a068,
> taskhash
> 373490cc20455b0913b69b35ab9cc61340356d7b27f7ecb6cf51a3ad9459a068
>
> Task
> virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/re
> cipes-devtools/unifdef/unifdef_2.11.bb:do_populate_sysroot,
> unihash
> 3e6814932d42ab266096948b4b81f9c1fbdbb26f7b990963ca4322a718e13170,
> taskhash
> 3e6814932d42ab266096948b4b81f9c1fbdbb26f7b990963ca4322a718e13170
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/
> gcc/libgcc-initial_9.2.bb:do_populate_sysroot,
> unihash
> 07136816c5d9bb085d8dab671c1689d08254d92b7e0edbb4a23abb3ae2628bea,
> taskhash
> 07136816c5d9bb085d8dab671c1689d08254d92b7e0edbb4a23abb3ae2628bea
>
> Task
> /disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/
> qemu/qemu-helper-native_1.0.bb:do_populate_sysroot,
> unihash
> 4ba7e532221d903e4c3556460d09d7bf7eabc9c4ca73f6a481849be0eaba23a3,
> taskhash
> 4ba7e532221d903e4c3556460d09d7bf7eabc9c4ca73f6a481849be0eaba23a3
>
> This is usually due to missing setscene tasks. Those missing in this
> build were:
> {'/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/gettext/gettext-minimal-native_0.19.8.1.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/glibc/glibc_2.30.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/glibc/glibc_2.30.bb:do_stash_locale',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/meta/meta-extsdk-toolchain.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/binutils/binutils-cross_2.32.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/gcc/gcc-cross_9.2.bb:do_gcc_stash_builddir',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/gcc/gcc-cross_9.2.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/gcc/gcc-runtime_9.2.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/gcc/libgcc-initial_9.2.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/gcc/libgcc_9.2.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/gdb/gdb-cross_8.3.1.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/libtool/libtool-native_2.4.6.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/m4/m4-native_1.4.18.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/qemu/qemu-helper-native_1.0.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/qemu/qemu-native_4.1.0.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/qemu/qemu-system-native_4.1.0.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/quilt/quilt-native_0.66.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-extended/texinfo-dummy-native/texinfo-dummy-native.bb:do_populate_sysroot',
> '/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-kernel/linux-libc-headers/linux-libc-headers_5.2.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-connectivity/openssl/openssl_1.1.1g.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/expat/expat_2.2.8.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/gettext/gettext_0.19.8.1.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/glib-2.0/glib-2.0_2.60.7.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/ncurses/ncurses_6.1+20190803.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/readline/readline_8.0.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/util-linux/util-linux_2.34.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-core/zlib/zlib_1.2.11.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/autoconf-archive/autoconf-archive_2019.01.06.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/autoconf/autoconf_2.69.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/automake/automake_1.16.1.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/bison/bison_3.4.1.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/flex/flex_2.6.0.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/gnu-config/gnu-config_git.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/make/make_4.2.1.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/meson/meson_0.51.2.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/ninja/ninja_1.9.0.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/patch/patch_2.7.6.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/pkgconfig/pkgconfig_git.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/pseudo/pseudo_git.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/python/python3-setuptools_41.2.0.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/python/python3_3.7.8.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/unfs3/unfs3_git.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/unifdef/unifdef_2.11.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-extended/bzip2/bzip2_1.0.8.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-extended/gperf/gperf_3.1.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-extended/libnsl/libnsl2_git.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-extended/libtirpc/libtirpc_1.1.4.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-extended/unzip/unzip_6.0.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-extended/xz/xz_5.2.4.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-gnome/gtk-doc/gtk-doc_1.31.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-graphics/xorg-lib/pixman_0.38.4.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-graphics/xorg-util/util-macros_1.19.2.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-kernel/dtc/dtc_1.5.1.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-multimedia/alsa/alsa-lib_1.1.9.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-multimedia/libpng/libpng_1.6.37.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-support/attr/attr_2.4.47.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-support/gdbm/gdbm_1.18.1.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-support/gmp/gmp_6.1.2.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-support/libcap-ng/libcap-ng_0.7.9.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-support/libffi/libffi_3.3~rc0.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-support/libmpc/libmpc_1.1.0.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-support/libpcre/libpcre_8.43.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-support/mpfr/mpfr_4.0.2.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-support/re2c/re2c_1.0.1.bb:do_populate_sysroot',
> 'virtual:native:/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-support/sqlite/sqlite3_3.29.0.bb:do_populate_sysroot'}
>
> ERROR: Task (/disk0/scratch/smonsees/sbcbSDK_EXT/layers/poky/meta/recipes-devtools/quilt/quilt-native_0.66.bb:do_fetch) failed with exit code 'setscene whitelist'
>
> ERROR: SDK preparation failed: error log written to /disk0/scratch/smonsees/sbcbSDK_EXT/preparing_build_system.log
>
> 10:52 smonsees@yix490016 /disk0/scratch/smonsees/yocto/workspace_3/builds2/sbcb-default/tmp/deploy/sdk>
>
> *From:*Khem Raj <raj.khem@...>
> *Sent:* Thursday, March 4, 2021 1:22 PM
> *To:* Monsees, Steven C (US) <steven.monsees@...>
> *Cc:* yocto@...
> *Subject:* Re: [yocto] #yocto #sdk
>
> *_External Email Alert_*
>
> *This email has been sent from an account outside of the BAE Systems
> network.*
>
> Please treat the email with caution, especially if you are requested
> to click on a link, decrypt/open an attachment, or enable macros. For
> further information on how to spot phishing, access “Cybersecurity
> OneSpace Page” and report phishing by clicking the button “Report
> Phishing” on the Outlook toolbar.
>
> Right, the change seems to be happening in the task checksums, which
> happens if some bitbake variables differ between when the SDK is built
> and when it is being installed (when it runs the parse again). Perhaps
> the workspace under the hood is still accessible and you can use
> bitbake-diffsigs to narrow down the variable that is changing.
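[Editor's note: the bitbake-diffsigs suggestion above can be sketched as follows. These are illustrative commands only; the task name comes from the error output in this thread, and the sigdata file paths are hypothetical placeholders, not real locations.]

```
# Run inside the installed eSDK after sourcing its environment setup
# script (the exact script name varies per SDK build).
cd /disk0/scratch/smonsees/testSDK
. environment-setup-*

# Compare the signature data from the last two runs of the failing task
# to see which variables changed between them:
bitbake-diffsigs -t quilt-native do_fetch

# Alternatively, compare two sigdata/siginfo files directly once you
# have located them under sstate-cache/ or tmp/stamps/ (paths here are
# placeholders):
bitbake-diffsigs path/to/first.do_fetch.sigdata path/to/second.do_fetch.sigdata
```

The output lists the variables whose values differ between the two signatures, which is usually enough to identify what changed between SDK build time and install time.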
>
> On Thu, Mar 4, 2021 at 9:38 AM Monsees, Steven C (US) via
> lists.yoctoproject.org <http://lists.yoctoproject.org>
> <steven.monsees=baesystems.com@...
> <mailto:baesystems.com@...>> wrote:
>
> I am seeing similar issues on line for my eSDK install issue, but
> no resolutions…
>
> Can someone advise on best course of action to debug this ?
>
> 11:10 smonsees@yix490016
> /disk0/scratch/smonsees/yocto/workspace_3/builds2/sbcb-default/tmp/deploy/sdk>
> ./limws-glibc-x86_64-aiox_orange-corei7-64-toolchain-ext-3.0.4.sh
>
> LIMWS (BAE LIMWS base distro) Extensible SDK installer version
> 3.0.4
>
>
> ====================================================================
>
> Enter target directory for SDK (default: ~/limws_sdk):
> /disk0/scratch/smonsees/testSDK
>
> You are about to install the SDK to
> "/disk0/scratch/smonsees/testSDK". Proceed [Y/n]? Y
>
> Extracting SDK..............................................................................done
>
> Setting it up...
>
> Extracting buildtools...
>
> Preparing build system...
>
> Parsing recipes: 100%
> |##########################################################################################|
> Time: 0:01:36
>
> Initialising tasks: 100%
> |#######################################################################################|
> Time: 0:00:04
>
> Checking sstate mirror object availability: 100%
> |###############################################################|
> Time: 0:00:02
>
> WARNING: The efitools:do_compile sig is computed to be
> 5851605e22907038837428950427053e22ea655641a08b5dafa39d6d6e1c5e15,
> but the sig is locked to
> b81a26e3591c71acd3d22212bfdb70a15a0df49af72e7634e6a39851f16e18b5 in
> SIGGEN_LOCKEDSIGS_t-corei7-64
>
> The monkeysphere:do_install sig is computed to be
> 13a65b26dfff91f2432a8064d98003559eafffa214d81c3c6ea112c2dfba0391,
> but the sig is locked to
> 2058fc9032b0e7f5c1ea358de4fa8d25ccec7204b73ebc636e79222d8cc00469 in
> SIGGEN_LOCKEDSIGS_t-corei7-64
>
> The signature:do_compile sig is computed to be
> ac0c5c19cdbe7484046657ccb7b768c02fbbabb43166befa93b71a85d5fcf55b,
> but the sig is locked to
> cf5c3f72489f447b1199aafe4b4148988ff91cecd970422352f2238afb127683 in
> SIGGEN_LOCKEDSIGS_t-corei7-64
>
> The grub-efi-native:do_clean_grub sig is computed to be
> 4e16b100c32e9428126eb10864508038527cec795c5e4391208d96a55735c90a,
> but the sig is locked to
> a2bd26be0297624af53d6f8cf657d79740fb229db821c446d564c5ee9dc80ea3 in
> SIGGEN_LOCKEDSIGS_t-x86-64
>
> The grub-efi-native:do_compile sig is computed to be
> 630cc346f7ececf98c54f9134e8fee546e85c92f1e3c6ac3c258a1cdf24d4565,
> but the sig is locked to
> 802bba0874ce26169a9e16dcdb440795e8fa904977b036d637d6c4086ce72de8 in
> SIGGEN_LOCKEDSIGS_t-x86-64
>
> The grub-efi:do_clean_grub sig is computed to be
> faf0ae3c9159ef3ebb13d2521ecf51dfeeac0c2c47691cd0aaa80de91187af3c,
> but the sig is locked to
> 0075bbd34297bfbc62685ff5477feec11d0dd2bcda6787a151cfb7927a7f39c2 in
> SIGGEN_LOCKEDSIGS_t-corei7-64
>
> The grub-efi:do_compile sig is computed to be
> 30c09f3e8db4059b7e1ff23823f208be94d0e622904fc43eda497027be095a71,
> but the sig is locked to
> a9e8ddd9ecac11e67c66d9fccbabe23b6eb4a19c5996baef8ff960dfcdc898ed in
> SIGGEN_LOCKEDSIGS_t-corei7-64
>
> ERROR: Task quilt-native.do_fetch attempted to execute
> unexpectedly
>
> Task
> /disk0/scratch/smonsees/testSDK/layers/poky/meta/recipes-support/db/db_5.3.28.bb:do_populate_sysroot,
> unihash
> dcfb179ae99ac73583d33eec1357ff5d06fb58f160e5d7285061b6e1c9c3a9c0,
> taskhash
> dcfb179ae99ac73583d33eec1357ff5d06fb58f160e5d7285061b6e1c9c3a9c0
>
> Task
> /disk0/scratch/smonsees/testSDK/layers/poky/meta/recipes-extended/sed/sed_4.2.2.bb:do_package_write_ipk,
> unihash
> a37dc1cc0064749d1f6de69d0a9b8eab9ff6ef4089eff28a76e1851f8f8f8fe3,
> taskhash
> a37dc1cc0064749d1f6de69d0a9b8eab9ff6ef4089eff28a76e1851f8f8f8fe3
>
> Task
> /disk0/scratch/smonsees/testSDK/layers/poky/meta/recipes-support/libatomic-ops/libatomic-ops_7.6.10.bb:do_package_qa,
> unihash
> 2b17b70b3e1568840e3b39488b9e6470c89d5ffd502f02b2c129331d7609add8,
> taskhash
> 2b17b70b3e1568840e3b39488b9e6470c89d5ffd502f02b2c129331d7609add8
>
> Task
> /disk0/scratch/smonsees/testSDK/layers/poky/meta/recipes-connectivity/openssh/openssh_8.0p1.bb:do_package_qa,
> unihash
> 87a24567344a646de9ab6fba50b398e41711ff4d1bca749ebe02d84359c2a155,
> taskhash
> 87a24567344a646de9ab6fba50b398e41711ff4d1bca749ebe02d84359c2a155
>
> .
>
> .
>
>
>
https://www.mail-archive.com/search?l=yocto@...&q=subject:%22Re%5C%3A+%5C%5Byocto%5C%5D+eSDK+install+script+failure%22&o=newest&f=1
>
>
>
https://www.yoctoproject.org/pipermail/yocto/2017-August/037359.html
>
> https://bugzilla.yoctoproject.org/show_bug.cgi?id=12344
>
> *From:* yocto@...
> <mailto:yocto@...> <yocto@...
> <mailto:yocto@...>> *On Behalf Of *Monsees,
> Steven C (US) via lists.yoctoproject.org <http://lists.yoctoproject.org>
> *Sent:* Thursday, March 4, 2021 8:13 AM
> *To:* Monsees, Steven C (US) <steven.monsees@...
> <mailto:steven.monsees@...>>;
> yocto@... <mailto:yocto@...>
> *Subject:* Re: [yocto] #yocto #sdk
>
>
> Is there a list of certain classes that might interfere with the
> ability of the eSDK to lock down the configuration ?
>
> Thanks,
>
> Steve
>
> *From:* yocto@...
> <mailto:yocto@...> <yocto@...
> <mailto:yocto@...>> *On Behalf Of *Monsees,
> Steven C (US) via lists.yoctoproject.org <http://lists.yoctoproject.org>
> *Sent:* Tuesday, March 2, 2021 3:26 PM
> *To:* yocto@... <mailto:yocto@...>
> *Subject:* [yocto] #yocto #sdk
>
>
> I still appear to be having an issue with the Ext SDK install…
>
> Building for zeus/x86_64 Intel based platform…
>
> I build my kernel image clean, fully functional…
>
> Standard SDK builds clean and appears functional…
>
> Ext SDK builds clean, but on install I am still seeing Error
> below…
>
> (1) What is it comparing between unihash/taskhash ? More sig issues ?
>
> (2) What is meant by “This is usually due to missing setscene tasks” ?
>
> (3) In the local.conf under the SDK they set :
>
> SSTATE_MIRRORS += "file://universal/(.*) file://universal-4.9/\1 file://universal-4.9/(.*) file://universal-4.8/\1"
>
> Under sdk-extra.conf I set :
>
> SSTATE_MIRRORS += "file://.* file:///ede/tms/yocto/zeus/sstate_cache/PATH"
>
> My SSTATE_MIRROR is based off the clean builds I mentioned above,
> is this the correct procedure ?
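[Editor's note: for reference, a minimal sketch of the usual shape of an SSTATE_MIRRORS line in sdk-extra.conf, using the sstate path quoted in the message above. The surrounding double quotes and the literal PATH keyword are the common pitfalls; this is a sketch, not a verified fix for this setup.]

```
# sdk-extra.conf -- sketch only; the sstate path is the one quoted above.
# SSTATE_MIRRORS takes quoted "regex replacement" pairs, and PATH is a
# literal token that BitBake replaces with the sstate object's subpath.
SSTATE_MIRRORS += "file://.* file:///ede/tms/yocto/zeus/sstate_cache/PATH"
```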
>
> I am trying to figure out how best to debug this issue, it is
> occurring on the post install, and everything pretty much appears in
> place.
>
> Steve
>
> 14:43 smonsees@yix490038
> /disk0/scratch/smonsees/yocto/workspace_3/builds2/sbcb-default/tmp/deploy/sdk>./limws-glibc-x86_64-aiox_orange-corei7-64-toolchain-ext-3.0.4.sh
>
> LIMWS (BAE LIMWS base distro) Extensible SDK installer version
> 3.0.4
>
>
> ====================================================================
>
> Enter target directory for SDK (default: ~/limws_sdk):
> /disk0/scratch/smonsees/testSDK
>
> You are about to install the SDK to
> "/disk0/scratch/smonsees/testSDK". Proceed [Y/n]? Y
>
> Extracting SDK..............................................................................done
>
> Setting it up...
>
> Extracting buildtools...
>
> Preparing build system...
>
> Parsing recipes: 100%
> |###########################################################################################|
> Time: 0:01:32
>
> Initialising tasks: 100%
> |########################################################################################|
> Time: 0:00:04
>
> Checking sstate mirror object availability: 100%
> |################################################################|
> Time: 0:00:03
>
> ERROR: Task quilt-native.do_fetch attempted to execute
> unexpectedly
>
> Task
> /disk0/scratch/smonsees/testSDK/layers/poky/meta/recipes-support/liburcu/liburcu_0.11.1.bb:do_populate_sysroot,
> unihash
> cdb08644b85fa162bd9f88cb00113fe3193cc347e39e33e8f405f9c23f60c601,
> taskhash
> cdb08644b85fa162bd9f88cb00113fe3193cc347e39e33e8f405f9c23f60c601
>
> Task
> /disk0/scratch/smonsees/testSDK/layers/poky/meta/recipes-devtools/python/python3_3.7.8.bb:do_packagedata,
> unihash
> 925a72cbe872aad09bd3fbbe74ed1c944e9c19a732e120feae5c4784e6330d4f,
> taskhash
> 925a72cbe872aad09bd3fbbe74ed1c944e9c19a732e120feae5c4784e6330d4f
>
> .
>
> .
>
> .
>
> This is usually due to missing setscene tasks. Those missing in this
> build were:
>
> <<appears to list every recipe under ./testSDK/layers directory
> here>>
>
>
>
>
>
>
Hi,
I am building a WiFi-enabled Linux image on Zeus using linux-firmware-sd8801 and connman. I need to add WiFi P2P support; is it sufficient to create a wpa-supplicant_2.9.bbappend that enables CONFIG_P2P=y (it is disabled in the wpa-supplicant_2.9 defconfig)?
Do I need to make other changes on OE configurations?
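[Editor's note: a hedged sketch of one way such a bbappend could look, using Zeus-era (underscore) override syntax. The assumption that the recipe's do_configure leaves a wpa_supplicant/.config under ${B} is untested here; check the actual wpa-supplicant recipe in your layer before relying on it.]

```
# wpa-supplicant_%.bbappend -- illustrative sketch only.
# Assumes do_configure has already copied the recipe's defconfig to
# wpa_supplicant/.config; appending after it enables P2P on top.
do_configure_append() {
    echo "CONFIG_P2P=y" >> ${B}/wpa_supplicant/.config
}
```

Note that wpa_supplicant's P2P group-owner role also depends on CONFIG_AP=y, so that option may need enabling as well, and connman may need its own P2P support enabled at configure time to actually drive it.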
Thank you very much.
Kind regards,
jupier
M+ & H bugs with Milestone Movements WW14
All,
YP M+ or high bugs which moved to a new milestone in WW14 are listed below:

Priority | Bug ID | Short Description | Changer | Owner | Was | Became
Medium+ | 13008 | toaster testing | david.reyna@... | david.reyna@... | 3.4 | 3.4 M3
 | 13109 | Implement CPE to package to Release mapping | david.reyna@... | david.reyna@... | 3.4 | 3.4 M3
 | 13288 | pseudo should not follow symlinks in /proc | randy.macleod@... | sakib.sajal@... | 3.4 | 3.3 M1
 | 13669 | Move Toaster testsuite-2 away from Testopia | david.reyna@... | david.reyna@... | 3.4 | 3.4 M3
 | 14077 | devtool doesn't handle server failing to startup gracefully | randy.macleod@... | stacygaikovaia@... | 3.3 M3 | 3.3 M4
 | 14127 | cve-check falsely indicates a vulnerabily to be patched | timothy.t.orling@... | chee.yang.lee@... | 3.3 M3 | 3.4 M1
 | 14207 | pseudo abort with wic images | timothy.t.orling@... | chee.yang.lee@... | 3.3 M3 | 3.4 M1
 | 14333 | submodule helper is not handled properly for devtool enabled workspace | randy.macleod@... | prabin.ca@... | 3.3 | 3.3 M4
Thanks,

Stephen K. Jolley, Yocto Project Program Manager
Cell: (208) 244-4460, Email: sjolley.yp.pm@...
Enhancements/Bugs closed WW14!
All,
The below were the owners of enhancements or bugs closed during the last week!

Who | Count
randy.macleod@... | 2
bruce.ashfield@... | 1
steve@... | 1
akuster808@... | 1
kergoth@... | 1
richard.purdie@... | 1
mhalstead@... | 1
nicolas.dechesne@... | 1
Grand Total | 9
Thanks,

Stephen K. Jolley, Yocto Project Program Manager
Cell: (208) 244-4460, Email: sjolley.yp.pm@...
Current high bug count owners for Yocto Project 3.3
All,
Below is the list of the top 38 bug owners as of the end of WW14 who have open medium or higher bugs and enhancements against YP 3.3. There are 18 possible work days left until the final release candidate for YP 3.3 needs to be released.

Who | Count
ross@... | 17
bluelightning@... | 14
richard.purdie@... | 9
mark.morton@... | 7
JPEWhacker@... | 6
akuster808@... | 4
raj.khem@... | 4
Qi.Chen@... | 3
timothy.t.orling@... | 3
idadelm@... | 3
mostthingsweb@... | 3
trevor.gamblin@... | 3
jeanmarie.lemetayer@... | 2
mhalstead@... | 2
jaewon@... | 2
limon.anibal@... | 2
randy.macleod@... | 2
chee.yang.lee@... | 2
matthewzmd@... | 2
sakib.sajal@... | 2
ydirson@... | 2
alejandro@... | 2
aehs29@... | 1
bruce.ashfield@... | 1
dorindabassey@... | 1
yoctoproject@... | 1
matt.ranostay@... | 1
kergoth@... | 1
pokylinux@... | 1
devendra.tewari@... | 1
john.kaldas.enpj@... | 1
mshah@... | 1
mister_rs@... | 1
open.source@... | 1
nicolas.dechesne@... | 1
twoerner@... | 1
hongxu.jia@... | 1
mark.hatle@... | 1
Grand Total | 112
Thanks,

Stephen K. Jolley, Yocto Project Program Manager
Cell: (208) 244-4460, Email: sjolley.yp.pm@...
Reminder: Yocto Project Technical Team Meeting @ Monthly from 8am on the first Tuesday (PDT)