SWAT Rotation

Alexandre Belloni
 

Hello Leonardo,

You are the next one on the list
(https://wiki.yoctoproject.org/wiki/Yocto_Build_Failure_Swat_Team#Members)
and SWAT duty will rotate from Jaga and Ross to you at EOD 2021-02-19.

Please reply to let me know whether you will be able to work on this
task.

I'll be available to walk you through the process on Monday, don't
hesitate to contact me by email or on IRC.

Thanks!

--
Alexandre Belloni, Bootlin
Embedded Linux and Kernel engineering
https://bootlin.com


SWAT statistics for week 06

Alexandre Belloni
 

Hello,

Here are the statistics for last week. Minjae Kim was on SWAT duty.

268 failures were reported:
* 118 were triaged by Minjae
- 25 for issues with the events patch, replied on list by Richard
- 20 were cancelled builds with no other errors
- 17 for an issue in shaderc/glslang patches
- 16 meta-arm warnings for u-boot that are now fixed
- 5 meta-oe issues
- 2 for configuration issues that got fixed by Michael
- 2 for an EXTERNALSRC patch, replied on list by Richard
- 2 for a previously known issue, the patch was dropped at triage time
- 7 recurrences of bug 13802
- 1 recurrence of bug 13810
- 1 recurrence of bug 13841
- 2 recurrences of bug 13992
- 3 recurrences of bug 14029
- 1 recurrence of bug 14158
- 1 recurrence of bug 14163
- 1 recurrence of bug 14164
- 1 recurrence of bug 14209
- 7 recurrences of bug 14223
- 1 occurrence of new bug 14227
- 2 occurrences of new bug 14228
- 1 occurrence of new bug 14230

* 150 were triaged by Richard
- 44 for a broken patch he refreshed
- 43 were due to the events patch
- 34 were due to a logging patch that got fixed
- 26 were because of storage space issues on the controller
- 2 for a pigz+uninative issue on a worker
- 1 new occurrence of 14183

Many of the recurring bugs were related to various ptests failing.

Regards,

--
Alexandre Belloni, Bootlin
Embedded Linux and Kernel engineering
https://bootlin.com


[PATCH] changelog: link back to build collection

Alexandre Belloni
 

While looking at the changelog, it is useful to be able to go back to the
build collection editing, for example to be able to correct a note.

Signed-off-by: Alexandre Belloni <alexandre.belloni@bootlin.com>
---
swatapp/templates/swatapp/changelog.html | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/swatapp/templates/swatapp/changelog.html b/swatapp/templates/swatapp/changelog.html
index 6f15196c16dd..f28abaefe187 100644
--- a/swatapp/templates/swatapp/changelog.html
+++ b/swatapp/templates/swatapp/changelog.html
@@ -5,7 +5,7 @@
{% if changes %}
<ul>
{% for change in changes %}
- <li>{{ change.user.username }} changed {{ change.failure.build.targetname }}:{{ change.failure.stepname }} {{ change.failure.build.buildid }}:{{ change.failure.id }} from {{ change.get_oldstatus_display }} to {{ change.get_newstatus_display }} on {{ change.timestamp }}: {{ change.newnote }}</li>
+ <li>{{ change.user.username }} changed {{ change.failure.build.targetname }}:{{ change.failure.stepname }} <a href="/collection/{{ change.failure.build.buildcollection.id }}/">{{ change.failure.build.buildid }}:{{ change.failure.id }}</a> from {{ change.get_oldstatus_display }} to {{ change.get_newstatus_display }} on {{ change.timestamp }}: {{ change.newnote }}</li>
{% endfor %}
</ul>

--
2.29.2


Re: [EXTERNAL] [swat] SWAT Rotation

Ross Burton
 

Hi Jaga,

On Mon, 15 Feb 2021 at 17:34, Duraisamy, Jagadheesan
<Jagadheesan_Duraisamy@comcast.com> wrote:
I can take up this week, if you are not prepared for the short notice.
We can tag-team it - a single problem caused a *lot* of failures over
the weekend that I mostly cleaned up earlier, but there are still some
outstanding in swatbot.

If you see more builds failing in do_package with UID issues, that's
https://bugzilla.yoctoproject.org/show_bug.cgi?id=14234.

Ross


Re: [EXTERNAL] [swat] SWAT Rotation

Duraisamy, Jagadheesan
 

Hello Ross,

I can take up this week, if you are not prepared for the short notice.

Regards
Jaga

-----Original Message-----
From: swat@lists.yoctoproject.org <swat@lists.yoctoproject.org> On Behalf Of Alexandre Belloni
Sent: Monday, February 15, 2021 9:14 PM
To: Ross Burton <ross@burtonini.com>; Duraisamy, Jagadheesan <Jagadheesan_Duraisamy@comcast.com>
Cc: swat@lists.yoctoproject.org; Stephen Jolley <sjolley.yp.pm@gmail.com>
Subject: [EXTERNAL] [swat] SWAT Rotation

Hello Ross,

Since I didn't get any reply from Jagadheesan and you were the next one on the list (and we discussed that on IRC), you will be on SWAT duty this week.

I know this is short notice and I'll try to assist as much as possible if you need it.

Thanks!

--
Alexandre Belloni, Bootlin
Embedded Linux and Kernel engineering
https://bootlin.com


Re: [EXTERNAL] SWAT Rotation

Duraisamy, Jagadheesan
 

Hi Alexandre,

I missed seeing the email, sorry about that.

Regards
Jaga

-----Original Message-----
From: Alexandre Belloni <alexandre.belloni@bootlin.com>
Sent: Friday, February 12, 2021 4:46 AM
To: Duraisamy, Jagadheesan <Jagadheesan_Duraisamy@comcast.com>; 김민재 <nate.kim@lge.com>
Cc: swat@lists.yoctoproject.org; Stephen Jolley <sjolley.yp.pm@gmail.com>
Subject: [EXTERNAL] SWAT Rotation

Hello Jagadheesan,

You are the next one on the list
(https://wiki.yoctoproject.org/wiki/Yocto_Build_Failure_Swat_Team#Members) and SWAT duty will rotate from Minjae to you at EOD 2021-02-12.

Please reply to let me know whether you will be able to work on this task.

I'll be available to walk you through the process on Monday, don't hesitate to contact me by email or on IRC.

Thanks!

--
Alexandre Belloni, Bootlin
Embedded Linux and Kernel engineering
https://bootlin.com


Re: Autobuilder reproducibility target changes

Richard Purdie
 

On Sun, 2021-02-14 at 13:17 -0600, Joshua Watt wrote:
On Sun, Feb 14, 2021 at 6:19 AM Richard Purdie
<richard.purdie@linuxfoundation.org> wrote:

Regular users of the autobuilder will note that I've split the
reproducible builds test out of the main oe-selftest build and into its
own target build. This is because that test tends to run for a lot
longer time period and it helps to see the result separately.

I've only done this for master. If gatesgarth and dunfell want to
follow, that should be straight forward with a change to the branch in
autobuilder-helper. Obviously we should ensure this is working ok with
master first but so far so good.

It has already highlighted the difference between a successful run:

https://autobuilder.yoctoproject.org/typhoon/#/builders/115/builds/2
https://autobuilder.yoctoproject.org/typhoon/#/builders/119/builds/2
(took 3-4 hours)

and two failing runs:

https://autobuilder.yoctoproject.org/typhoon/#/builders/116/builds/2
https://autobuilder.yoctoproject.org/typhoon/#/builders/118/builds/2
(took 9 hours)
OK, I read through the code and unfortunately found a bug: when
attempting to make sure the "B" build doesn't use sstate, I misspelled
the SSTATE_MIRRORS, which means that the B build could have been
pulling from the sstate mirror when it was not supposed to. This has a
few implications:

 1) It might explain why some of the reproducible results seem intermittent
 2) It might explain why there is such a time disparity between the tests
The "good" news is that this didn't affect the autobuilder as it sets
SSTATE_DIR to a common directory and doesn't use SSTATE_MIRRORS.

Unfortunately, while it probably will help the intermittent results,
it probably means that the tests taking 9 hours is what is "supposed"
to happen, and they happen to be shorter sometimes because the B build
is pulling from sstate when it's not supposed to.
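
(A minimal sketch of the configuration mistake described above, assuming the
test appends its settings to the build's local.conf; the helper name here is
hypothetical and this is not the actual oe-selftest code.)

def write_no_sstate_config(builddir):
    """Append config meant to stop the "B" build from using shared state."""
    lines = [
        'SSTATE_MIRRORS = ""',   # correct: the variable BitBake actually reads
        # 'SSTATE_MIRROR = ""',  # the typo: not a variable BitBake knows, so it
        #                        # is ignored and any configured mirror still applies
    ]
    with open(builddir + "/conf/local.conf", "a") as f:
        f.write("\n".join(lines) + "\n")
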
I don't think we're to the bottom of this. If it's not spending the time
in diffoscope, something seems to cause builds with differences to take
much longer...

Cheers,

Richard


Re: Autobuilder reproducibility target changes

Joshua Watt <JPEWhacker@...>
 

On Mon, Feb 15, 2021 at 12:21 AM Alexander Kanavin
<alex.kanavin@gmail.com> wrote:

I’ve definitely seen diffoscope process take hours and hours and hours in local builds. Trying it with these vim packages locally should still be done.
I forgot to mention that I did run diffoscope locally with the
offending vim packages and it took about 30 seconds (same as the AB
logs showed).


Alex

On Sun 14. Feb 2021 at 20.18, Joshua Watt <jpewhacker@gmail.com> wrote:

On Sun, Feb 14, 2021 at 6:19 AM Richard Purdie
<richard.purdie@linuxfoundation.org> wrote:

Regular users of the autobuilder will note that I've split the
reproducible builds test out of the main oe-selftest build and into its
own target build. This is because that test tends to run for a lot
longer time period and it helps to see the result separately.

I've only done this for master. If gatesgarth and dunfell want to
follow, that should be straightforward with a change to the branch in
autobuilder-helper. Obviously we should ensure this is working ok with
master first but so far so good.

It has already highlighted the difference between a successful run:

https://autobuilder.yoctoproject.org/typhoon/#/builders/115/builds/2
https://autobuilder.yoctoproject.org/typhoon/#/builders/119/builds/2
(took 3-4 hours)

and two failing runs:

https://autobuilder.yoctoproject.org/typhoon/#/builders/116/builds/2
https://autobuilder.yoctoproject.org/typhoon/#/builders/118/builds/2
(took 9 hours)
OK, I read through the code and unfortunately found a bug: when
attempting to make sure the "B" build doesn't use sstate, I misspelled
the SSTATE_MIRRORS, which means that the B build could have been
pulling from the sstate mirror when it was not supposed to. This has a
few implications:

1) It might explain why some of the reproducible results seem intermittent
2) It might explain why there is such a time disparity between the tests

Unfortunately, while it probably will help the intermittent results,
it probably means that the tests taking 9 hours is what is "supposed"
to happen, and they happen to be shorter sometimes because the B build
is pulling from sstate when it's not supposed to.


the time difference being the system trying to run diffoscope on vim-
common :/.

I'm aware I removed some recipes from the exclusions list after seeing
multiple passing builds for all distros and we're now seeing test
failures. My mistake was not waiting for the date to change and for
builds to run on an autobuilder worker with a different umask.

Meson is failing with a pyc file mismatch which diffoscope can't decode
and despite trying for 5 hours, diffoscope hasn't given any data on why
vim-common differs. I should have fixes in for quilt, valgrind, kernel-
devsrc and cwautomacros. The umask fix may fix other issues too. Alex
has improved the reporting so we can spot cases where exclusion is no
longer needed.

Cheers,

Richard


SWAT Rotation

Alexandre Belloni
 

Hello Ross,

Since I didn't get any reply from Jagadheesan and you were the next one
on the list (and we discussed that on IRC), you will be on SWAT duty
this week.

I know this is short notice and I'll try to assist as much as possible if
you need it.

Thanks!

--
Alexandre Belloni, Bootlin
Embedded Linux and Kernel engineering
https://bootlin.com


Re: Autobuilder reproducibility target changes

Alexander Kanavin <alex.kanavin@...>
 

I’ve definitely seen diffoscope process take hours and hours and hours in local builds. Trying it with these vim packages locally should still be done.

Alex

On Sun 14. Feb 2021 at 20.18, Joshua Watt <jpewhacker@...> wrote:
On Sun, Feb 14, 2021 at 6:19 AM Richard Purdie
<richard.purdie@...> wrote:
>
> Regular users of the autobuilder will note that I've split the
> reproducible builds test out of the main oe-selftest build and into its
> own target build. This is because that test tends to run for a lot
> longer time period and it helps to see the result separately.
>
> I've only done this for master. If gatesgarth and dunfell want to
> follow, that should be straightforward with a change to the branch in
> autobuilder-helper. Obviously we should ensure this is working ok with
> master first but so far so good.
>
> It has already highlighted the difference between a successful run:
>
> https://autobuilder.yoctoproject.org/typhoon/#/builders/115/builds/2
> https://autobuilder.yoctoproject.org/typhoon/#/builders/119/builds/2
> (took 3-4 hours)
>
> and two failing runs:
>
> https://autobuilder.yoctoproject.org/typhoon/#/builders/116/builds/2
> https://autobuilder.yoctoproject.org/typhoon/#/builders/118/builds/2
> (took 9 hours)

OK, I read through the code and unfortunately found a bug: when
attempting to make sure the "B" build doesn't use sstate, I misspelled
the SSTATE_MIRRORS, which means that the B build could have been
pulling from the sstate mirror when it was not supposed to. This has a
few implications:

 1) It might explain why some of the reproducible results seem intermittent
 2) It might explain why there is such a time disparity between the tests

Unfortunately, while it probably will help the intermittent results,
it probably means that the tests taking 9 hours is what is "supposed"
to happen, and they happen to be shorter sometimes because the B build
is pulling from sstate when it's not supposed to.

>
> the time difference being the system trying to run diffoscope on vim-
> common :/.
>
> I'm aware I removed some recipes from the exclusions list after seeing
> multiple passing builds for all distros and we're now seeing test
> failures. My mistake was not waiting for the date to change and for
> builds to run on an autobuilder worker with a different umask.
>
> Meson is failing with a pyc file mismatch which diffoscope can't decode
> and despite trying for 5 hours, diffoscope hasn't given any data on why
> vim-common differs. I should have fixes in for quilt, valgrind, kernel-
> devsrc and cwautomacros. The umask fix may fix other issues too. Alex
> has improved the reporting so we can spot cases where exclusion is no
> longer needed.
>
> Cheers,
>
> Richard
>


do_package unknown user build failure

Richard Purdie
 

Hi All,

There are a number of failures on the autobuilder in do_package with
odd unknown user issues. My guess is that it's related to the new
buildtools tarball I configured in the helper. I'm going to guess we're
missing a glibc syscall with the new glibc.
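
(A minimal diagnostic sketch for the failure mode above: the "unknown user"
error means a name lookup is failing inside the build environment, so a quick
check is whether the users do_package expects actually resolve there. The
helper and user list here are hypothetical.)

import pwd

def check_users(names=("root", "nobody")):
    for name in names:
        try:
            entry = pwd.getpwnam(name)
            print("%s -> uid=%d gid=%d" % (name, entry.pw_uid, entry.pw_gid))
        except KeyError:
            print("unknown user: %s" % name)  # the do_package failure mode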

I'll look into it as a priority.

Cheers,

Richard


Re: Autobuilder reproducibility target changes

Joshua Watt <JPEWhacker@...>
 

On Sun, Feb 14, 2021 at 6:19 AM Richard Purdie
<richard.purdie@linuxfoundation.org> wrote:

Regular users of the autobuilder will note that I've split the
reproducible builds test out of the main oe-selftest build and into its
own target build. This is because that test tends to run for a lot
longer time period and it helps to see the result separately.

I've only done this for master. If gatesgarth and dunfell want to
follow, that should be straightforward with a change to the branch in
autobuilder-helper. Obviously we should ensure this is working ok with
master first but so far so good.

It has already highlighted the difference between a successful run:

https://autobuilder.yoctoproject.org/typhoon/#/builders/115/builds/2
https://autobuilder.yoctoproject.org/typhoon/#/builders/119/builds/2
(took 3-4 hours)

and two failing runs:

https://autobuilder.yoctoproject.org/typhoon/#/builders/116/builds/2
https://autobuilder.yoctoproject.org/typhoon/#/builders/118/builds/2
(took 9 hours)
OK, I read through the code and unfortunately found a bug: when
attempting to make sure the "B" build doesn't use sstate, I misspelled
the SSTATE_MIRRORS, which means that the B build could have been
pulling from the sstate mirror when it was not supposed to. This has a
few implications:

1) It might explain why some of the reproducible results seem intermittent
2) It might explain why there is such a time disparity between the tests

Unfortunately, while it probably will help the intermittent results,
it probably means that the tests taking 9 hours is what is "supposed"
to happen, and they happen to be shorter sometimes because the B build
is pulling from sstate when it's not supposed to.


the time difference being the system trying to run diffoscope on vim-
common :/.

I'm aware I removed some recipes from the exclusions list after seeing
multiple passing builds for all distros and we're now seeing test
failures. My mistake was not waiting for the date to change and for
builds to run on an autobuilder worker with a different umask.

Meson is failing with a pyc file mismatch which diffoscope can't decode
and despite trying for 5 hours, diffoscope hasn't given any data on why
vim-common differs. I should have fixes in for quilt, valgrind, kernel-
devsrc and cwautomacros. The umask fix may fix other issues too. Alex
has improved the reporting so we can spot cases where exclusion is no
longer needed.

Cheers,

Richard


Re: Autobuilder reproducibility target changes

Joshua Watt <JPEWhacker@...>
 

On Sun, Feb 14, 2021 at 6:19 AM Richard Purdie
<richard.purdie@linuxfoundation.org> wrote:

Regular users of the autobuilder will note that I've split the
reproducible builds test out of the main oe-selftest build and into its
own target build. This is because that test tends to run for a lot
longer time period and it helps to see the result separately.

I've only done this for master. If gatesgarth and dunfell want to
follow, that should be straightforward with a change to the branch in
autobuilder-helper. Obviously we should ensure this is working ok with
master first but so far so good.

It has already highlighted the difference between a successful run:

https://autobuilder.yoctoproject.org/typhoon/#/builders/115/builds/2
https://autobuilder.yoctoproject.org/typhoon/#/builders/119/builds/2
(took 3-4 hours)

and two failing runs:

https://autobuilder.yoctoproject.org/typhoon/#/builders/116/builds/2
https://autobuilder.yoctoproject.org/typhoon/#/builders/118/builds/2
(took 9 hours)

the time difference being the system trying to run diffoscope on vim-
common :/.
I'm not sure that diffoscope is the culprit here. If you look at the
logs, you can see that there is only about 30 seconds between the
"Running diffoscope" log message and the end of the test. I suspect
something else is going wrong here. I can try to write up a patch to try
and add more logging so we can more accurately pinpoint where it's
taking so long.
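
(A minimal sketch of the extra logging mentioned here, assuming the test drives
diffoscope via subprocess; the helper names and directory names are
hypothetical, not the actual test code.)

import logging
import subprocess
import time

logger = logging.getLogger("reproducible")

def timed(name, func, *args, **kwargs):
    """Run one phase of the comparison and log how long it took."""
    start = time.monotonic()
    result = func(*args, **kwargs)
    logger.info("%s took %.1fs", name, time.monotonic() - start)
    return result

def run_diffoscope(dir_a, dir_b, html_out):
    # diffoscope exits non-zero when the inputs differ, so no check=True here.
    return subprocess.run(["diffoscope", "--html-dir", html_out, dir_a, dir_b])

# e.g. timed("diffoscope", run_diffoscope, "reproducibleA", "reproducibleB", "diff-html")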



I'm aware I removed some recipes from the exclusions list after seeing
multiple passing builds for all distros and we're now seeing test
failures. My mistake was not waiting for the date to change and for
builds to run on an autobuilder worker with a different umask.

Meson is failing with a pyc file mismatch which diffoscope can't decode
and despite trying for 5 hours, diffoscope hasn't given any data on why
vim-common differs. I should have fixes in for quilt, valgrind, kernel-
devsrc and cwautomacros. The umask fix may fix other issues too. Alex
has improved the reporting so we can spot cases where exclusion is no
longer needed.

Cheers,

Richard


Re: Autobuilder reproducibility target changes

Alexander Kanavin <alex.kanavin@...>
 

On Sun, 14 Feb 2021 at 16:17, Richard Purdie <richard.purdie@...> wrote:
On Sun, 2021-02-14 at 16:04 +0100, Alexander Kanavin wrote:
> Cheers :) If there's something else I could help, tell.

I'm going to try and get diffs of the remaining big package differences
and see where we stand. The two big ones I know of are go as a language
for reproducibility and perf. Perf will just be a sheer pain to fix,
hopefully the kernel will take patches.

I did a bit of work on the go reproducibility; that has been preserved here:

I got there, but it took quite a bit of time (debugging the go build process is extremely painful) and I'm not at all happy with the hacky/brittle things in the patch, so it's on hold for now - but anyone is welcome to take it and make it better, especially if they're go specialists.

Alex


Re: Autobuilder reproducibility target changes

Richard Purdie
 

On Sun, 2021-02-14 at 16:04 +0100, Alexander Kanavin wrote:
Cheers :) If there's something else I could help, tell.
I'm going to try and get diffs of the remaining big package differences
and see where we stand. The two big ones I know of are go as a language
for reproducibility and perf. Perf will just be a sheer pain to fix,
hopefully the kernel will take patches.

One item is getting to the bottom of why it takes diffoscope beyond
the heat death of the universe to render its verdict on some items.
That would be really helpful to get to the bottom of. The vim-common
difference is actually really simple:

https://autobuilder.yocto.io/pub/repro-fail/oe-reproducible-20210213-0djxo1sn/packages/diff-html/

so I don't know why it took 5 hours to compute that. It suggests
something really silly/stupid is going on. diffoscope should be
amenable to fixes so it would be worth talking to them too...
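
(A minimal sketch for reproducing such a comparison locally with a hard time
limit, so a pathological diffoscope run fails fast instead of stalling for
hours; the paths are placeholders.)

import subprocess

def bounded_diffoscope(path_a, path_b, html_dir, limit_s=600):
    """Compare two package trees locally, giving up after limit_s seconds."""
    try:
        return subprocess.run(
            ["diffoscope", "--html-dir", html_dir, path_a, path_b],
            timeout=limit_s)
    except subprocess.TimeoutExpired:
        print("diffoscope exceeded %ds comparing %s and %s"
              % (limit_s, path_a, path_b))
        return None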

(I have a fix for the locale problem in vim brewing, it needs a new
buildtools-extended-tarball)

Cheers,

Richard


Re: Autobuilder reproducibility target changes

Alexander Kanavin <alex.kanavin@...>
 

Cheers :) If there's something else I could help, tell.

One item is getting to the bottom of why it takes diffoscope beyond the heat death of the universe to render its verdict on some items.

Alex


On Sun, 14 Feb 2021 at 13:19, Richard Purdie <richard.purdie@...> wrote:
Regular users of the autobuilder will note that I've split the
reproducible builds test out of the main oe-selftest build and into its
own target build. This is because that test tends to run for a lot
longer time period and it helps to see the result separately.

I've only done this for master. If gatesgarth and dunfell want to
follow, that should be straightforward with a change to the branch in
autobuilder-helper. Obviously we should ensure this is working ok with
master first but so far so good.

It has already highlighted the difference between a successful run:

https://autobuilder.yoctoproject.org/typhoon/#/builders/115/builds/2
https://autobuilder.yoctoproject.org/typhoon/#/builders/119/builds/2
(took 3-4 hours)

and two failing runs:

https://autobuilder.yoctoproject.org/typhoon/#/builders/116/builds/2
https://autobuilder.yoctoproject.org/typhoon/#/builders/118/builds/2
(took 9 hours)

the time difference being the system trying to run diffoscope on vim-
common :/.

I'm aware I removed some recipes from the exclusions list after seeing
multiple passing builds for all distros and we're now seeing test
failures. My mistake was not waiting for the date to change and for
builds to run on an autobuilder worker with a different umask.

Meson is failing with a pyc file mismatch which diffoscope can't decode
and despite trying for 5 hours, diffoscope hasn't given any data on why
vim-common differs. I should have fixes in for quilt, valgrind, kernel-
devsrc and cwautomacros. The umask fix may fix other issues too. Alex
has improved the reporting so we can spot cases where exclusion is no
longer needed.

Cheers,

Richard


Autobuilder reproducibility target changes

Richard Purdie
 

Regular users of the autobuilder will note that I've split the
reproducible builds test out of the main oe-selftest build and into its
own target build. This is because that test tends to run for a lot
longer time period and it helps to see the result separately.
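
(A minimal sketch of the underlying invocation that the new target isolates;
the autobuilder wiring itself lives in yocto-autobuilder-helper, and the
selftest module name "reproducible" is assumed here.)

import subprocess

def run_reproducible_selftest():
    # oe-selftest -r <module> runs a single selftest module from
    # meta/lib/oeqa/selftest/cases/ instead of the whole suite.
    return subprocess.run(["oe-selftest", "-r", "reproducible"], check=True)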

I've only done this for master. If gatesgarth and dunfell want to
follow, that should be straightforward with a change to the branch in
autobuilder-helper. Obviously we should ensure this is working ok with
master first but so far so good.

It has already highlighted the difference between a successful run:

https://autobuilder.yoctoproject.org/typhoon/#/builders/115/builds/2
https://autobuilder.yoctoproject.org/typhoon/#/builders/119/builds/2
(took 3-4 hours)

and two failing runs:

https://autobuilder.yoctoproject.org/typhoon/#/builders/116/builds/2
https://autobuilder.yoctoproject.org/typhoon/#/builders/118/builds/2
(took 9 hours)

the time difference being the system trying to run diffoscope on vim-
common :/.

I'm aware I removed some recipes from the exclusions list after seeing
multiple passing builds for all distros and we're now seeing test
failures. My mistake was not waiting for the date to change and for
builds to run on an autobuilder worker with a different umask.

Meson is failing with a pyc file mismatch which diffoscope can't decode
and despite trying for 5 hours, diffoscope hasn't given any data on why
vim-common differs. I should have fixes in for quilt, valgrind, kernel-
devsrc and cwautomacros. The umask fix may fix other issues too. Alex
has improved the reporting so we can spot cases where exclusion is no
longer needed.

Cheers,

Richard


SWAT Rotation

Alexandre Belloni
 

Hello Jagadheesan,

You are the next one on the list
(https://wiki.yoctoproject.org/wiki/Yocto_Build_Failure_Swat_Team#Members)
and SWAT duty will rotate from Minjae to you at EOD 2021-02-12.

Please reply to let me know whether you will be able to work on this
task.

I'll be available to walk you through the process on Monday, don't
hesitate to contact me by email or on IRC.

Thanks!

--
Alexandre Belloni, Bootlin
Embedded Linux and Kernel engineering
https://bootlin.com


SWAT statistics for week 05

Alexandre Belloni
 

Hello,

Here are the statistics for last week. Jon Mason was on SWAT duty.

413 failures were reported:
* 145 were triaged by Jon
- 6 were not to be triaged
- 7 were due to out of disk space issues
- 65 for a binutils issue Richard already sent an email for
- 21 were already fixed at the time they were triaged
- 11 for 3 issues for which emails were sent
- 8 were recurrences of bugs 13802, 14002, 14170, 14181, 14200
- 10 occurrences of new bug 14210
- 2 occurrences of new bug 14212
- 3 occurrences of new bug 14221
- 2 occurrences of new bug 14222
- 3 occurrences of new bug 14223
- 2 occurrences of new bug 14224
- 1 for new bug 14225
- 4 occurrences of new bug 14226

* 154 were triaged by Richard
- 148 because of a broken glibc upgrade patch
- 5 for a bitbake multiconfig change
- 1 was added to bug 14201

* 113 were triaged by me
- 106 were not to be triaged
- 2 were meta-oe build failures due to the autoconf update
- 2 were cancelled with no other errors
- 1 was added to bug 14029
- 1 new bug opened: 14213
- 1 was due to the qmp patch and being handled by Saul

Again, the raw number of failures is not representative of the work that
has been done. Swatbot now filters out the failures that are not to be
triaged and the cancellations without any other errors, so we won't have
those anymore.

Regards,

--
Alexandre Belloni, Bootlin
Embedded Linux and Kernel engineering
https://bootlin.com


Controller out of space

Richard Purdie
 

Hi,

The autobuilder controller (typhoon) ran out of disk space and I ended
up having to restart buildbot after clearing some space as a
temporary fix until Michael can look at it. That meant the running
builds were interrupted and didn't restart automatically.

Cheers,

Richard
