Vivante graphics driver


Andy Pont
 

Hello,

What is the difference between the “Vivante GCCore” driver (CONFIG_DRM_VIVANTE) available in the 4.14.x kernel and the galcore.ko file that is being built and installed into the kernel modules directory on the target? Do they do the same job? Is it one or the other, and if so, which is best?

Secondly, in the tmp/work/…/imx-gpu-viv directory there is a file called imx-gpu-viv-6.2.4.p4.0-aarch32.bin and likewise in tmp/work/…/imx-gpu-g2d there is a imx-gpu-g2d-6.2.4.p4.0-arm.bin. I presume that these are some kind of firmware file but as far as I can see they aren’t making it into the file system on the target. Do I need to do something with them?

-Andy.


Otavio Salvador <otavio.salvador@...>
 

Hello Andy,

On Mon, Oct 28, 2019 at 7:16 AM Andy Pont <andy.pont@sdcsystems.com> wrote:
What is the difference between the “Vivante GCCore” driver
(CONFIG_DRM_VIVANTE) available in the 4.14.x kernel and the galcore.ko
file that is being built and installed into the kernel modules directory on the
target? Do they do the same job? Is it one or the other - in which
case which is best?
We use the kernel module as a way to ease the upgrade of GPU driver
releases. If using the built-in driver, the Linux kernel version needs
to match the GPU driver release, while with the external kernel module
this is not a requirement.

Another reason is that the external kernel module brings fixes we make
as a community.
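A quick way to see which of the two is actually in use on a running target is to check the module lists. This is an illustrative sketch: the /sys/module/vivante path is an assumption based on the usual in-tree module name, so verify it against your kernel.

```shell
# Check whether the external galcore.ko module or the built-in
# CONFIG_DRM_VIVANTE driver is present on the running system.
if grep -q '^galcore ' /proc/modules; then
    echo "external galcore.ko module loaded"
elif [ -d /sys/module/vivante ]; then
    # built-in modules with parameters also show up under /sys/module
    echo "built-in CONFIG_DRM_VIVANTE driver present"
else
    echo "no Vivante driver found"
fi
```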


Secondly, in the tmp/work/…/imx-gpu-viv directory there is a file called
imx-gpu-viv-6.2.4.p4.0-aarch32.bin and likewise in
tmp/work/…/imx-gpu-g2d there is a imx-gpu-g2d-6.2.4.p4.0-arm.bin. I
presume that these are some kind of firmware file but as far as I can
see they aren’t making it into the file system on the target. Do I need
to do something with them?
The version used depends on the target you are building for.

--
Otavio Salvador O.S. Systems
http://www.ossystems.com.br http://code.ossystems.com.br
Mobile: +55 (53) 9 9981-7854 Mobile: +1 (347) 903-9750


Andy Pont
 

Otavio wrote…

We use the kernel module as a way to ease the upgrade of GPU driver
releases. If using the built-in driver, the Linux kernel version needs
to match the GPU driver release, while with the external kernel module
this is not a requirement.

Another reason is that the external kernel module brings fixes we make
as a community.
OK, so I think I will disable the built-in version and use the external kernel module.
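For anyone following along, a minimal sketch of what that switch might look like in a Yocto build. The fragment file name and where the install variable lives are illustrative assumptions, not taken from the thread; the package names come from meta-freescale.

```conf
# Kernel config fragment (e.g. disable-drm-vivante.cfg, added to SRC_URI
# in a linux-boundary bbappend) - turns off the built-in driver:
# CONFIG_DRM_VIVANTE is not set

# local.conf (or image recipe): pull in the external galcore module and
# the matching Vivante userspace libraries from meta-freescale.
IMAGE_INSTALL_append = " kernel-module-imx-gpu-viv imx-gpu-viv"
```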

Secondly, in the tmp/work/…/imx-gpu-viv directory there is a file called
imx-gpu-viv-6.2.4.p4.0-aarch32.bin and likewise in
tmp/work/…/imx-gpu-g2d there is a imx-gpu-g2d-6.2.4.p4.0-arm.bin. I
presume that these are some kind of firmware file but as far as I can
see they aren’t making it into the file system on the target. Do I need
to do something with them?
The version used depends on the target you are building for.
I am building for a Boundary Devices Nitrogen 6 Lite board (i.MX6 Solo) using Yocto and the linux-boundary-4.14 kernel recipe.

We seem to be having really poor performance with canvas events in our HTML5/CSS front end and some of the benchmarks[1] we have run only seem to make 10fps.

-Andy

1 - https://themaninblue.com/experiment/AnimationBenchmark/canvas/


Otavio Salvador <otavio.salvador@...>
 

On Mon, Oct 28, 2019 at 10:48 AM Andy Pont <andy.pont@sdcsystems.com> wrote:
We use the kernel module as a way to ease the upgrade of GPU driver
releases. If using the built-in driver, the Linux kernel version needs
to match the GPU driver release, while with the external kernel module
this is not a requirement.

Another reason is that the external kernel module brings fixes we make
as a community.
OK, so I think I will disable the built-in version and use the external
kernel module.
Good.

Secondly, in the tmp/work/…/imx-gpu-viv directory there is a file called
imx-gpu-viv-6.2.4.p4.0-aarch32.bin and likewise in
tmp/work/…/imx-gpu-g2d there is a imx-gpu-g2d-6.2.4.p4.0-arm.bin. I
presume that these are some kind of firmware file but as far as I can
see they aren’t making it into the file system on the target. Do I need
to do something with them?
The version used depends on the target you are building for.
I am building for a Boundary Devices Nitrogen 6 Lite board (i.MX6 Solo)
using Yocto and the linux-boundary-4.14 kernel recipe.

We seem to be having really poor performance with canvas events in our
HTML5/CSS front end and some of the benchmarks[1] we have run only seem
to make 10fps.
Using what? WPE, Chromium, WebEngine?

--
Otavio Salvador O.S. Systems
http://www.ossystems.com.br http://code.ossystems.com.br
Mobile: +55 (53) 9 9981-7854 Mobile: +1 (347) 903-9750


Andy Pont
 

Otavio wrote..

>> Secondly, in the tmp/work/…/imx-gpu-viv directory there is a file called
>> imx-gpu-viv-6.2.4.p4.0-aarch32.bin and likewise in
>> tmp/work/…/imx-gpu-g2d there is a imx-gpu-g2d-6.2.4.p4.0-arm.bin. I
>> presume that these are some kind of firmware file but as far as I can
>> see they aren’t making it into the file system on the target. Do I need
>> to do something with them?
>
>The version used depends on the target you are building for.
I am building for a Boundary Devices Nitrogen 6 Lite board (i.MX6 Solo)
using Yocto and the linux-boundary-4.14 kernel recipe.
 
We seem to be having really poor performance with canvas events in our
HTML5/CSS front end and some of the benchmarks[1] we have run only seem
to make 10fps.
 
Using what? WPE, Chromium, WebEngine?
Cog (v0.4.0) and WPE WebKit (v2.26.1) from meta-webkit. It is building with the following defined:

PREFERRED_PROVIDER_virtual/wpebackend = "wpebackend-rdk"
PACKAGECONFIG_pn-wpebackend-rdk = "imx6"
PACKAGECONFIG_append_pn-cairo = " glesv2 egl"
PACKAGECONFIG_append_pn-wpewebkit = " 2dcanvas"

We aren’t necessarily tied to Cog and WPE WebKit, so long as we can sit directly on top of the framebuffer without introducing the extra overhead of Wayland or X11.

-Andy.


Andy Pont
 

Otavio wrote...

>We use the kernel module as a way to ease the upgrade of GPU driver
>releases. If using the built-in driver, the Linux kernel version needs
>to match the GPU driver release, while with the external kernel module
>this is not a requirement.
>
>Another reason is that the external kernel module brings fixes we make
>as a community.
OK, so I think I will disable the built-in version and use the external
kernel module.
Good.
So, switching from the built-in kernel driver for the GPU to the external one hasn’t done anything to improve the frame rate. Not that I really expected it to do so.

My guess at the present time is that the WPE WebKit implementation isn’t properly exploiting the acceleration from the GPU. I’m not sure at the moment how to prove it or how to fix it.

-Andy.


Andy Pont
 

Otavio wrote...

We seem to be having really poor performance with canvas events in our
HTML5/CSS front end and some of the benchmarks[1] we have run only seem
to make 10fps.
Using what? WPE, Chromium, WebEngine?
I have been looking at the contents of your meta-browser layer and also searching around on the internet and have found a number of references to Ozone and DRM/GBM but haven’t been able to find a recipe to build that configuration. Is it something that can be made to work for our project?

-Andy.


Andy Pont
 

I wrote...

>> Secondly, in the tmp/work/…/imx-gpu-viv directory there is a file called
>> imx-gpu-viv-6.2.4.p4.0-aarch32.bin and likewise in
>> tmp/work/…/imx-gpu-g2d there is a imx-gpu-g2d-6.2.4.p4.0-arm.bin. I
>> presume that these are some kind of firmware file but as far as I can
>> see they aren’t making it into the file system on the target. Do I need
>> to do something with them?
>
>The version used depends on the target you are building for.
I am building for a Boundary Devices Nitrogen 6 Lite board (i.MX6 Solo)
using Yocto and the linux-boundary-4.14 kernel recipe.
 
We seem to be having really poor performance with canvas events in our
HTML5/CSS front end and some of the benchmarks[1] we have run only seem
to make 10fps.
 
Using what? WPE, Chromium, WebEngine?
Cog (v0.4.0) and WPE WebKit (v2.26.1) from meta-webkit. It is building with the following defined:

...

We aren’t necessarily tied to Cog and WPE WebKit, so long as we can sit directly on top of the framebuffer without introducing the extra overhead of Wayland or X11.
I am trying to run some comparative tests by building the chromium-ozone-wayland recipe from meta-browser, but the build is failing with the following error:

meson.build:427:4: ERROR: Problem encountered: building dri drivers require at least one windowing system or classic osmesa

What do I need to add to my local.conf in order to get this to build?

-Andy.


Andrey Zhizhikin
 

On Mon, Nov 4, 2019 at 4:05 PM Andy Pont <andy.pont@sdcsystems.com> wrote:

I wrote...

Secondly, in the tmp/work/…/imx-gpu-viv directory there is a file called
imx-gpu-viv-6.2.4.p4.0-aarch32.bin and likewise in
tmp/work/…/imx-gpu-g2d there is a imx-gpu-g2d-6.2.4.p4.0-arm.bin. I
presume that these are some kind of firmware file but as far as I can
see they aren’t making it into the file system on the target. Do I need
to do something with them?
The version used depends on the target you are building for.
I am building for a Boundary Devices Nitrogen 6 Lite board (i.MX6 Solo)
using Yocto and the linux-boundary-4.14 kernel recipe.

We seem to be having really poor performance with canvas events in our
HTML5/CSS front end and some of the benchmarks[1] we have run only seem
to make 10fps.


Using what? WPE, Chromium, WebEngine?

Cog (v0.4.0) and WPE WebKit (v2.26.1) from meta-webkit. It is building with the following defined:

...

We aren’t necessarily tied to Cog and WPE WebKit, so long as we can sit directly on top of the framebuffer without introducing the extra overhead of Wayland or X11.

I am trying to run some comparative tests by building the chromium-ozone-wayland recipe from meta-browser, but the build is failing with the following error:

meson.build:427:4: ERROR: Problem encountered: building dri drivers require at least one windowing system or classic osmesa
This was issue #115
(https://github.com/Freescale/meta-freescale/issues/115), which has
been resolved by PR #151
(https://github.com/Freescale/meta-freescale/pull/151) on the master
branch. Are you trying to build from master, or using the warrior
branch?

In the case of warrior, I guess that PR needs to be back-ported from
master to make the meson build work.


What do I need to add to my local.conf in order to get this to build?
There is actually nothing you can add here; the mesa build should be
patched instead.


-Andy.

--
_______________________________________________
meta-freescale mailing list
meta-freescale@yoctoproject.org
https://lists.yoctoproject.org/listinfo/meta-freescale
--
Regards,
Andrey.


Andy Pont
 

Andrey wrote...

I am trying to run some comparative tests by building the chromium-ozone-wayland recipe from meta-browser, but the build is failing with the following error:

meson.build:427:4: ERROR: Problem encountered: building dri drivers require at least one windowing system or classic osmesa
This was issue #115
(https://github.com/Freescale/meta-freescale/issues/115), which has
been resolved by PR #151
(https://github.com/Freescale/meta-freescale/pull/151) on the master
branch. Are you trying to build from master, or using the warrior
branch?
I am building from master and that pull request appears to be included, but it looks to be specific to the i.MX8 whereas we are building for the i.MX6. I’ll try adapting the recipe accordingly and see how it goes.
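For reference, a hypothetical sketch of what such an adaptation could look like, assuming the fix amounts to enabling a suitable mesa PACKAGECONFIG for i.MX6 machines. The override and option choice below are assumptions; check what PR #151 actually changes before copying anything.

```conf
# Hypothetical mesa_%.bbappend in a project layer.
# "mx6" is the meta-freescale machine override for i.MX6 SoCs; "gbm" and
# "osmesa" are standard mesa PACKAGECONFIG options in oe-core - pick
# whichever satisfies the "at least one windowing system or classic
# osmesa" requirement for your configuration.
PACKAGECONFIG_append_mx6 = " gbm"
```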

-Andy.


Andy Pont
 

I wrote...
I am trying to run some comparative tests by building the chromium-ozone-wayland recipe from meta-browser, but the build is failing with the following error:
 
meson.build:427:4: ERROR: Problem encountered: building dri drivers require at least one windowing system or classic osmesa
 
This was issue #115
(https://github.com/Freescale/meta-freescale/issues/115), which has
been resolved by PR #151
(https://github.com/Freescale/meta-freescale/pull/151) on the master
branch. Are you trying to build from master, or using the warrior
branch?
I am building from master and that pull request appears to be included, but it looks to be specific to the i.MX8 whereas we are building for the i.MX6. I’ll try adapting the recipe accordingly and see how it goes.
OK, so I now have mesa building without any issues. It now stops in the do_configure phase of chromium-ozone-wayland with the following output:

| Command: python /home/.../tmp/work/cortexa9t2hf-neon-mx6qdl-fslc-linux-gnueabi/chromium-ozone-wayland/77.0.3865.120-r0/chromium-77.0.3865.120/build/config/linux/pkg-config.py --dridriverdir dri
| Returned 1 and printed out:
| Error from pkg-config.
| stderr:
| Package dri was not found in the pkg-config search path.
| Perhaps you should add the directory containing `dri.pc'
| to the PKG_CONFIG_PATH environment variable
| No package 'dri' found
| See //content/gpu/BUILD.gn:129:18: which caused the file to be included.
| configs += [ "//build/config/linux/dri" ]
| ^------------------------------

This feels like it is the last issue to resolve!
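As background on what Chromium’s pkg-config.py is doing here: dri.pc is installed by mesa when its dri support is enabled, and pkg-config only finds it if the directory containing it is on the search path. A minimal illustration with a stand-in dri.pc (the file contents below are a simplified mock, not mesa’s real pkg-config file):

```shell
# Create a stand-in dri.pc to show how pkg-config resolves the
# dridriverdir variable that the chromium build asks for.
mkdir -p /tmp/pc-demo
cat > /tmp/pc-demo/dri.pc <<'EOF'
prefix=/usr
dridriverdir=${prefix}/lib/dri

Name: dri
Description: Direct Rendering Infrastructure (mock)
Version: 19.0.0
EOF

# With the directory on PKG_CONFIG_PATH the lookup succeeds:
PKG_CONFIG_PATH=/tmp/pc-demo pkg-config --variable=dridriverdir dri
# -> prints /usr/lib/dri
```

In the failing build, no recipe has installed a dri.pc into the sysroot, so the equivalent lookup returns "Package dri was not found" — which is why the fix lies in the mesa build rather than in local.conf.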

-Andy.