Linux Hardware accelerated decoders via YAMI (Yet Another Media Infrastructure) Libyami
#1
Intel and WebM developers (in partnership with Collabora and Alibaba) have released a new open source video acceleration API called YAMI (Yet Another Media Infrastructure) for Linux.

Libyami acts as a core codec library for Intel CPUs with integrated graphics processors, providing hardware acceleration / GPU offload as well as video post-processing, and it is positioned as a higher-level alternative to using VAAPI (VA-API) directly.

YAMI is currently optimized for Intel's newer integrated HD Graphics GPUs, such as those in Bay Trail, Sandy Bridge, Ivy Bridge, Haswell, Broadwell, Skylake, Braswell, Apollo Lake, Kaby Lake, and Gemini Lake processors.

https://github.com/01org/libyami/wiki

https://github.com/01org/libyami/wiki/Build

https://github.com/01org/libyami

https://github.com/01org/libyami-utils

So far libyami supports hardware-accelerated MPEG-2, H.264, HEVC (H.265), VP8, VP9, and JPEG (ad-hoc) decoding and encoding on these platforms, including up to 10-bit HEVC and 10-bit VP9 where the hardware supports it.

VC-1 (VC1) / WMV9 (WMV3) support is also being worked on, so it should only be a matter of time before it is supported as well.


Libyami as a solution is more like libstagefright than VAAPI. It supports dma_buf and other frame modes, which makes texture-based video rendering easy. libyami does not require the relatively painful boilerplate that hwaccel does (creating the hardware decoder yourself, maintaining a surface pool, and so on), which makes the library attractive for anyone who wants something simple and who may not want to depend on FFmpeg or libva at all. It also uses some newer APIs (not yet supported by all drivers) that work around VAAPI limitations such as the shared-context requirement, namely the dma_buf support.
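
To give a feel for how little plumbing is involved compared to raw hwaccel/VAAPI code, here is a minimal decode sketch. It follows the decode API documented in the libyami wiki (VideoDecoderHost.h, createVideoDecoder, IVideoDecoder), but treat the exact header paths, the NativeDisplay setup, and the getOutput() signature as assumptions that may differ between libyami releases.

```cpp
// Minimal libyami decode sketch (API per the libyami wiki; exact
// signatures may differ between libyami releases).
#include <cstring>
#include <VideoDecoderHost.h>   // createVideoDecoder / releaseVideoDecoder

using namespace YamiMediaCodec;

bool decodeAccessUnit(const uint8_t* data, size_t size, int64_t pts)
{
    // One call creates the hardware decoder -- no manual VA context or
    // surface-pool management.
    IVideoDecoder* decoder = createVideoDecoder(YAMI_MIME_H264);
    if (!decoder)
        return false;

    // Let libyami open the DRM render node itself.
    NativeDisplay display;
    display.type = NATIVE_DISPLAY_DRM;
    display.handle = 0;
    decoder->setNativeDisplay(&display);

    VideoConfigBuffer config;
    std::memset(&config, 0, sizeof(config));
    decoder->start(&config);

    // Feed one compressed access unit (a real player loops over the stream).
    VideoDecodeBuffer input;
    std::memset(&input, 0, sizeof(input));
    input.data = const_cast<uint8_t*>(data);
    input.size = size;
    input.timeStamp = pts;
    decoder->decode(&input);

    // Drain decoded surfaces; with dma_buf export these can be bound
    // directly as GPU textures instead of being copied to system memory.
    while (SharedPtr<VideoFrame> frame = decoder->getOutput()) {
        // consume frame->timeStamp, frame->surface, ...
    }

    decoder->stop();
    releaseVideoDecoder(decoder);
    return true;
}
```

A real player would keep the decoder alive for the whole stream and feed it one access unit at a time; the point is that there is no explicit VAAPI context or surface-pool code anywhere in the application.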

The future plans for libyami are pretty nice:
https://www.mail-archive.com/[email protected]

Mailing list:
https://lists.01.org/mailman/listinfo/libyami

Here is an example H.264 player based on FFmpeg, with yami plugged into FFmpeg for decoding/encoding (and transcoding) and used together with FFmpeg for texture-based video rendering:

https://github.com/01org/player-ffmpeg-yami
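
In rough terms, the bridge that such a player builds between FFmpeg's demuxer and the yami decoder looks like the sketch below. It assumes an Annex-B H.264 elementary stream (MP4/MKV input would additionally need FFmpeg's h264_mp4toannexb bitstream filter) and a yami decoder already created and started as in the earlier sketch.

```cpp
// Sketch: demux with FFmpeg, decode with libyami. The IVideoDecoder is
// assumed to be set up as in the earlier decode sketch.
extern "C" {
#include <libavformat/avformat.h>
}
#include <cstring>
#include <VideoDecoderHost.h>

using namespace YamiMediaCodec;

int demuxAndDecode(const char* path, IVideoDecoder* decoder)
{
    AVFormatContext* fmt = nullptr;
    if (avformat_open_input(&fmt, path, nullptr, nullptr) < 0)
        return -1;
    avformat_find_stream_info(fmt, nullptr);

    int video = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, nullptr, 0);
    if (video < 0) {
        avformat_close_input(&fmt);
        return -1;
    }

    AVPacket* pkt = av_packet_alloc();
    while (av_read_frame(fmt, pkt) >= 0) {
        if (pkt->stream_index == video) {
            // Hand the compressed packet straight to the yami decoder;
            // decoded surfaces are then pulled with decoder->getOutput()
            // and can be rendered as GL textures via dma_buf.
            VideoDecodeBuffer input;
            std::memset(&input, 0, sizeof(input));
            input.data = pkt->data;
            input.size = pkt->size;
            input.timeStamp = pkt->pts;
            decoder->decode(&input);
        }
        av_packet_unref(pkt);
    }

    av_packet_free(&pkt);
    avformat_close_input(&fmt);
    return 0;
}
```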

Patches submitted to FFmpeg

https://ffmpeg.org/pipermail/ffmpeg-deve...67382.html
https://ffmpeg.org/pipermail/ffmpeg-deve...67383.html
https://ffmpeg.org/pipermail/ffmpeg-deve...67384.html

This thread is really just a feature suggestion: please consider adding support for hardware-accelerated video decoding and post-processing through libyami on Linux and Linux-like platforms.
#2
FFmpeg can now use libyami via VAAPI (VA-API) for hardware offload of video transcoding on the Intel GPU:

https://github.com/01org/ffmpeg_libyami

Libyami now also supports:
* MPEG-2, VC-1, WMV9, H.264, HEVC (H.265), VP8, VP9, and JPEG ad-hoc decoder for video decoding, including HEVC 10-bit and VP9 10-bit
* MPEG-2, H.264, HEVC (H.265), VP8, VP9, AVC low-power CQP mode, and JPEG ad-hoc encoder for video encoding (a minimal encoder sketch follows this list)
* Sharpening, Denoise (noise reduction), Deinterlace (Bob, Motion Adaptive, and Motion Compensated), Color Balance, STD, CSC (color space conversion), and scaling for video post-processing
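
To give a feel for the encoder side of the list above, here is a minimal configuration sketch. It mirrors the pattern used by the libyami-utils sample encoder, but the header name, the VideoParamsCommon fields, and the getParameters()/setParameters() signatures are assumptions that may differ between libyami releases.

```cpp
// Minimal libyami H.264 encoder setup sketch (field names follow the
// libyami-utils sample encoder and may differ between releases).
#include <cstring>
#include <VideoEncoderHost.h>   // createVideoEncoder / releaseVideoEncoder

using namespace YamiMediaCodec;

IVideoEncoder* makeH264Encoder(int width, int height, int fps)
{
    IVideoEncoder* encoder = createVideoEncoder(YAMI_MIME_H264);
    if (!encoder)
        return nullptr;

    // Let libyami open the DRM render node itself.
    NativeDisplay display;
    display.type = NATIVE_DISPLAY_DRM;
    display.handle = 0;
    encoder->setNativeDisplay(&display);

    // Read the defaults, tweak resolution and frame rate, write them back.
    VideoParamsCommon params;
    std::memset(&params, 0, sizeof(params));
    params.size = sizeof(VideoParamsCommon);
    encoder->getParameters(VideoParamsTypeCommon, &params);
    params.resolution.width  = width;
    params.resolution.height = height;
    params.frameRate.frameRateNum   = fps;
    params.frameRate.frameRateDenom = 1;
    encoder->setParameters(VideoParamsTypeCommon, &params);

    if (encoder->start() != ENCODE_SUCCESS) {
        releaseVideoEncoder(encoder);
        return nullptr;
    }
    return encoder;
}

// Raw frames are then submitted with encoder->encode(...), and the
// compressed bitstream is drained into a caller-provided
// VideoEncOutputBuffer via encoder->getOutput().
```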

Hardware requirements
* Intel Sandybridge, Ivybridge, Haswell, Broadwell, Skylake, or Kaby Lake (HD Graphics)
* Intel Bay Trail, Braswell, Apollo Lake, or Gemini Lake
#3
01.org (Intel Open Source Technology Center) developers have now released an official FFmpeg integration of libyami via VAAPI:

https://github.com/01org/ffmpeg_libyami

With this, Kodi could enable FFmpeg to use libyami/VAAPI for hardware offload of video transcoding on the Intel GPU.