FFME: The Advanced WPF MediaElement (based on FFmpeg)
⭐ Please star this project if you like it and show your appreciation via PayPal.Me
The `Source` dependency property has been downgraded to a notification property. Please use the asynchronous `Open` and `Close` methods instead. Please note the current NuGet release might require a different version of the FFmpeg binaries than the ones matching the current state of the source code.
Open Visual Studio and create a new WPF Application.
The Target Framework must be set to .NET 5.0 or above. Install the NuGet package from your Package Manager Console:
PM> Install-Package FFME.Windows
Acquire the FFmpeg shared binaries (either 64- or 32-bit, depending on your app's target architecture) by either:

- Building your own (I recommend the Media Autobuild Suite, but please don't ask for help with it here), or
- Downloading a compatible build.

For an x64 build, copy the contents of the `bin` folder of both downloaded packages into a separate folder, e.g. `c:\ffmpeg\x64`. The resulting contents of that folder (e.g. `c:\ffmpeg\x64`) should be the FFmpeg DLL and EXE files, all in one flat directory.
Within your application's startup code (the `Main` method), set the `Unosquare.FFME.Library.FFmpegDirectory` variable to the path of the folder where the DLLs and EXEs are located, e.g.
Unosquare.FFME.Library.FFmpegDirectory = @"c:\ffmpeg";
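For example, in a WPF application this assignment can go in the `App` constructor so it runs before any window (and therefore any `MediaElement`) is created. This is a minimal sketch; the path is whatever folder you copied the binaries into:

```csharp
using System.Windows;

public partial class App : Application
{
    public App()
    {
        // Must run before the first FFME MediaElement is instantiated.
        // Point this at the folder containing the FFmpeg DLLs and EXEs.
        Unosquare.FFME.Library.FFmpegDirectory = @"c:\ffmpeg\x64";
    }
}
```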
Use the FFME `MediaElement` control as you would any other WPF control in your main window (e.g. `MainWindow.xaml`).
Add the namespace:
xmlns:ffme="clr-namespace:Unosquare.FFME;assembly=ffme.win"
Add the FFME control:
<ffme:MediaElement x:Name="Media" Background="Gray" LoadedBehavior="Play" UnloadedBehavior="Manual" />
Play files or streams, by calling the asynchronous method, Open:
await Media.Open(new Uri(@"c:\your-file-here"));
Close the media, by calling:
await Media.Close();
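Putting the pieces together, a minimal code-behind for the window above might look like the following. This is a sketch that assumes the `Media` control from the XAML snippet and omits error handling:

```csharp
using System;
using System.Windows;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        // LoadedBehavior="Play" starts playback as soon as the media opens.
        Loaded += async (s, e) =>
            await Media.Open(new Uri(@"c:\your-file-here"));

        // Release the media when the window closes.
        Closing += async (s, e) =>
            await Media.Close();
    }
}
```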
The `Unosquare.FFME.Windows.Sample` project provides usage examples for plenty of features. Use it as your main reference.

FFME is an advanced and close drop-in replacement for Microsoft's WPF MediaElement control. While the standard MediaElement uses DirectX (DirectShow) for media playback, FFME uses FFmpeg to read and decode audio and video. This means that for those of you who want to support things like HLS playback, or just don't want to go through the hassle of installing codecs on client machines, FFME might just be the answer.
FFME provides multiple improvements over the standard MediaElement such as:
... all in a single MediaElement control
FFME also supports opening capture devices. See the example URLs below and issue #48.
device://dshow/?audio=Microphone (Vengeance 2100):video=MS Webcam 4000
device://gdigrab?title=Command Prompt
device://gdigrab?desktop
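These device URIs are passed to the same `Open` call used for files and streams. For instance, to capture the Windows desktop (a sketch; `dshow` device names will differ on your machine):

```csharp
// gdigrab captures the screen; dshow addresses named capture devices.
await Media.Open(new Uri("device://gdigrab?desktop"));
```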
If you'd like audio to keep its pitch while changing the `SpeedRatio` property, you'll need the SoundTouch.dll library v2.1.1 available in the same directory as the FFmpeg binaries. You can get the SoundTouch library here.
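With SoundTouch.dll in place, changing playback speed is a single property assignment on the control (a sketch assuming the `Media` control from the XAML above):

```csharp
// 2.0 plays at double speed; with SoundTouch present, pitch is preserved.
Media.SpeedRatio = 2.0;
```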
First off, let's review a few concepts. A packet is a group of bytes read from the input. All packets are of a specific `MediaType` (Audio, Video, Subtitle, Data), and contain some timing information and, most importantly, compressed data. Packets are sent to a codec and, in turn, the codec produces frames. Please note that producing 1 frame does not always take exactly 1 packet: a packet may contain many frames, but a frame may also require several packets for the decoder to build it. Frames contain timing information and the raw, uncompressed data. Now, you may think you can take frames and show pixels on the screen or send samples to the sound card. We are close, but we still need some additional processing. It turns out different codecs produce different uncompressed data formats. For example, some video codecs output pixel data in ARGB, some others in RGB, and some others in YUV420. Therefore, we need to convert these frames into something all hardware can use natively. I call these converted frames `MediaBlock`s. These `MediaBlock`s contain uncompressed data in standard audio and video formats that all hardware is able to receive.
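The packet → frame → block flow described above can be sketched as a loop. This is purely illustrative pseudocode with hypothetical type and method names, not FFME's actual API:

```csharp
// Illustrative only: hypothetical types mirroring the concepts above.
while (container.HasMorePackets)
{
    Packet packet = container.ReadPacket();          // compressed bytes + timing
    foreach (Frame frame in codec.Decode(packet))    // 0..n frames per packet
    {
        // Normalize codec-specific formats (ARGB, RGB, YUV420, ...) into
        // a standard format the hardware can consume directly.
        MediaBlock block = converter.Convert(frame);
        renderer.Render(block);
    }
}
```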
The process described above is implemented in 3 different layers:

1. The `MediaContainer` wraps an input stream. This layer keeps track of a `MediaComponentSet`, which is nothing more than a collection of `MediaComponent` objects. Each `MediaComponent` holds packet caching, frame decoding, and block conversion logic. It provides the following important functionality:
   - `Open`: opens the input stream and detects the different stream components. This also determines the codecs to use.
   - `Read`: reads the next available packet and stores it in its corresponding component (audio, video, subtitle, data, etc.).
   - `Decode`: reads the following packet from the queue that each of the components holds, and returns a set of frames.
   - `Convert`: turns a given frame into a `MediaBlock`.
2. The `MediaEngine` wraps a `MediaContainer` and is responsible for executing commands to control the input stream (Play, Pause, Stop, Seek, etc.) while keeping 3 background workers:
   - The `PacketReadingWorker` is designed to continuously read packets from the `MediaContainer`. It reads packets when it needs them and pauses when it does not. This is determined by how much data is in the cache; it tries to keep approximately 1 second of media packets at all times.
   - The `FrameDecodingWorker` gets the packets the `PacketReadingWorker` writes and decodes them into frames. It then converts those frames into blocks and writes them to a `MediaBlockBuffer`. This block buffer can then be read by something else (the following worker described here) so its contents can be rendered.
   - The `BlockRenderingWorker` reads blocks from the `MediaBlockBuffer`s and sends those blocks to a platform-specific `IMediaRenderer`.
3. The `MediaElement` wraps a `MediaEngine` and contains platform-specific implementations of methods to perform things like audio rendering, video rendering, subtitle rendering, and property synchronization between the `MediaEngine` and itself.

A high-level diagram is provided as additional reference below.
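The reading worker's pause/resume decision described above amounts to a simple buffer-duration check. Conceptually (hypothetical names, not FFME's internals):

```csharp
// Keep roughly 1 second of packets buffered per component.
TimeSpan target = TimeSpan.FromSeconds(1);
if (component.BufferedDuration < target)
    component.Enqueue(container.ReadPacket()); // need more data: keep reading
else
    Thread.Sleep(10);                          // buffer is full: idle briefly
```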
Your help is welcome! See the Issues section.

Please note that I am unable to distribute FFmpeg's binaries because I don't know if I am allowed to do so. Follow the instructions below to compile, run and test FFME.
Extract the FFmpeg binaries to a folder (e.g. `c:\ffmpeg\`). Then open `App.xaml.cs` and, under the constructor, locate the line `Library.FFmpegDirectory = @"c:\ffmpeg";` and replace the path so that it points to the folder where you extracted your FFmpeg binaries (DLL files). To use the control in your own projects, add a reference to `ffme.win.dll`.
The source code for this project contains a very capable media player (`FFME.Windows.Sample`) covering most of the use cases for the FFME control. If you are just checking things out, here is a quick set of shortcut keys that `ffmeplay` accepts:
| Shortcut Key | Function Description |
| --- | --- |
G | Example of toggling subtitle color |
Left | Seek 1 frame to the left |
Right | Seek 1 frame to the right |
+ / Volume Up | Increase Audio Volume |
- / Volume Down | Decrease Audio Volume |
M / Volume Mute | Mute Audio |
Up | Increase playback Speed |
Down | Decrease playback speed |
A | Cycle Through Audio Streams |
S | Cycle Through Subtitle Streams |
Q | Cycle Through Video Streams |
C | Cycle Through Closed Caption Channels |
R | Reset Changes |
Y / H | Contrast: Increase / Decrease |
U / J | Brightness: Increase / Decrease |
I / K | Saturation: Increase / Decrease |
E | Example of cycling through audio filters |
T | Capture Screenshot to desktop/ffplay folder |
W | Start/Stop recording packets (no transcoding) into a transport stream to desktop/ffplay folder. |
Double-click | Enter fullscreen |
Escape | Exit fullscreen |
Mouse Wheel Up / Down | Zoom: In / Out |
In no particular order