Class Muxer

java.lang.Object
io.antmedia.muxer.Muxer
Direct Known Subclasses:
HLSMuxer, RecordMuxer, RtmpMuxer

public abstract class Muxer extends Object
PLEASE READ HERE BEFORE YOU IMPLEMENT A MUXER THAT INHERITS THIS CLASS. One muxer can be used by multiple encoders, so some functions (init, writeTrailer) may be called multiple times; protect these functions with guards and synchronized blocks. A Muxer MUST NOT change packet content in any way (data, stream index, pts, dts, duration, etc.) because packets are shared with other muxers; if the packet content changes, the other muxers cannot do their job correctly. Muxers generally run in a multi-threaded environment, so the writePacket functions can be called by different threads at the same time. Protect writePacket with the synchronized keyword.
Author:
mekya
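The contract above (guarded, once-only init/writeTrailer plus synchronized access) can be sketched in plain Java. This is a hypothetical illustration of the pattern, not the actual Ant Media implementation; the class and method names are made up for the example:

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical sketch of the once-only guard pattern subclasses are asked
// to follow: several encoders may call init()/writeTrailer(), but only the
// first call does the work, and access is synchronized.
class GuardedMuxerSketch {
    private final AtomicBoolean isInitialized = new AtomicBoolean(false);
    private final AtomicBoolean trailerWritten = new AtomicBoolean(false);
    private int initCount = 0;

    public synchronized void init() {
        if (!isInitialized.compareAndSet(false, true)) {
            return; // already initialized by another encoder
        }
        initCount++; // real code would open the output container here
    }

    public synchronized void writeTrailer() {
        if (!trailerWritten.compareAndSet(false, true)) {
            return; // trailer already written
        }
        // real code would flush and close the container here
    }

    public int getInitCount() {
        return initCount;
    }
}
```

The same guard-and-synchronize shape applies to any function that multiple encoders may invoke concurrently.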
  • Field Details

    • BITSTREAM_FILTER_HEVC_MP4TOANNEXB

      public static final String BITSTREAM_FILTER_HEVC_MP4TOANNEXB
      See Also:
    • BITSTREAM_FILTER_H264_MP4TOANNEXB

      public static final String BITSTREAM_FILTER_H264_MP4TOANNEXB
      See Also:
    • currentVoDTimeStamp

      private long currentVoDTimeStamp
    • extension

      protected String extension
    • format

      protected String format
    • isInitialized

      protected boolean isInitialized
    • options

      protected Map<String,String> options
    • logger

      protected org.slf4j.Logger logger
    • loggerStatic

      protected static org.slf4j.Logger loggerStatic
    • outputFormatContext

      protected org.bytedeco.ffmpeg.avformat.AVFormatContext outputFormatContext
    • DATE_TIME_PATTERN

      public static final String DATE_TIME_PATTERN
      See Also:
    • file

      protected File file
    • vertx

      protected io.vertx.core.Vertx vertx
    • scope

      protected IScope scope
    • addDateTimeToResourceName

      private boolean addDateTimeToResourceName
    • isRunning

      protected AtomicBoolean isRunning
    • videoExtradata

      protected byte[] videoExtradata
    • TEMP_EXTENSION

      public static final String TEMP_EXTENSION
      See Also:
    • time2log

      protected int time2log
    • audioPkt

      protected org.bytedeco.ffmpeg.avcodec.AVPacket audioPkt
    • registeredStreamIndexList

      protected List<Integer> registeredStreamIndexList
    • bsfVideoNames

      protected Set<String> bsfVideoNames
      Bitstream filter names that will be applied to video packets
    • bsfAudioNames

      private Set<String> bsfAudioNames
    • streamId

      protected String streamId
    • inputTimeBaseMap

      protected Map<Integer,org.bytedeco.ffmpeg.avutil.AVRational> inputTimeBaseMap
    • bsfFilterContextList

      protected List<org.bytedeco.ffmpeg.avcodec.AVBSFContext> bsfFilterContextList
    • bsfAudioFilterContextList

      protected Set<org.bytedeco.ffmpeg.avcodec.AVBSFContext> bsfAudioFilterContextList
    • videoWidth

      protected int videoWidth
    • videoHeight

      protected int videoHeight
    • headerWritten

      protected volatile boolean headerWritten
    • initialResourceNameWithoutExtension

      protected String initialResourceNameWithoutExtension
      This is the initial original resource name without any suffix such as _1, _2, or .mp4, .webm
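Recovering such an initial name can be sketched with two regex strips. This is a hypothetical helper for illustration only; the method name is not part of the Muxer API:

```java
// Hypothetical sketch: derive the initial resource name by stripping the
// extension (.mp4, .webm, ...) and a trailing numeric suffix (_1, _2, ...).
class ResourceNameSketch {
    public static String initialResourceName(String fileName) {
        String base = fileName.replaceAll("\\.[A-Za-z0-9]+$", ""); // drop extension
        return base.replaceAll("_\\d+$", "");                      // drop _1, _2, ...
    }
}
```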
    • tmpPacket

      protected org.bytedeco.ffmpeg.avcodec.AVPacket tmpPacket
    • firstAudioDts

      protected long firstAudioDts
    • firstVideoDts

      protected long firstVideoDts
    • videoPkt

      protected org.bytedeco.ffmpeg.avcodec.AVPacket videoPkt
    • rotation

      protected int rotation
    • SEGMENT_INDEX_LENGTH

      public static final int SEGMENT_INDEX_LENGTH
      Length of the segment index used in .ts and .m4s file names
      See Also:
    • inputOutputStreamIndexMap

      protected Map<Integer,Integer> inputOutputStreamIndexMap
    • resolution

      private int resolution
      height of the resolution
    • avRationalTimeBase

      public static final org.bytedeco.ffmpeg.avutil.AVRational avRationalTimeBase
    • subFolder

      protected String subFolder
    • firstKeyFrameReceived

      protected boolean firstKeyFrameReceived
      By default the first video key frame is not checked, so this is true. If the first video key frame should be checked, set this to false. It is used in RecordMuxer and HLSMuxer
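The key-frame gating this flag enables can be sketched as follows. This is a hypothetical, simplified illustration of the idea, not the RecordMuxer/HLSMuxer code:

```java
// Hypothetical sketch: drop video packets until the first key frame arrives,
// mirroring the role of the firstKeyFrameReceived flag described above.
class KeyFrameGate {
    private boolean firstKeyFrameReceived;

    KeyFrameGate(boolean checkFirstKeyFrame) {
        // if checking is disabled, behave as if the key frame already arrived
        this.firstKeyFrameReceived = !checkFirstKeyFrame;
    }

    // returns true if the packet should be dropped
    boolean checkToDropPacket(boolean isKeyFrame) {
        if (!firstKeyFrameReceived) {
            if (isKeyFrame) {
                firstKeyFrameReceived = true;
                return false; // key frame arrived, start writing
            }
            return true; // still waiting for the first key frame
        }
        return false;
    }
}
```

Starting a segment or recording on a key frame matters because decoders cannot reconstruct the picture from delta frames alone.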
    • lastPts

      private long lastPts
    • optionDictionary

      protected org.bytedeco.ffmpeg.avutil.AVDictionary optionDictionary
    • firstPacketDtsMs

      private long firstPacketDtsMs
    • audioNotWrittenCount

      private long audioNotWrittenCount
    • videoNotWrittenCount

      private long videoNotWrittenCount
    • totalSizeInBytes

      private long totalSizeInBytes
    • startTimeInSeconds

      private long startTimeInSeconds
    • currentTimeInSeconds

      private long currentTimeInSeconds
    • videoCodecId

      private int videoCodecId
    • appInstance

      private IAntMediaStreamHandler appInstance
  • Constructor Details

    • Muxer

      protected Muxer(io.vertx.core.Vertx vertx)
  • Method Details

    • getPreviewFile

      public static File getPreviewFile(IScope scope, String name, String extension)
    • getRecordFile

      public static File getRecordFile(IScope scope, String name, String extension, String subFolder)
    • getUserRecordFile

      public static File getUserRecordFile(IScope scope, String userVoDFolder, String name)
    • addStream

      public boolean addStream(org.bytedeco.ffmpeg.avcodec.AVCodec codec, org.bytedeco.ffmpeg.avcodec.AVCodecContext codecContext, int streamIndex)
      Add a new stream with the given codec, codecContext, and streamIndex parameters. After adding streams, prepareIO() needs to be called. This method is called by the encoder: after the encoder is opened, it adds its codec context to the muxer
      Parameters:
      codec -
      codecContext -
      streamIndex -
      Returns:
    • getOutputURL

      public String getOutputURL()
    • openIO

      public boolean openIO()
    • prepareIO

      public boolean prepareIO()
      This function may be called by multiple encoders. Make sure that it is executed only once; see the sample implementations for how it is protected. Implement this function with the synchronized keyword in the subclass
      Returns:
    • writeHeader

      public boolean writeHeader()
    • writeTrailer

      public void writeTrailer()
      This function may be called by multiple encoders. Make sure that it is executed only once; see the sample implementations for how it is protected. Implement this function with the synchronized keyword in the subclass
    • clearResource

      protected void clearResource()
    • writePacket

      public void writePacket(org.bytedeco.ffmpeg.avcodec.AVPacket pkt, org.bytedeco.ffmpeg.avformat.AVStream stream)
      Write packets to the output. This function is used by MuxerAdaptor, which is in the Community Edition. Check that outputContext.pb is not null for the FFmpeg-based muxers. Implement this function with the synchronized keyword in the subclass
      Parameters:
      pkt - The content of the data as a AVPacket object
    • logPacketIssue

      public void logPacketIssue(String format, Object... arguments)
    • writePacket

      public void writePacket(org.bytedeco.ffmpeg.avcodec.AVPacket pkt, org.bytedeco.ffmpeg.avcodec.AVCodecContext codecContext)
      Write packets to the output. This function is used in transcoding. It is the replacement of the earlier writePacket(AVPacket) method
      Parameters:
      pkt -
      codecContext -
    • getPacketBufferWithExtradata

      public ByteBuffer getPacketBufferWithExtradata(byte[] extradata, org.bytedeco.ffmpeg.avcodec.AVPacket pkt)
    • setAudioBitreamFilter

      public void setAudioBitreamFilter(String bsfName)
    • getBsfAudioNames

      public Set<String> getBsfAudioNames()
    • setBitstreamFilter

      public void setBitstreamFilter(String bsfName)
    • getBitStreamFilter

      public String getBitStreamFilter()
    • getFile

      public File getFile()
    • getFileName

      public String getFileName()
    • getFormat

      public String getFormat()
    • init

      public void init(IScope scope, String name, int resolution, String subFolder, int videoBitrate)
      Inits the file to write. Multiple encoders can init the muxer; initializing more than once is redundant and has no effect.
    • init

      public void init(IScope scope, String name, int resolution, boolean overrideIfExist, String subFolder, int bitrate)
      Init file name. The file format is NAME[-{DATETIME}][_{RESOLUTION_HEIGHT}p_{BITRATE}kbps].{EXTENSION}. The datetime format is yyyy-MM-dd_HH-mm; "-" is used instead of ":" in HH:mm because a stream filename must not contain the ":" character. Sample naming: stream1-yyyy-MM-dd_HH-mm_480p_500kbps.mp4 if datetime is added; stream1_480p.mp4 if there is no datetime.
      Parameters:
      name - , name of the stream
      scope -
      resolution - height of the stream, if it is zero, then no resolution will be added to resource name
      overrideIfExist - whether override if a file exists with the same name
      bitrate - bitrate of the stream, if it is zero, no bitrate will be added to resource name
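The naming rule above can be sketched in plain Java. This is a hypothetical illustration of the documented format, not the actual getExtendedName implementation:

```java
import java.text.SimpleDateFormat;
import java.util.Date;

// Hypothetical sketch of NAME[-{DATETIME}][_{HEIGHT}p_{BITRATE}kbps].{EXT}:
// resolution is appended only when nonzero, bitrate only when both are nonzero,
// and the datetime uses "-" instead of ":" so filenames stay valid.
class MuxerNameSketch {
    static String buildName(String name, boolean addDateTime,
                            int resolutionHeight, int bitrateKbps, String extension) {
        StringBuilder sb = new StringBuilder(name);
        if (addDateTime) {
            sb.append('-').append(new SimpleDateFormat("yyyy-MM-dd_HH-mm").format(new Date()));
        }
        if (resolutionHeight != 0) {
            sb.append('_').append(resolutionHeight).append('p');
            if (bitrateKbps != 0) {
                sb.append('_').append(bitrateKbps).append("kbps");
            }
        }
        return sb.append(extension).toString();
    }
}
```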
    • setSubfolder

      public void setSubfolder(String subFolder)
    • getAppSettings

      public AppSettings getAppSettings()
    • getAppAdaptor

      public AntMediaApplicationAdapter getAppAdaptor()
    • getExtendedName

      public String getExtendedName(String name, int resolution, int bitrate, String fileNameFormat)
    • extractCustomText

      private String extractCustomText(String fileNameFormat)
    • getResourceFile

      public File getResourceFile(IScope scope, String name, String extension, String subFolder)
    • isAddDateTimeToSourceName

      public boolean isAddDateTimeToSourceName()
    • setAddDateTimeToSourceName

      public void setAddDateTimeToSourceName(boolean addDateTimeToSourceName)
    • addVideoStream

      public boolean addVideoStream(int width, int height, org.bytedeco.ffmpeg.avutil.AVRational timebase, int codecId, int streamIndex, boolean isAVC, org.bytedeco.ffmpeg.avcodec.AVCodecParameters codecpar)
      Add video stream to the muxer with direct parameters. This method is called when there is a WebRTC ingest and there is no adaptive streaming
      Parameters:
      width - , video width
      height - , video height
      codecId - , codec id of the stream
      streamIndex - , stream index
      isAVC - , true if packets are in AVC format, false if in annexb format
      Returns:
      true if successful, false if failed
    • addAudioStream

      public boolean addAudioStream(int sampleRate, org.bytedeco.ffmpeg.avutil.AVChannelLayout channelLayout, int codecId, int streamIndex)
      Add audio stream to the muxer.
      Parameters:
      sampleRate -
      channelLayout -
      codecId -
      streamIndex - , is the stream index of source
      Returns:
    • avNewStream

      public org.bytedeco.ffmpeg.avformat.AVStream avNewStream(org.bytedeco.ffmpeg.avformat.AVFormatContext context)
    • addStream

      public boolean addStream(org.bytedeco.ffmpeg.avcodec.AVCodecParameters codecParameters, org.bytedeco.ffmpeg.avutil.AVRational timebase, int streamIndex)
      Add stream to the muxer. This method is called for direct muxing, for instance from RTMP/SRT ingest or stream pull to HLS, MP4, DASH, or WebRTC muxing
      Parameters:
      codecParameters -
      timebase -
      streamIndex - , is the stream index of the source. Sometimes source and target stream index do not match
      Returns:
    • initAudioBitstreamFilter

      public org.bytedeco.ffmpeg.avcodec.AVBSFContext initAudioBitstreamFilter(String bsfAudioName, org.bytedeco.ffmpeg.avcodec.AVCodecParameters codecParameters, org.bytedeco.ffmpeg.avutil.AVRational timebase)
    • initVideoBitstreamFilter

      public org.bytedeco.ffmpeg.avcodec.AVBSFContext initVideoBitstreamFilter(String bsfVideoName, org.bytedeco.ffmpeg.avcodec.AVCodecParameters codecParameters, org.bytedeco.ffmpeg.avutil.AVRational timebase)
    • initBitstreamFilter

      private org.bytedeco.ffmpeg.avcodec.AVBSFContext initBitstreamFilter(String bsfVideoName, org.bytedeco.ffmpeg.avcodec.AVCodecParameters codecParameters, org.bytedeco.ffmpeg.avutil.AVRational timebase)
    • writeVideoBuffer

      public void writeVideoBuffer(ByteBuffer encodedVideoFrame, long dts, int frameRotation, int streamIndex, boolean isKeyFrame, long firstFrameTimeStamp, long pts)
    • writeVideoBuffer

      public void writeVideoBuffer(Muxer.VideoBuffer buffer)
    • writeAudioBuffer

      public void writeAudioBuffer(ByteBuffer audioFrame, int streamIndex, long timestamp)
    • getRegisteredStreamIndexList

      public List<Integer> getRegisteredStreamIndexList()
    • setIsRunning

      public void setIsRunning(AtomicBoolean isRunning)
    • setOption

      public void setOption(String optionName, String value)
    • getOptionDictionary

      public org.bytedeco.ffmpeg.avutil.AVDictionary getOptionDictionary()
    • isCodecSupported

      public abstract boolean isCodecSupported(int codecId)
    • getOutputFormatContext

      public abstract org.bytedeco.ffmpeg.avformat.AVFormatContext getOutputFormatContext()
    • checkToDropPacket

      public boolean checkToDropPacket(org.bytedeco.ffmpeg.avcodec.AVPacket pkt, int codecType)
      Returns the decision about whether to drop the packet
      Parameters:
      pkt -
      codecType -
      Returns:
      true to drop the packet, false to not drop packet
    • getVideoWidth

      public int getVideoWidth()
    • getVideoHeight

      public int getVideoHeight()
    • getAverageBitrate

      public long getAverageBitrate()
    • writePacket

      protected void writePacket(org.bytedeco.ffmpeg.avcodec.AVPacket pkt, org.bytedeco.ffmpeg.avutil.AVRational inputTimebase, org.bytedeco.ffmpeg.avutil.AVRational outputTimebase, int codecType)
      All other writePacket functions call this function to do the actual work
      Parameters:
      pkt - Content of the data in AVPacket class
      inputTimebase - input time base is required to calculate the correct dts and pts values for the container
      outputTimebase - output time base is required to calculate the correct dts and pts values for the container
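The pts/dts rescaling between input and output timebases amounts to multiplying the timestamp by the ratio of the two rationals, in the spirit of FFmpeg's av_rescale_q. A hypothetical plain-Java sketch (illustrative only; the real code uses the FFmpeg API and guards against overflow and rounding more carefully):

```java
// Hypothetical sketch: rescale a timestamp from timebase inNum/inDen to
// timebase outNum/outDen, i.e. ts * (inNum/inDen) / (outNum/outDen).
class TimebaseRescale {
    static long rescale(long ts, int inNum, int inDen, int outNum, int outDen) {
        // ts * inNum * outDen / (inDen * outNum), done in long arithmetic
        return ts * inNum * outDen / ((long) inDen * outNum);
    }
}
```

For example, a timestamp in the 1/90000 timebase common for video maps to milliseconds (1/1000 timebase) by multiplying by 1000/90000.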
    • writeDataFrame

      public void writeDataFrame(org.bytedeco.ffmpeg.avcodec.AVPacket pkt, org.bytedeco.ffmpeg.avformat.AVFormatContext context)
    • addExtradataIfRequired

      public void addExtradataIfRequired(org.bytedeco.ffmpeg.avcodec.AVPacket pkt, boolean isKeyFrame)
    • writeVideoFrame

      protected void writeVideoFrame(org.bytedeco.ffmpeg.avcodec.AVPacket pkt, org.bytedeco.ffmpeg.avformat.AVFormatContext context)
    • writeAudioFrame

      protected void writeAudioFrame(org.bytedeco.ffmpeg.avcodec.AVPacket pkt, org.bytedeco.ffmpeg.avutil.AVRational inputTimebase, org.bytedeco.ffmpeg.avutil.AVRational outputTimebase, org.bytedeco.ffmpeg.avformat.AVFormatContext context, long dts)
    • getDurationInMs

      public static long getDurationInMs(File f, String streamId)
    • getDurationInMs

      public static long getDurationInMs(String url, String streamId)
      Parameters:
      url -
      streamId -
      Returns:
      the duration in milliseconds; -1 if the duration is not available in the stream; -2 if the input cannot be opened; -3 if stream info is not found
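A caller can branch on these documented return codes as sketched below. The helper is hypothetical (not part of the Muxer API); only the code values themselves come from the documentation above:

```java
// Hypothetical helper mapping getDurationInMs error codes to messages.
class DurationResult {
    static String describe(long durationMs) {
        if (durationMs >= 0) return "duration " + durationMs + " ms";
        if (durationMs == -1) return "duration not available in the stream";
        if (durationMs == -2) return "input could not be opened";
        if (durationMs == -3) return "stream info not found";
        return "unknown error";
    }
}
```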
    • getErrorDefinition

      public static String getErrorDefinition(int errorCode)
    • contextWillChange

      public void contextWillChange(org.bytedeco.ffmpeg.avcodec.AVCodecContext codecContext, int streamIndex)
      This is called when the current context will be changed or deleted soon. It is called by the encoder, likely due to an aspect ratio change. After this method has been called, contextChanged(AVCodecContext, int) should be called
      Parameters:
      codecContext - the current context that will be changed/deleted soon
      streamIndex -
    • contextChanged

      public void contextChanged(org.bytedeco.ffmpeg.avcodec.AVCodecContext codecContext, int streamIndex)
      It is called when the codecContext for the stream index has changed. contextWillChange(AVCodecContext, int) is called before this method.
      Parameters:
      codecContext -
      streamIndex -
    • getInputTimeBaseMap

      public Map<Integer,org.bytedeco.ffmpeg.avutil.AVRational> getInputTimeBaseMap()
    • getTmpPacket

      public org.bytedeco.ffmpeg.avcodec.AVPacket getTmpPacket()
    • getIsRunning

      public AtomicBoolean getIsRunning()
    • getCurrentVoDTimeStamp

      public long getCurrentVoDTimeStamp()
    • setCurrentVoDTimeStamp

      public void setCurrentVoDTimeStamp(long currentVoDTimeStamp)
    • getResolution

      public int getResolution()
    • getLastPts

      public long getLastPts()
    • replaceDoubleSlashesWithSingleSlash

      public static String replaceDoubleSlashesWithSingleSlash(String url)
    • getVideoNotWrittenCount

      public long getVideoNotWrittenCount()
    • getAudioNotWrittenCount

      public long getAudioNotWrittenCount()
    • writeMetaData

      public void writeMetaData(String data, long dts)
    • getVideoCodecId

      public int getVideoCodecId()
    • getSubFolder

      public String getSubFolder()