author    Linus Torvalds <torvalds@linux-foundation.org>  2017-07-06 11:15:19 -0700
committer Linus Torvalds <torvalds@linux-foundation.org>  2017-07-06 11:15:19 -0700
commit    0b49ce5a40702bf78a5f80076312b244785e9a2f (patch)
tree      1862fa8a30c3efbb539470af5fad1ee2fa8fe2e0 /Documentation/media
parent    920f2ecdf6c3b3526f60fbd38c68597953cad3ee (diff)
parent    2a2599c663684a1142dae0bff7737e125891ae6d (diff)
Merge tag 'media/v4.13-1' of git://git.kernel.org/pub/scm/linux/kernel/git/mchehab/linux-media
Pull media updates from Mauro Carvalho Chehab:

 - addition of fwnode support at V4L2 core
 - addition of a few more SDR formats
 - new imx driver to support i.MX6 cameras
 - new driver for Qualcomm venus codecs
 - new I2C sensor drivers: dw9714, max2175, ov13858, ov5640
 - new CEC driver: stm32-cec
 - some improvements to DVB frontend documentation and a few fixups
 - several driver improvements and fixups

* tag 'media/v4.13-1' of git://git.kernel.org/pub/scm/linux/kernel/git/mchehab/linux-media: (361 commits)
  [media] media: entity: Catch unbalanced media_pipeline_stop calls
  [media] media/uapi/v4l: clarify cropcap/crop/selection behavior
  [media] v4l2-ioctl/exynos: fix G/S_SELECTION's type handling
  [media] vimc: sen: Declare vimc_sen_video_ops as static
  [media] vimc: sca: Add scaler
  [media] vimc: deb: Add debayer filter
  [media] vimc: Subdevices as modules
  [media] vimc: cap: Support several image formats
  [media] vimc: sen: Support several image formats
  [media] vimc: common: Add vimc_colorimetry_clamp
  [media] vimc: common: Add vimc_link_validate
  [media] vimc: common: Add vimc_pipeline_s_stream helper
  [media] vimc: common: Add vimc_ent_sd_* helper
  [media] vimc: Move common code from the core
  [media] vimc: sen: Integrate the tpg on the sensor
  [media] media: i2c: ov772x: Force use of SCCB protocol
  [media] dvb uapi docs: enums are passed by value, not reference
  [media] dvb: don't use 'time_t' in event ioctl
  [media] media: venus: enable building with COMPILE_TEST
  [media] af9013: refactor power control
  ...
Diffstat (limited to 'Documentation/media')
-rw-r--r--  Documentation/media/kapi/cec-core.rst                        |  18
-rw-r--r--  Documentation/media/kapi/v4l2-core.rst                       |   2
-rw-r--r--  Documentation/media/kapi/v4l2-fwnode.rst                     |   3
-rw-r--r--  Documentation/media/kapi/v4l2-of.rst                         |   3
-rw-r--r--  Documentation/media/uapi/cec/cec-ioc-adap-g-caps.rst         |   8
-rw-r--r--  Documentation/media/uapi/dvb/fe-diseqc-send-burst.rst        |   4
-rw-r--r--  Documentation/media/uapi/dvb/fe-set-tone.rst                 |   4
-rw-r--r--  Documentation/media/uapi/dvb/fe-set-voltage.rst              |   7
-rw-r--r--  Documentation/media/uapi/mediactl/media-ioc-g-topology.rst   |   8
-rw-r--r--  Documentation/media/uapi/mediactl/media-types.rst            |  21
-rw-r--r--  Documentation/media/uapi/v4l/control.rst                     |   6
-rw-r--r--  Documentation/media/uapi/v4l/extended-controls.rst           |   9
-rw-r--r--  Documentation/media/uapi/v4l/pixfmt-sdr-pcu16be.rst          |  55
-rw-r--r--  Documentation/media/uapi/v4l/pixfmt-sdr-pcu18be.rst          |  55
-rw-r--r--  Documentation/media/uapi/v4l/pixfmt-sdr-pcu20be.rst          |  54
-rw-r--r--  Documentation/media/uapi/v4l/sdr-formats.rst                 |   3
-rw-r--r--  Documentation/media/uapi/v4l/vidioc-cropcap.rst              |  23
-rw-r--r--  Documentation/media/uapi/v4l/vidioc-g-crop.rst               |  22
-rw-r--r--  Documentation/media/uapi/v4l/vidioc-g-selection.rst          |  22
-rw-r--r--  Documentation/media/v4l-drivers/imx.rst                      | 614
-rw-r--r--  Documentation/media/v4l-drivers/index.rst                    |   1
-rw-r--r--  Documentation/media/v4l-drivers/max2175.rst                  |  62
22 files changed, 957 insertions, 47 deletions
diff --git a/Documentation/media/kapi/cec-core.rst b/Documentation/media/kapi/cec-core.rst
index 7a04c53..8a65c69 100644
--- a/Documentation/media/kapi/cec-core.rst
+++ b/Documentation/media/kapi/cec-core.rst
@@ -194,6 +194,11 @@ When a transmit finished (successfully or otherwise):
void cec_transmit_done(struct cec_adapter *adap, u8 status, u8 arb_lost_cnt,
u8 nack_cnt, u8 low_drive_cnt, u8 error_cnt);
+or:
+
+.. c:function::
+ void cec_transmit_attempt_done(struct cec_adapter *adap, u8 status);
+
The status can be one of:
CEC_TX_STATUS_OK:
@@ -231,6 +236,11 @@ to 1, if the hardware does support retry then either set these counters to
0 if the hardware provides no feedback of which errors occurred and how many
times, or fill in the correct values as reported by the hardware.
+The cec_transmit_attempt_done() function is a helper for cases where the
+hardware never retries, so the transmit is always for just a single
+attempt. It will call cec_transmit_done() in turn, filling in 1 for the
+count argument corresponding to the status, or all zeroes if the status
+was OK.
+
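A minimal sketch of how a driver whose hardware never retries might report
completion from its transmit-done interrupt handler. The ``my_cec_dev``
structure and the ``my_hw_*`` register helpers are hypothetical placeholders,
not part of the CEC core API:

.. code-block:: c

    #include <linux/interrupt.h>
    #include <media/cec.h>

    static irqreturn_t my_cec_tx_irq(int irq, void *priv)
    {
        struct my_cec_dev *dev = priv;    /* hypothetical driver state */
        u8 status;

        /* Map the hardware result onto a single CEC_TX_STATUS_* value. */
        if (my_hw_tx_ok(dev))
            status = CEC_TX_STATUS_OK;
        else if (my_hw_tx_nacked(dev))
            status = CEC_TX_STATUS_NACK;
        else
            status = CEC_TX_STATUS_ERROR;

        /*
         * Single attempt only: the core fills in 1 for the counter that
         * matches 'status', or all zeroes when the status is OK.
         */
        cec_transmit_attempt_done(dev->adap, status);
        return IRQ_HANDLED;
    }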
When a CEC message was received:
.. c:function::
@@ -307,6 +317,14 @@ to another valid physical address, then this function will first set the
address to CEC_PHYS_ADDR_INVALID before enabling the new physical address.
.. c:function::
+ void cec_s_phys_addr_from_edid(struct cec_adapter *adap,
+ const struct edid *edid);
+
+A helper function that extracts the physical address from the edid struct
+and calls cec_s_phys_addr() with that address, or CEC_PHYS_ADDR_INVALID
+if the EDID did not contain a physical address or edid was a NULL pointer.
+
+.. c:function::
int cec_s_log_addrs(struct cec_adapter *adap,
struct cec_log_addrs *log_addrs, bool block);
diff --git a/Documentation/media/kapi/v4l2-core.rst b/Documentation/media/kapi/v4l2-core.rst
index d8f6c46..c7434f3 100644
--- a/Documentation/media/kapi/v4l2-core.rst
+++ b/Documentation/media/kapi/v4l2-core.rst
@@ -19,7 +19,7 @@ Video4Linux devices
v4l2-mc
v4l2-mediabus
v4l2-mem2mem
- v4l2-of
+ v4l2-fwnode
v4l2-rect
v4l2-tuner
v4l2-common
diff --git a/Documentation/media/kapi/v4l2-fwnode.rst b/Documentation/media/kapi/v4l2-fwnode.rst
new file mode 100644
index 0000000..6c8bccd
--- /dev/null
+++ b/Documentation/media/kapi/v4l2-fwnode.rst
@@ -0,0 +1,3 @@
+V4L2 fwnode kAPI
+^^^^^^^^^^^^^^^^
+.. kernel-doc:: include/media/v4l2-fwnode.h
diff --git a/Documentation/media/kapi/v4l2-of.rst b/Documentation/media/kapi/v4l2-of.rst
deleted file mode 100644
index 1ddf76b..0000000
--- a/Documentation/media/kapi/v4l2-of.rst
+++ /dev/null
@@ -1,3 +0,0 @@
-V4L2 Open Firmware kAPI
-^^^^^^^^^^^^^^^^^^^^^^^
-.. kernel-doc:: include/media/v4l2-of.h
diff --git a/Documentation/media/uapi/cec/cec-ioc-adap-g-caps.rst b/Documentation/media/uapi/cec/cec-ioc-adap-g-caps.rst
index a0e961f..6d7bf7b 100644
--- a/Documentation/media/uapi/cec/cec-ioc-adap-g-caps.rst
+++ b/Documentation/media/uapi/cec/cec-ioc-adap-g-caps.rst
@@ -113,6 +113,14 @@ returns the information to the application. The ioctl never fails.
- 0x00000020
- The CEC hardware can monitor all messages, not just directed and
broadcast messages.
+ * .. _`CEC-CAP-NEEDS-HPD`:
+
+ - ``CEC_CAP_NEEDS_HPD``
+ - 0x00000040
+ - The CEC hardware is only active if the HDMI Hotplug Detect pin is
+ high. This makes it impossible to use CEC to wake up displays that
+ set the HPD pin low when in standby mode, but keep the CEC bus
+ alive.
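From userspace the new capability bit can be tested with the ioctl documented
on this page. A minimal sketch, assuming the adapter is reachable at
``/dev/cec0``:

.. code-block:: c

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <linux/cec.h>

    int main(void)
    {
        struct cec_caps caps = {};
        int fd = open("/dev/cec0", O_RDWR);

        if (fd < 0 || ioctl(fd, CEC_ADAP_G_CAPS, &caps) < 0)
            return 1;
        if (caps.capabilities & CEC_CAP_NEEDS_HPD)
            printf("CEC only works while the HPD pin is high\n");
        return 0;
    }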
diff --git a/Documentation/media/uapi/dvb/fe-diseqc-send-burst.rst b/Documentation/media/uapi/dvb/fe-diseqc-send-burst.rst
index 26272f2..e962f6e 100644
--- a/Documentation/media/uapi/dvb/fe-diseqc-send-burst.rst
+++ b/Documentation/media/uapi/dvb/fe-diseqc-send-burst.rst
@@ -15,7 +15,7 @@ FE_DISEQC_SEND_BURST - Sends a 22KHz tone burst for 2x1 mini DiSEqC satellite se
Synopsis
========
-.. c:function:: int ioctl( int fd, FE_DISEQC_SEND_BURST, enum fe_sec_mini_cmd *tone )
+.. c:function:: int ioctl( int fd, FE_DISEQC_SEND_BURST, enum fe_sec_mini_cmd tone )
:name: FE_DISEQC_SEND_BURST
@@ -26,7 +26,7 @@ Arguments
File descriptor returned by :ref:`open() <frontend_f_open>`.
``tone``
- pointer to enum :c:type:`fe_sec_mini_cmd`
+ an integer enumerated value described at :c:type:`fe_sec_mini_cmd`
Description
diff --git a/Documentation/media/uapi/dvb/fe-set-tone.rst b/Documentation/media/uapi/dvb/fe-set-tone.rst
index bea1932..84e4da3 100644
--- a/Documentation/media/uapi/dvb/fe-set-tone.rst
+++ b/Documentation/media/uapi/dvb/fe-set-tone.rst
@@ -15,7 +15,7 @@ FE_SET_TONE - Sets/resets the generation of the continuous 22kHz tone.
Synopsis
========
-.. c:function:: int ioctl( int fd, FE_SET_TONE, enum fe_sec_tone_mode *tone )
+.. c:function:: int ioctl( int fd, FE_SET_TONE, enum fe_sec_tone_mode tone )
:name: FE_SET_TONE
@@ -26,7 +26,7 @@ Arguments
File descriptor returned by :ref:`open() <frontend_f_open>`.
``tone``
- pointer to enum :c:type:`fe_sec_tone_mode`
+ an integer enumerated value described at :c:type:`fe_sec_tone_mode`
Description
diff --git a/Documentation/media/uapi/dvb/fe-set-voltage.rst b/Documentation/media/uapi/dvb/fe-set-voltage.rst
index fcf6f38..052f316 100644
--- a/Documentation/media/uapi/dvb/fe-set-voltage.rst
+++ b/Documentation/media/uapi/dvb/fe-set-voltage.rst
@@ -15,7 +15,7 @@ FE_SET_VOLTAGE - Allow setting the DC level sent to the antenna subsystem.
Synopsis
========
-.. c:function:: int ioctl( int fd, FE_SET_VOLTAGE, enum fe_sec_voltage *voltage )
+.. c:function:: int ioctl( int fd, FE_SET_VOLTAGE, enum fe_sec_voltage voltage )
:name: FE_SET_VOLTAGE
@@ -26,10 +26,7 @@ Arguments
File descriptor returned by :ref:`open() <frontend_f_open>`.
``voltage``
- pointer to enum :c:type:`fe_sec_voltage`
-
- Valid values are described at enum
- :c:type:`fe_sec_voltage`.
+ an integer enumerated value described at :c:type:`fe_sec_voltage`
Description
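The practical consequence of the three corrections above is that these SEC
ioctls take the enum argument by value, not through a pointer. A minimal
userspace sketch (the particular tone-off / 13 V / mini-A combination is only
an illustrative choice):

.. code-block:: c

    #include <sys/ioctl.h>
    #include <linux/dvb/frontend.h>

    /* The enum values are passed directly, not as pointers to enums. */
    static int select_low_band_vertical(int fd)
    {
        if (ioctl(fd, FE_SET_TONE, SEC_TONE_OFF) < 0)
            return -1;
        if (ioctl(fd, FE_SET_VOLTAGE, SEC_VOLTAGE_13) < 0)
            return -1;
        return ioctl(fd, FE_DISEQC_SEND_BURST, SEC_MINI_A);
    }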
diff --git a/Documentation/media/uapi/mediactl/media-ioc-g-topology.rst b/Documentation/media/uapi/mediactl/media-ioc-g-topology.rst
index 48c9531..add8281 100644
--- a/Documentation/media/uapi/mediactl/media-ioc-g-topology.rst
+++ b/Documentation/media/uapi/mediactl/media-ioc-g-topology.rst
@@ -241,7 +241,7 @@ desired arrays with the media graph elements.
.. c:type:: media_v2_intf_devnode
-.. flat-table:: struct media_v2_interface
+.. flat-table:: struct media_v2_intf_devnode
:header-rows: 0
:stub-columns: 0
:widths: 1 2 8
@@ -312,7 +312,7 @@ desired arrays with the media graph elements.
.. c:type:: media_v2_link
-.. flat-table:: struct media_v2_pad
+.. flat-table:: struct media_v2_link
:header-rows: 0
:stub-columns: 0
:widths: 1 2 8
@@ -324,7 +324,7 @@ desired arrays with the media graph elements.
- ``id``
- - Unique ID for the pad.
+ - Unique ID for the link.
- .. row 2
@@ -334,7 +334,7 @@ desired arrays with the media graph elements.
- On pad to pad links: unique ID for the source pad.
- On interface to entity links: unique ID for the interface.
+ On interface to entity links: unique ID for the entity.
- .. row 3
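For context, retrieving the link array whose fields are corrected above uses
the usual two-call ``MEDIA_IOC_G_TOPOLOGY`` pattern: the first call reports the
element counts, the second fills caller-allocated arrays. A hedged sketch in
which only the links array is requested and ``fd`` is an open media controller
node:

.. code-block:: c

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/ioctl.h>
    #include <linux/media.h>

    static int dump_links(int fd)
    {
        struct media_v2_topology topo = {};
        struct media_v2_link *links;
        unsigned int i;
        int ret = -1;

        /* First call: only the element counts are filled in. */
        if (ioctl(fd, MEDIA_IOC_G_TOPOLOGY, &topo) < 0)
            return -1;

        links = calloc(topo.num_links, sizeof(*links));
        if (!links)
            return -1;

        /* Second call: the kernel fills the user-provided array. */
        topo.ptr_links = (uintptr_t)links;
        if (ioctl(fd, MEDIA_IOC_G_TOPOLOGY, &topo) == 0) {
            for (i = 0; i < topo.num_links; i++)
                printf("link %u: %u -> %u (flags 0x%x)\n",
                       links[i].id, links[i].source_id,
                       links[i].sink_id, links[i].flags);
            ret = 0;
        }
        free(links);
        return ret;
    }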
diff --git a/Documentation/media/uapi/mediactl/media-types.rst b/Documentation/media/uapi/mediactl/media-types.rst
index 2a5164a..7107856 100644
--- a/Documentation/media/uapi/mediactl/media-types.rst
+++ b/Documentation/media/uapi/mediactl/media-types.rst
@@ -299,6 +299,27 @@ Types and flags used to represent the media graph elements
received on its sink pad and outputs the statistics data on
its source pad.
+ - .. row 29
+
+ .. _MEDIA-ENT-F-VID-MUX:
+
+ - ``MEDIA_ENT_F_VID_MUX``
+
+ - Video multiplexer. An entity capable of multiplexing must have at
+ least two sink pads and one source pad, and must pass the video
+ frame(s) received from the active sink pad to the source pad.
+
+ - .. row 30
+
+ .. _MEDIA-ENT-F-VID-IF-BRIDGE:
+
+ - ``MEDIA_ENT_F_VID_IF_BRIDGE``
+
+ - Video interface bridge. A video interface bridge entity must have at
+ least one sink pad and at least one source pad. It receives video
+ frames on its sink pad from an input video bus of one type (HDMI, eDP,
+ MIPI CSI-2, ...), and outputs them on its source pad to an output
+ video bus of another type (eDP, MIPI CSI-2, parallel, ...).
.. tabularcolumns:: |p{5.5cm}|p{12.0cm}|
diff --git a/Documentation/media/uapi/v4l/control.rst b/Documentation/media/uapi/v4l/control.rst
index 51112ba..c1e6adb 100644
--- a/Documentation/media/uapi/v4l/control.rst
+++ b/Documentation/media/uapi/v4l/control.rst
@@ -137,6 +137,12 @@ Control IDs
``V4L2_CID_GAIN`` ``(integer)``
Gain control.
+ Primarily used to control gain on e.g. TV tuners but also on
+ webcams. Most devices control only digital gain with this control
+ but on some this could include analogue gain as well. Devices that
+ recognise the difference between digital and analogue gain use
+ controls ``V4L2_CID_DIGITAL_GAIN`` and ``V4L2_CID_ANALOGUE_GAIN``.
+
``V4L2_CID_HFLIP`` ``(boolean)``
Mirror the picture horizontally.
diff --git a/Documentation/media/uapi/v4l/extended-controls.rst b/Documentation/media/uapi/v4l/extended-controls.rst
index abb1057..9acc9cad 100644
--- a/Documentation/media/uapi/v4l/extended-controls.rst
+++ b/Documentation/media/uapi/v4l/extended-controls.rst
@@ -2019,7 +2019,7 @@ enum v4l2_exposure_auto_type -
dynamically vary the frame rate. By default this feature is disabled
(0) and the frame rate must remain constant.
-``V4L2_CID_EXPOSURE_BIAS (integer menu)``
+``V4L2_CID_AUTO_EXPOSURE_BIAS (integer menu)``
Determines the automatic exposure compensation, it is effective only
when ``V4L2_CID_EXPOSURE_AUTO`` control is set to ``AUTO``,
``SHUTTER_PRIORITY`` or ``APERTURE_PRIORITY``. It is expressed in
@@ -3021,6 +3021,13 @@ Image Process Control IDs
The video deinterlacing mode (such as Bob, Weave, ...). The menu items are
driver specific and are documented in :ref:`v4l-drivers`.
+``V4L2_CID_DIGITAL_GAIN (integer)``
+ Digital gain is the value by which all colour components
+ are multiplied. Typically the digital gain applied is the
+ control value divided by e.g. 0x100, meaning that to get no
+ digital gain the control value needs to be 0x100. The no-gain
+ configuration is also typically the default.
+
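A small sketch of applying this convention from userspace; the 0x100 = no-gain
divisor is the typical case described above, not something guaranteed by the
API, so check the control's range and default for the device at hand:

.. code-block:: c

    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    /* Request 1.5x digital gain, assuming 0x100 means unity gain. */
    static int set_digital_gain_1_5x(int fd)
    {
        struct v4l2_control ctrl = {
            .id = V4L2_CID_DIGITAL_GAIN,
            .value = 0x180,    /* 1.5 * 0x100 */
        };

        return ioctl(fd, VIDIOC_S_CTRL, &ctrl);
    }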
.. _dv-controls:
diff --git a/Documentation/media/uapi/v4l/pixfmt-sdr-pcu16be.rst b/Documentation/media/uapi/v4l/pixfmt-sdr-pcu16be.rst
new file mode 100644
index 0000000..2de1b1a
--- /dev/null
+++ b/Documentation/media/uapi/v4l/pixfmt-sdr-pcu16be.rst
@@ -0,0 +1,55 @@
+.. -*- coding: utf-8; mode: rst -*-
+
+.. _V4L2-SDR-FMT-PCU16BE:
+
+******************************
+V4L2_SDR_FMT_PCU16BE ('PC16')
+******************************
+
+Planar complex unsigned 16-bit big endian IQ sample
+
+Description
+===========
+
+This format contains a sequence of complex number samples. Each complex
+number consists of two parts, called In-phase and Quadrature (IQ). Both
+I and Q are represented as 16 bit unsigned big endian numbers stored in
+a 32 bit space. The remaining unused bits within the 32 bit space are
+padded with 0. The I values are stored first and the Q values start at
+an offset equal to half of the buffer size, i.e. offset = buffersize/2.
+Out of the 16 bits, bits 15:2 (14 bits) carry data and bits 1:0 (2 bits)
+can take any value.
+
+**Byte Order.**
+Each cell is one byte.
+
+.. flat-table::
+ :header-rows: 1
+ :stub-columns: 0
+
+ * - Offset:
+ - Byte B0
+ - Byte B1
+ - Byte B2
+ - Byte B3
+ * - start + 0:
+ - I'\ :sub:`0[13:6]`
+ - I'\ :sub:`0[5:0]; B1[1:0]=pad`
+ - pad
+ - pad
+ * - start + 4:
+ - I'\ :sub:`1[13:6]`
+ - I'\ :sub:`1[5:0]; B1[1:0]=pad`
+ - pad
+ - pad
+ * - ...
+ * - start + offset:
+ - Q'\ :sub:`0[13:6]`
+ - Q'\ :sub:`0[5:0]; B1[1:0]=pad`
+ - pad
+ - pad
+ * - start + offset + 4:
+ - Q'\ :sub:`1[13:6]`
+ - Q'\ :sub:`1[5:0]; B1[1:0]=pad`
+ - pad
+ - pad
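An illustrative helper that extracts the n-th I/Q pair from a buffer laid out
as above; the same approach applies to the 18 and 20 bit variants that follow,
with the field widths adjusted accordingly:

.. code-block:: c

    #include <stddef.h>
    #include <stdint.h>

    /*
     * Extract the n-th I/Q pair from a V4L2_SDR_FMT_PCU16BE buffer.
     * Each sample occupies 4 bytes; the 14 significant bits sit in
     * bits 15:2 of the first big-endian 16-bit word of the slot.
     */
    static void pcu16be_sample(const uint8_t *buf, size_t buffersize,
                               size_t n, uint16_t *i, uint16_t *q)
    {
        const uint8_t *ip = buf + 4 * n;
        const uint8_t *qp = buf + buffersize / 2 + 4 * n;

        *i = (uint16_t)(((ip[0] << 8) | ip[1]) >> 2);
        *q = (uint16_t)(((qp[0] << 8) | qp[1]) >> 2);
    }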
diff --git a/Documentation/media/uapi/v4l/pixfmt-sdr-pcu18be.rst b/Documentation/media/uapi/v4l/pixfmt-sdr-pcu18be.rst
new file mode 100644
index 0000000..da8b26b
--- /dev/null
+++ b/Documentation/media/uapi/v4l/pixfmt-sdr-pcu18be.rst
@@ -0,0 +1,55 @@
+.. -*- coding: utf-8; mode: rst -*-
+
+.. _V4L2-SDR-FMT-PCU18BE:
+
+******************************
+V4L2_SDR_FMT_PCU18BE ('PC18')
+******************************
+
+Planar complex unsigned 18-bit big endian IQ sample
+
+Description
+===========
+
+This format contains a sequence of complex number samples. Each complex
+number consists of two parts, called In-phase and Quadrature (IQ). Both
+I and Q are represented as 18 bit unsigned big endian numbers stored in
+a 32 bit space. The remaining unused bits within the 32 bit space are
+padded with 0. The I values are stored first and the Q values start at
+an offset equal to half of the buffer size, i.e. offset = buffersize/2.
+Out of the 18 bits, bits 17:2 (16 bits) carry data and bits 1:0 (2 bits)
+can take any value.
+
+**Byte Order.**
+Each cell is one byte.
+
+.. flat-table::
+ :header-rows: 1
+ :stub-columns: 0
+
+ * - Offset:
+ - Byte B0
+ - Byte B1
+ - Byte B2
+ - Byte B3
+ * - start + 0:
+ - I'\ :sub:`0[17:10]`
+ - I'\ :sub:`0[9:2]`
+ - I'\ :sub:`0[1:0]; B2[5:0]=pad`
+ - pad
+ * - start + 4:
+ - I'\ :sub:`1[17:10]`
+ - I'\ :sub:`1[9:2]`
+ - I'\ :sub:`1[1:0]; B2[5:0]=pad`
+ - pad
+ * - ...
+ * - start + offset:
+ - Q'\ :sub:`0[17:10]`
+ - Q'\ :sub:`0[9:2]`
+ - Q'\ :sub:`0[1:0]; B2[5:0]=pad`
+ - pad
+ * - start + offset + 4:
+ - Q'\ :sub:`1[17:10]`
+ - Q'\ :sub:`1[9:2]`
+ - Q'\ :sub:`1[1:0]; B2[5:0]=pad`
+ - pad
diff --git a/Documentation/media/uapi/v4l/pixfmt-sdr-pcu20be.rst b/Documentation/media/uapi/v4l/pixfmt-sdr-pcu20be.rst
new file mode 100644
index 0000000..5499eed
--- /dev/null
+++ b/Documentation/media/uapi/v4l/pixfmt-sdr-pcu20be.rst
@@ -0,0 +1,54 @@
+.. -*- coding: utf-8; mode: rst -*-
+.. _V4L2-SDR-FMT-PCU20BE:
+
+******************************
+V4L2_SDR_FMT_PCU20BE ('PC20')
+******************************
+
+Planar complex unsigned 20-bit big endian IQ sample
+
+Description
+===========
+
+This format contains a sequence of complex number samples. Each complex
+number consists of two parts, called In-phase and Quadrature (IQ). Both
+I and Q are represented as 20 bit unsigned big endian numbers stored in
+a 32 bit space. The remaining unused bits within the 32 bit space are
+padded with 0. The I values are stored first and the Q values start at
+an offset equal to half of the buffer size, i.e. offset = buffersize/2.
+Out of the 20 bits, bits 19:2 (18 bits) carry data and bits 1:0 (2 bits)
+can take any value.
+
+**Byte Order.**
+Each cell is one byte.
+
+.. flat-table::
+ :header-rows: 1
+ :stub-columns: 0
+
+ * - Offset:
+ - Byte B0
+ - Byte B1
+ - Byte B2
+ - Byte B3
+ * - start + 0:
+ - I'\ :sub:`0[19:12]`
+ - I'\ :sub:`0[11:4]`
+ - I'\ :sub:`0[3:0]; B2[3:0]=pad`
+ - pad
+ * - start + 4:
+ - I'\ :sub:`1[19:12]`
+ - I'\ :sub:`1[11:4]`
+ - I'\ :sub:`1[3:0]; B2[3:0]=pad`
+ - pad
+ * - ...
+ * - start + offset:
+ - Q'\ :sub:`0[19:12]`
+ - Q'\ :sub:`0[11:4]`
+ - Q'\ :sub:`0[3:0]; B2[3:0]=pad`
+ - pad
+ * - start + offset + 4:
+ - Q'\ :sub:`1[19:12]`
+ - Q'\ :sub:`1[11:4]`
+ - Q'\ :sub:`1[3:0]; B2[3:0]=pad`
+ - pad
diff --git a/Documentation/media/uapi/v4l/sdr-formats.rst b/Documentation/media/uapi/v4l/sdr-formats.rst
index f863c08..2037f5b 100644
--- a/Documentation/media/uapi/v4l/sdr-formats.rst
+++ b/Documentation/media/uapi/v4l/sdr-formats.rst
@@ -17,3 +17,6 @@ These formats are used for :ref:`SDR <sdr>` interface only.
pixfmt-sdr-cs08
pixfmt-sdr-cs14le
pixfmt-sdr-ru12le
+ pixfmt-sdr-pcu16be
+ pixfmt-sdr-pcu18be
+ pixfmt-sdr-pcu20be
diff --git a/Documentation/media/uapi/v4l/vidioc-cropcap.rst b/Documentation/media/uapi/v4l/vidioc-cropcap.rst
index f21a69b..0f80d5c 100644
--- a/Documentation/media/uapi/v4l/vidioc-cropcap.rst
+++ b/Documentation/media/uapi/v4l/vidioc-cropcap.rst
@@ -39,17 +39,10 @@ structure. Drivers fill the rest of the structure. The results are
constant except when switching the video standard. Remember this switch
can occur implicit when switching the video input or output.
-Do not use the multiplanar buffer types. Use
-``V4L2_BUF_TYPE_VIDEO_CAPTURE`` instead of
-``V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE`` and use
-``V4L2_BUF_TYPE_VIDEO_OUTPUT`` instead of
-``V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE``.
-
This ioctl must be implemented for video capture or output devices that
support cropping and/or scaling and/or have non-square pixels, and for
overlay devices.
-
.. c:type:: v4l2_cropcap
.. tabularcolumns:: |p{4.4cm}|p{4.4cm}|p{8.7cm}|
@@ -62,9 +55,9 @@ overlay devices.
* - __u32
- ``type``
- Type of the data stream, set by the application. Only these types
- are valid here: ``V4L2_BUF_TYPE_VIDEO_CAPTURE``,
- ``V4L2_BUF_TYPE_VIDEO_OUTPUT`` and
- ``V4L2_BUF_TYPE_VIDEO_OVERLAY``. See :c:type:`v4l2_buf_type`.
+ are valid here: ``V4L2_BUF_TYPE_VIDEO_CAPTURE``, ``V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE``,
+ ``V4L2_BUF_TYPE_VIDEO_OUTPUT``, ``V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE`` and
+ ``V4L2_BUF_TYPE_VIDEO_OVERLAY``. See :c:type:`v4l2_buf_type` and the note above.
* - struct :ref:`v4l2_rect <v4l2-rect-crop>`
- ``bounds``
- Defines the window within capturing or output is possible, this
@@ -90,6 +83,16 @@ overlay devices.
``pixelaspect`` to 1/1. Other common values are 54/59 for PAL and
SECAM, 11/10 for NTSC sampled according to [:ref:`itu601`].
+.. note::
+ Unfortunately in the case of multiplanar buffer types
+ (``V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE`` and ``V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE``)
+ this API was messed up with regards to how the :c:type:`v4l2_cropcap` ``type`` field
+ should be filled in. Some drivers only accepted the ``_MPLANE`` buffer type while
+ other drivers only accepted a non-multiplanar buffer type (i.e. without the
+ ``_MPLANE`` at the end).
+
+ Starting with kernel 4.13 both variations are allowed.
+
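A defensive userspace sketch that copes with pre-4.13 drivers accepting only
one of the two variants: try the multiplanar type first and fall back to the
single-planar one (capture side shown; the output types work the same way):

.. code-block:: c

    #include <string.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    static int query_cropcap(int fd, struct v4l2_cropcap *cc)
    {
        memset(cc, 0, sizeof(*cc));
        cc->type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
        if (ioctl(fd, VIDIOC_CROPCAP, cc) == 0)
            return 0;

        /* Older drivers may only accept the single-planar type. */
        memset(cc, 0, sizeof(*cc));
        cc->type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        return ioctl(fd, VIDIOC_CROPCAP, cc);
    }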
.. _v4l2-rect-crop:
diff --git a/Documentation/media/uapi/v4l/vidioc-g-crop.rst b/Documentation/media/uapi/v4l/vidioc-g-crop.rst
index 56a3634..13771ee 100644
--- a/Documentation/media/uapi/v4l/vidioc-g-crop.rst
+++ b/Documentation/media/uapi/v4l/vidioc-g-crop.rst
@@ -45,12 +45,6 @@ and struct :c:type:`v4l2_rect` substructure named ``c`` of a
v4l2_crop structure and call the :ref:`VIDIOC_S_CROP <VIDIOC_G_CROP>` ioctl with a pointer
to this structure.
-Do not use the multiplanar buffer types. Use
-``V4L2_BUF_TYPE_VIDEO_CAPTURE`` instead of
-``V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE`` and use
-``V4L2_BUF_TYPE_VIDEO_OUTPUT`` instead of
-``V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE``.
-
The driver first adjusts the requested dimensions against hardware
limits, i. e. the bounds given by the capture/output window, and it
rounds to the closest possible values of horizontal and vertical offset,
@@ -87,14 +81,24 @@ When cropping is not supported then no parameters are changed and
* - __u32
- ``type``
- Type of the data stream, set by the application. Only these types
- are valid here: ``V4L2_BUF_TYPE_VIDEO_CAPTURE``,
- ``V4L2_BUF_TYPE_VIDEO_OUTPUT`` and
- ``V4L2_BUF_TYPE_VIDEO_OVERLAY``. See :c:type:`v4l2_buf_type`.
+ are valid here: ``V4L2_BUF_TYPE_VIDEO_CAPTURE``, ``V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE``,
+ ``V4L2_BUF_TYPE_VIDEO_OUTPUT``, ``V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE`` and
+ ``V4L2_BUF_TYPE_VIDEO_OVERLAY``. See :c:type:`v4l2_buf_type` and the note above.
* - struct :c:type:`v4l2_rect`
- ``c``
- Cropping rectangle. The same co-ordinate system as for struct
:c:type:`v4l2_cropcap` ``bounds`` is used.
+.. note::
+ Unfortunately in the case of multiplanar buffer types
+ (``V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE`` and ``V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE``)
+ this API was messed up with regards to how the :c:type:`v4l2_crop` ``type`` field
+ should be filled in. Some drivers only accepted the ``_MPLANE`` buffer type while
+ other drivers only accepted a non-multiplanar buffer type (i.e. without the
+ ``_MPLANE`` at the end).
+
+ Starting with kernel 4.13 both variations are allowed.
+
Return Value
============
diff --git a/Documentation/media/uapi/v4l/vidioc-g-selection.rst b/Documentation/media/uapi/v4l/vidioc-g-selection.rst
index b80d85c..c1ee864 100644
--- a/Documentation/media/uapi/v4l/vidioc-g-selection.rst
+++ b/Documentation/media/uapi/v4l/vidioc-g-selection.rst
@@ -42,11 +42,7 @@ The ioctls are used to query and configure selection rectangles.
To query the cropping (composing) rectangle set struct
:c:type:`v4l2_selection` ``type`` field to the
-respective buffer type. Do not use the multiplanar buffer types. Use
-``V4L2_BUF_TYPE_VIDEO_CAPTURE`` instead of
-``V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE`` and use
-``V4L2_BUF_TYPE_VIDEO_OUTPUT`` instead of
-``V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE``. The next step is setting the
+respective buffer type. The next step is setting the
value of struct :c:type:`v4l2_selection` ``target``
field to ``V4L2_SEL_TGT_CROP`` (``V4L2_SEL_TGT_COMPOSE``). Please refer
to table :ref:`v4l2-selections-common` or :ref:`selection-api` for
@@ -64,11 +60,7 @@ pixels.
To change the cropping (composing) rectangle set the struct
:c:type:`v4l2_selection` ``type`` field to the
-respective buffer type. Do not use multiplanar buffers. Use
-``V4L2_BUF_TYPE_VIDEO_CAPTURE`` instead of
-``V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE``. Use
-``V4L2_BUF_TYPE_VIDEO_OUTPUT`` instead of
-``V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE``. The next step is setting the
+respective buffer type. The next step is setting the
value of struct :c:type:`v4l2_selection` ``target`` to
``V4L2_SEL_TGT_CROP`` (``V4L2_SEL_TGT_COMPOSE``). Please refer to table
:ref:`v4l2-selections-common` or :ref:`selection-api` for additional
@@ -169,6 +161,16 @@ Selection targets and flags are documented in
- Reserved fields for future use. Drivers and applications must zero
this array.
+.. note::
+ Unfortunately in the case of multiplanar buffer types
+ (``V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE`` and ``V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE``)
+ this API was messed up with regards to how the :c:type:`v4l2_selection` ``type`` field
+ should be filled in. Some drivers only accepted the ``_MPLANE`` buffer type while
+ other drivers only accepted a non-multiplanar buffer type (i.e. without the
+ ``_MPLANE`` at the end).
+
+ Starting with kernel 4.13 both variations are allowed.
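A short sketch of setting a crop rectangle through the selection API, using
the single-planar capture buffer type; the 640x480 rectangle is only an
illustrative value and may be adjusted by the driver:

.. code-block:: c

    #include <string.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    static int set_crop_640x480(int fd)
    {
        struct v4l2_selection sel;

        memset(&sel, 0, sizeof(sel));
        sel.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        sel.target = V4L2_SEL_TGT_CROP;
        sel.r.left = 0;
        sel.r.top = 0;
        sel.r.width = 640;
        sel.r.height = 480;

        /* The driver may adjust sel.r to the closest supported rectangle. */
        return ioctl(fd, VIDIOC_S_SELECTION, &sel);
    }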
+
Return Value
============
diff --git a/Documentation/media/v4l-drivers/imx.rst b/Documentation/media/v4l-drivers/imx.rst
new file mode 100644
index 0000000..e0ee0f1
--- /dev/null
+++ b/Documentation/media/v4l-drivers/imx.rst
@@ -0,0 +1,614 @@
+i.MX Video Capture Driver
+=========================
+
+Introduction
+------------
+
+The Freescale i.MX5/6 contains an Image Processing Unit (IPU), which
+handles the flow of image frames to and from capture devices and
+display devices.
+
+For image capture, the IPU contains the following internal subunits:
+
+- Image DMA Controller (IDMAC)
+- Camera Serial Interface (CSI)
+- Image Converter (IC)
+- Sensor Multi-FIFO Controller (SMFC)
+- Image Rotator (IRT)
+- Video De-Interlacing or Combining Block (VDIC)
+
+The IDMAC is the DMA controller for transfer of image frames to and from
+memory. Various dedicated DMA channels exist for both video capture and
+display paths. During transfer, the IDMAC is also capable of vertical
+image flip, 8x8 block transfer (see IRT description), pixel component
+re-ordering (for example UYVY to YUYV) within the same colorspace, and
+even packed <--> planar conversion. It can also perform a simple
+de-interlacing by interleaving even and odd lines during transfer
+(without motion compensation which requires the VDIC).
+
+The CSI is the backend capture unit that interfaces directly with
+camera sensors over Parallel, BT.656/1120, and MIPI CSI-2 busses.
+
+The IC handles color-space conversion, resizing (downscaling and
+upscaling), horizontal flip, and 90/270 degree rotation operations.
+
+There are three independent "tasks" within the IC that can carry out
+conversions concurrently: pre-process encoding, pre-process viewfinder,
+and post-processing. Within each task, conversions are split into three
+sections: downsizing section, main section (upsizing, flip, colorspace
+conversion, and graphics plane combining), and rotation section.
+
+The IPU time-shares the IC task operations. The time-slice granularity
+is one burst of eight pixels in the downsizing section, one image line
+in the main processing section, one image frame in the rotation section.
+
+The SMFC is composed of four independent FIFOs that each can transfer
+captured frames from sensors directly to memory concurrently via four
+IDMAC channels.
+
+The IRT carries out 90 and 270 degree image rotation operations. The
+rotation operation is carried out on 8x8 pixel blocks at a time. This
+operation is supported by the IDMAC which handles the 8x8 block transfer
+along with block reordering, in coordination with vertical flip.
+
+The VDIC handles the conversion of interlaced video to progressive, with
+support for different motion compensation modes (low, medium, and high
+motion). The deinterlaced output frames from the VDIC can be sent to the
+IC pre-process viewfinder task for further conversions. The VDIC also
+contains a Combiner that combines two image planes, with alpha blending
+and color keying.
+
+In addition to the IPU internal subunits, there are also two units
+outside the IPU that are also involved in video capture on i.MX:
+
+- MIPI CSI-2 Receiver for camera sensors with the MIPI CSI-2 bus
+ interface. This is a Synopsys DesignWare core.
+- Two video multiplexers for selecting among multiple sensor inputs
+ to send to a CSI.
+
+For more info, refer to the latest versions of the i.MX5/6 reference
+manuals [#f1]_ and [#f2]_.
+
+
+Features
+--------
+
+Some of the features of this driver include:
+
+- Many different pipelines can be configured via media controller API,
+ that correspond to the hardware video capture pipelines supported in
+ the i.MX.
+
+- Supports parallel, BT.656, and MIPI CSI-2 interfaces.
+
+- Concurrent independent streams, by configuring pipelines to multiple
+ video capture interfaces using independent entities.
+
+- Scaling, color-space conversion, horizontal and vertical flip, and
+ image rotation via IC task subdevs.
+
+- Many pixel formats supported (RGB, packed and planar YUV, partial
+ planar YUV).
+
+- The VDIC subdev supports motion compensated de-interlacing, with three
+ motion compensation modes: low, medium, and high motion. Pipelines are
+ defined that allow sending frames to the VDIC subdev directly from the
+ CSI. Sending frames to the VDIC from memory buffers via an
+ output/mem2mem device is planned for the future.
+
+- Includes a Frame Interval Monitor (FIM) that can correct vertical sync
+ problems with the ADV718x video decoders.
+
+
+Entities
+--------
+
+imx6-mipi-csi2
+--------------
+
+This is the MIPI CSI-2 receiver entity. It has one sink pad to receive
+the MIPI CSI-2 stream (usually from a MIPI CSI-2 camera sensor). It has
+four source pads, corresponding to the four MIPI CSI-2 demuxed virtual
+channel outputs. Multiple source pads can be enabled to independently
+stream from multiple virtual channels.
+
+This entity actually consists of two sub-blocks. One is the MIPI CSI-2
+core. This is a Synopsys DesignWare MIPI CSI-2 core. The other sub-block
+is a "CSI-2 to IPU gasket". The gasket acts as a demultiplexer of the
+four virtual channels streams, providing four separate parallel buses
+containing each virtual channel that are routed to CSIs or video
+multiplexers as described below.
+
+On i.MX6 solo/dual-lite, all four virtual channel buses are routed to
+two video multiplexers. Both CSI0 and CSI1 can receive any virtual
+channel, as selected by the video multiplexers.
+
+On i.MX6 Quad, virtual channel 0 is routed to IPU1-CSI0 (after being
+selected by a video mux), virtual channels 1 and 2 are hard-wired to IPU1-CSI1
+and IPU2-CSI0, respectively, and virtual channel 3 is routed to
+IPU2-CSI1 (again selected by a video mux).
+
+ipuX_csiY_mux
+-------------
+
+These are the video multiplexers. They have two or more sink pads to
+select from either camera sensors with a parallel interface, or from
+MIPI CSI-2 virtual channels from imx6-mipi-csi2 entity. They have a
+single source pad that routes to a CSI (ipuX_csiY entities).
+
+On i.MX6 solo/dual-lite, there are two video mux entities. One sits
+in front of IPU1-CSI0 to select between a parallel sensor and any of
+the four MIPI CSI-2 virtual channels (a total of five sink pads). The
+other mux sits in front of IPU1-CSI1, and again has five sink pads to
+select between a parallel sensor and any of the four MIPI CSI-2 virtual
+channels.
+
+On i.MX6 Quad, there are two video mux entities. One sits in front of
+IPU1-CSI0 to select between a parallel sensor and MIPI CSI-2 virtual
+channel 0 (two sink pads). The other mux sits in front of IPU2-CSI1 to
+select between a parallel sensor and MIPI CSI-2 virtual channel 3 (two
+sink pads).
+
+ipuX_csiY
+---------
+
+These are the CSI entities. They have a single sink pad receiving from
+either a video mux or from a MIPI CSI-2 virtual channel as described
+above.
+
+This entity has two source pads. The first source pad can link directly
+to the ipuX_vdic entity or the ipuX_ic_prp entity, using hardware links
+that require no IDMAC memory buffer transfer.
+
+When the direct source pad is routed to the ipuX_ic_prp entity, frames
+from the CSI can be processed by one or both of the IC pre-processing
+tasks.
+
+When the direct source pad is routed to the ipuX_vdic entity, the VDIC
+will carry out motion-compensated de-interlace using "high motion" mode
+(see description of ipuX_vdic entity).
+
+The second source pad sends video frames directly to memory buffers
+via the SMFC and an IDMAC channel, bypassing IC pre-processing. This
+source pad is routed to a capture device node, with a node name of the
+format "ipuX_csiY capture".
+
+Note that since the IDMAC source pad makes use of an IDMAC channel, it
+can do pixel reordering within the same colorspace. For example, the
+sink pad can take UYVY2X8, but the IDMAC source pad can output YUYV2X8.
+If the sink pad is receiving YUV, the output at the capture device can
+also be converted to a planar YUV format such as YUV420.
+
+It will also perform simple de-interlace without motion compensation,
+which is activated if the sink pad's field type is an interlaced type,
+and the IDMAC source pad field type is set to none.
+
+This subdev can generate the following event when enabling the second
+IDMAC source pad:
+
+- V4L2_EVENT_IMX_FRAME_INTERVAL_ERROR
+
+The user application can subscribe to this event from the ipuX_csiY
+subdev node. This event is generated by the Frame Interval Monitor
+(see below for more on the FIM).
+
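A short sketch of subscribing to this event from userspace. The event macro is
provided by the driver header listed in the file list at the end of this
document (include/linux/imx-media.h); the include path shown is an assumption,
and ``fd`` is an open ipuX_csiY v4l-subdev node:

.. code-block:: c

    #include <sys/ioctl.h>
    #include <linux/videodev2.h>
    #include <linux/imx-media.h>    /* assumed location of the event macro */

    static int subscribe_fim_event(int fd)
    {
        struct v4l2_event_subscription sub = {
            .type = V4L2_EVENT_IMX_FRAME_INTERVAL_ERROR,
        };

        /* After subscribing, poll() for POLLPRI and fetch the event with
         * VIDIOC_DQEVENT; restart streaming when it is received. */
        return ioctl(fd, VIDIOC_SUBSCRIBE_EVENT, &sub);
    }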
+Cropping in ipuX_csiY
+---------------------
+
+The CSI supports cropping the incoming raw sensor frames. This is
+implemented in the ipuX_csiY entities at the sink pad, using the
+crop selection subdev API.
+
+The CSI also supports fixed divide-by-two downscaling independently in
+width and height. This is implemented in the ipuX_csiY entities at
+the sink pad, using the compose selection subdev API.
+
+The output rectangle at the ipuX_csiY source pad is the same as
+the compose rectangle at the sink pad. So the source pad rectangle
+cannot be negotiated, it must be set using the compose selection
+API at sink pad (if /2 downscale is desired, otherwise source pad
+rectangle is equal to incoming rectangle).
+
+To give an example of crop and /2 downscale, this will crop a
+1280x960 input frame to 640x480, and then /2 downscale in both
+dimensions to 320x240 (assumes ipu1_csi0 is linked to ipu1_csi0_mux):
+
+media-ctl -V "'ipu1_csi0_mux':2[fmt:UYVY2X8/1280x960]"
+media-ctl -V "'ipu1_csi0':0[crop:(0,0)/640x480]"
+media-ctl -V "'ipu1_csi0':0[compose:(0,0)/320x240]"
+
+Frame Skipping in ipuX_csiY
+---------------------------
+
+The CSI supports frame rate decimation, via frame skipping. Frame
+rate decimation is specified by setting the frame intervals at
+sink and source pads. The ipuX_csiY entity then applies the best
+frame skip setting to the CSI to achieve the desired frame rate
+at the source pad.
+
+The following example reduces an assumed incoming 60 Hz frame
+rate by half at the IDMAC output source pad:
+
+media-ctl -V "'ipu1_csi0':0[fmt:UYVY2X8/640x480@1/60]"
+media-ctl -V "'ipu1_csi0':2[fmt:UYVY2X8/640x480@1/30]"
+
+Frame Interval Monitor in ipuX_csiY
+-----------------------------------
+
+The adv718x decoders can occasionally send corrupt fields during
+NTSC/PAL signal re-sync (too little or too many video lines). When
+this happens, the IPU triggers a mechanism to re-establish vertical
+sync by adding 1 dummy line every frame, which causes a rolling effect
+from image to image, and can last a long time before a stable image is
+recovered. Or sometimes the mechanism doesn't work at all, causing a
+permanent split image (one frame contains lines from two consecutive
+captured images).
+
+From experiment it was found that during image rolling, the frame
+intervals (elapsed time between two EOF's) drop below the nominal
+value for the current standard, by about one frame time (60 usec),
+and remain at that value until rolling stops.
+
+While the reason for this observation isn't known (the IPU dummy
+line mechanism should show an increase in the intervals by 1 line
+time every frame, not a fixed value), we can use it to detect the
+corrupt fields using a frame interval monitor. If the FIM detects a
+bad frame interval, the ipuX_csiY subdev will send the event
+V4L2_EVENT_IMX_FRAME_INTERVAL_ERROR. Userland can register with
+the FIM event notification on the ipuX_csiY subdev device node.
+Userland can issue a streaming restart when this event is received
+to correct the rolling/split image.
+
+The ipuX_csiY subdev includes custom controls to tweak some dials for
+FIM. If one of these controls is changed during streaming, the FIM will
+be reset and will continue at the new settings.
+
+- V4L2_CID_IMX_FIM_ENABLE
+
+Enable/disable the FIM.
+
+- V4L2_CID_IMX_FIM_NUM
+
+How many frame interval measurements to average before comparing against
+the nominal frame interval reported by the sensor. This can reduce noise
+caused by interrupt latency.
+
+- V4L2_CID_IMX_FIM_TOLERANCE_MIN
+
+If the averaged intervals fall outside nominal by this amount, in
+microseconds, the V4L2_EVENT_IMX_FRAME_INTERVAL_ERROR event is sent.
+
+- V4L2_CID_IMX_FIM_TOLERANCE_MAX
+
+If any intervals are higher than this value, those samples are
+discarded and do not enter into the average. This can be used to
+discard really high interval errors that might be due to interrupt
+latency from high system load.
+
+- V4L2_CID_IMX_FIM_NUM_SKIP
+
+How many frames to skip after a FIM reset or stream restart before
+FIM begins to average intervals.
+
+- V4L2_CID_IMX_FIM_ICAP_CHANNEL
+- V4L2_CID_IMX_FIM_ICAP_EDGE
+
+These controls will configure an input capture channel as the method
+for measuring frame intervals. This is superior to the default method
+of measuring frame intervals via EOF interrupt, since it is not subject
+to uncertainty errors introduced by interrupt latency.
+
+Input capture requires hardware support. A VSYNC signal must be routed
+to one of the i.MX6 input capture channel pads.
+
+V4L2_CID_IMX_FIM_ICAP_CHANNEL configures which i.MX6 input capture
+channel to use. This must be 0 or 1.
+
+V4L2_CID_IMX_FIM_ICAP_EDGE configures which signal edge will trigger
+input capture events. By default the input capture method is disabled
+with a value of IRQ_TYPE_NONE. Set this control to IRQ_TYPE_EDGE_RISING,
+IRQ_TYPE_EDGE_FALLING, or IRQ_TYPE_EDGE_BOTH to enable input capture,
+triggered on the given signal edge(s).
+
+When input capture is disabled, frame intervals will be measured via
+EOF interrupt.
+
+
+ipuX_vdic
+---------
+
+The VDIC carries out motion compensated de-interlacing, with three
+motion compensation modes: low, medium, and high motion. The mode is
+specified with the menu control V4L2_CID_DEINTERLACING_MODE. It has
+two sink pads and a single source pad.
+
+The direct sink pad receives from an ipuX_csiY direct pad. With this
+link the VDIC can only operate in high motion mode.
+
+When the IDMAC sink pad is activated, it receives from an output
+or mem2mem device node. With this pipeline, it can also operate
+in low and medium modes, because these modes require receiving
+frames from memory buffers. Note that an output or mem2mem device
+is not implemented yet, so this sink pad currently has no links.
+
+The source pad routes to the IC pre-processing entity ipuX_ic_prp.
+
+ipuX_ic_prp
+-----------
+
+This is the IC pre-processing entity. It acts as a router, routing
+data from its sink pad to one or both of its source pads.
+
+It has a single sink pad. The sink pad can receive from the ipuX_csiY
+direct pad, or from ipuX_vdic.
+
+This entity has two source pads. One source pad routes to the
+pre-process encode task entity (ipuX_ic_prpenc), the other to the
+pre-process viewfinder task entity (ipuX_ic_prpvf). Both source pads
+can be activated at the same time if the sink pad is receiving from
+ipuX_csiY. Only the source pad to the pre-process viewfinder task entity
+can be activated if the sink pad is receiving from ipuX_vdic (frames
+from the VDIC can only be processed by the pre-process viewfinder task).
+
+ipuX_ic_prpenc
+--------------
+
+This is the IC pre-processing encode entity. It has a single sink
+pad from ipuX_ic_prp, and a single source pad. The source pad is
+routed to a capture device node, with a node name of the format
+"ipuX_ic_prpenc capture".
+
+This entity performs the IC pre-process encode task operations:
+color-space conversion, resizing (downscaling and upscaling),
+horizontal and vertical flip, and 90/270 degree rotation. Flip
+and rotation are provided via standard V4L2 controls.
+
+Like the ipuX_csiY IDMAC source, it can also perform simple de-interlace
+without motion compensation, and pixel reordering.
+
+ipuX_ic_prpvf
+-------------
+
+This is the IC pre-processing viewfinder entity. It has a single sink
+pad from ipuX_ic_prp, and a single source pad. The source pad is routed
+to a capture device node, with a node name of the format
+"ipuX_ic_prpvf capture".
+
+It is identical in operation to ipuX_ic_prpenc, with the same resizing
+and CSC operations and flip/rotation controls. It will receive and
+process de-interlaced frames from the ipuX_vdic if ipuX_ic_prp is
+receiving from ipuX_vdic.
+
+Like the ipuX_csiY IDMAC source, it can perform simple de-interlace
+without motion compensation. However, note that if the ipuX_vdic is
+included in the pipeline (ipuX_ic_prp is receiving from ipuX_vdic),
+it's not possible to use simple de-interlace in ipuX_ic_prpvf, since
+the ipuX_vdic has already carried out de-interlacing (with motion
+compensation) and therefore the field type output from ipuX_ic_prp can
+only be none.
+
+Capture Pipelines
+-----------------
+
+The following describe the various use-cases supported by the pipelines.
+
+The links shown do not include the backend sensor, video mux, or mipi
+csi-2 receiver links. This depends on the type of sensor interface
+(parallel or mipi csi-2). So these pipelines begin with:
+
+sensor -> ipuX_csiY_mux -> ...
+
+for parallel sensors, or:
+
+sensor -> imx6-mipi-csi2 -> (ipuX_csiY_mux) -> ...
+
+for mipi csi-2 sensors. The imx6-mipi-csi2 receiver may need to route
+to the video mux (ipuX_csiY_mux) before sending to the CSI, depending
+on the mipi csi-2 virtual channel, hence ipuX_csiY_mux is shown in
+parenthesis.
+
+Unprocessed Video Capture:
+--------------------------
+
+Send frames directly from sensor to camera device interface node, with
+no conversions, via ipuX_csiY IDMAC source pad:
+
+-> ipuX_csiY:2 -> ipuX_csiY capture
+
+IC Direct Conversions:
+----------------------
+
+This pipeline uses the preprocess encode entity to route frames directly
+from the CSI to the IC, to carry out scaling up to 1024x1024 resolution,
+CSC, flipping, and image rotation:
+
+-> ipuX_csiY:1 -> 0:ipuX_ic_prp:1 -> 0:ipuX_ic_prpenc:1 ->
+ ipuX_ic_prpenc capture
+
+Motion Compensated De-interlace:
+--------------------------------
+
+This pipeline routes frames from the CSI direct pad to the VDIC entity to
+support motion-compensated de-interlacing (high motion mode only),
+scaling up to 1024x1024, CSC, flip, and rotation:
+
+-> ipuX_csiY:1 -> 0:ipuX_vdic:2 -> 0:ipuX_ic_prp:2 ->
+ 0:ipuX_ic_prpvf:1 -> ipuX_ic_prpvf capture
+
+
+Usage Notes
+-----------
+
+To aid in configuration and for backward compatibility with V4L2
+applications that access controls only from video device nodes, the
+capture device interfaces inherit controls from the active entities
+in the current pipeline, so controls can be accessed either directly
+from the subdev or from the active capture device interface. For
+example, the FIM controls are available either from the ipuX_csiY
+subdevs or from the active capture device.
+
+The following are specific usage notes for the Sabre* reference
+boards:
+
+
+SabreLite with OV5642 and OV5640
+--------------------------------
+
+This platform requires the OmniVision OV5642 module with a parallel
+camera interface, and the OV5640 module with a MIPI CSI-2
+interface. Both modules are available from Boundary Devices:
+
+https://boundarydevices.com/product/nit6x_5mp
+https://boundarydevices.com/product/nit6x_5mp_mipi
+
+Note that if only one camera module is available, the other sensor
+node can be disabled in the device tree.
+
+The OV5642 module is connected to the parallel bus input on the i.MX
+internal video mux to IPU1 CSI0. Its i2c interface connects to i2c bus 2.
+
+The MIPI CSI-2 OV5640 module is connected to the i.MX internal MIPI CSI-2
+receiver, and the four virtual channel outputs from the receiver are
+routed as follows: vc0 to the IPU1 CSI0 mux, vc1 directly to IPU1 CSI1,
+vc2 directly to IPU2 CSI0, and vc3 to the IPU2 CSI1 mux. The OV5640 is
+also connected to i2c bus 2 on the SabreLite, therefore the OV5642 and
+OV5640 must not share the same i2c slave address.
+
+The following basic example configures unprocessed video capture
+pipelines for both sensors. The OV5642 is routed to ipu1_csi0, and
+the OV5640, transmitting on MIPI CSI-2 virtual channel 1 (which is
+imx6-mipi-csi2 pad 2), is routed to ipu1_csi1. Both sensors are
+configured to output 640x480, and the OV5642 outputs YUYV2X8, the
+OV5640 UYVY2X8:
+
+.. code-block:: none
+
+ # Setup links for OV5642
+ media-ctl -l "'ov5642 1-0042':0 -> 'ipu1_csi0_mux':1[1]"
+ media-ctl -l "'ipu1_csi0_mux':2 -> 'ipu1_csi0':0[1]"
+ media-ctl -l "'ipu1_csi0':2 -> 'ipu1_csi0 capture':0[1]"
+ # Setup links for OV5640
+ media-ctl -l "'ov5640 1-0040':0 -> 'imx6-mipi-csi2':0[1]"
+ media-ctl -l "'imx6-mipi-csi2':2 -> 'ipu1_csi1':0[1]"
+ media-ctl -l "'ipu1_csi1':2 -> 'ipu1_csi1 capture':0[1]"
+ # Configure pads for OV5642 pipeline
+ media-ctl -V "'ov5642 1-0042':0 [fmt:YUYV2X8/640x480 field:none]"
+ media-ctl -V "'ipu1_csi0_mux':2 [fmt:YUYV2X8/640x480 field:none]"
+ media-ctl -V "'ipu1_csi0':2 [fmt:AYUV32/640x480 field:none]"
+ # Configure pads for OV5640 pipeline
+ media-ctl -V "'ov5640 1-0040':0 [fmt:UYVY2X8/640x480 field:none]"
+ media-ctl -V "'imx6-mipi-csi2':2 [fmt:UYVY2X8/640x480 field:none]"
+ media-ctl -V "'ipu1_csi1':2 [fmt:AYUV32/640x480 field:none]"
+
+Streaming can then begin independently on the capture device nodes
+"ipu1_csi0 capture" and "ipu1_csi1 capture". The v4l2-ctl tool can
+be used to select any supported YUV pixelformat on the capture device
+nodes, including planar.
+
+SabreAuto with ADV7180 decoder
+------------------------------
+
+On the SabreAuto, an on-board ADV7180 SD decoder is connected to the
+parallel bus input on the internal video mux to IPU1 CSI0.
+
+The following example configures a pipeline to capture from the ADV7180
+video decoder, assuming NTSC 720x480 input signals, with Motion
+Compensated de-interlacing. Pad field types assume the adv7180 outputs
+"interlaced". $outputfmt can be any format supported by the ipu1_ic_prpvf
+entity at its output pad:
+
+.. code-block:: none
+
+ # Setup links
+ media-ctl -l "'adv7180 3-0021':0 -> 'ipu1_csi0_mux':1[1]"
+ media-ctl -l "'ipu1_csi0_mux':2 -> 'ipu1_csi0':0[1]"
+ media-ctl -l "'ipu1_csi0':1 -> 'ipu1_vdic':0[1]"
+ media-ctl -l "'ipu1_vdic':2 -> 'ipu1_ic_prp':0[1]"
+ media-ctl -l "'ipu1_ic_prp':2 -> 'ipu1_ic_prpvf':0[1]"
+ media-ctl -l "'ipu1_ic_prpvf':1 -> 'ipu1_ic_prpvf capture':0[1]"
+ # Configure pads
+ media-ctl -V "'adv7180 3-0021':0 [fmt:UYVY2X8/720x480]"
+ media-ctl -V "'ipu1_csi0_mux':2 [fmt:UYVY2X8/720x480 field:interlaced]"
+ media-ctl -V "'ipu1_csi0':1 [fmt:AYUV32/720x480 field:interlaced]"
+ media-ctl -V "'ipu1_vdic':2 [fmt:AYUV32/720x480 field:none]"
+ media-ctl -V "'ipu1_ic_prp':2 [fmt:AYUV32/720x480 field:none]"
+ media-ctl -V "'ipu1_ic_prpvf':1 [fmt:$outputfmt field:none]"
+
+Streaming can then begin on the capture device node at
+"ipu1_ic_prpvf capture". The v4l2-ctl tool can be used to select any
+supported YUV or RGB pixelformat on the capture device node.
+
+This platform accepts Composite Video analog inputs to the ADV7180 on
+Ain1 (connector J42).
+
+SabreSD with MIPI CSI-2 OV5640
+------------------------------
+
+Similarly to SabreLite, the SabreSD supports a parallel interface
+OV5642 module on IPU1 CSI0, and a MIPI CSI-2 OV5640 module. The OV5642
+connects to i2c bus 1 and the OV5640 to i2c bus 2.
+
+The device tree for SabreSD includes OF graphs for both the parallel
+OV5642 and the MIPI CSI-2 OV5640, but as of this writing only the MIPI
+CSI-2 OV5640 has been tested, so the OV5642 node is currently disabled.
+The OV5640 module connects to MIPI connector J5 (the compatible module
+part number and URL are not known).
+
+The following example configures a direct conversion pipeline to capture
+from the OV5640, transmitting on MIPI CSI-2 virtual channel 1. $sensorfmt
+can be any format supported by the OV5640. $sensordim is the frame
+dimension part of $sensorfmt (minus the mbus pixel code). $outputfmt can
+be any format supported by the ipu1_ic_prpenc entity at its output pad:
+
+.. code-block:: none
+
+ # Setup links
+ media-ctl -l "'ov5640 1-003c':0 -> 'imx6-mipi-csi2':0[1]"
+ media-ctl -l "'imx6-mipi-csi2':2 -> 'ipu1_csi1':0[1]"
+ media-ctl -l "'ipu1_csi1':1 -> 'ipu1_ic_prp':0[1]"
+ media-ctl -l "'ipu1_ic_prp':1 -> 'ipu1_ic_prpenc':0[1]"
+ media-ctl -l "'ipu1_ic_prpenc':1 -> 'ipu1_ic_prpenc capture':0[1]"
+ # Configure pads
+ media-ctl -V "'ov5640 1-003c':0 [fmt:$sensorfmt field:none]"
+ media-ctl -V "'imx6-mipi-csi2':2 [fmt:$sensorfmt field:none]"
+ media-ctl -V "'ipu1_csi1':1 [fmt:AYUV32/$sensordim field:none]"
+ media-ctl -V "'ipu1_ic_prp':1 [fmt:AYUV32/$sensordim field:none]"
+ media-ctl -V "'ipu1_ic_prpenc':1 [fmt:$outputfmt field:none]"
+
+Streaming can then begin on "ipu1_ic_prpenc capture" node. The v4l2-ctl
+tool can be used to select any supported YUV or RGB pixelformat on the
+capture device node.
+
+
+Known Issues
+------------
+
+1. When using 90 or 270 degree rotation control at capture resolutions
+ near the IC resizer limit of 1024x1024, and combined with planar
+ pixel formats (YUV420, YUV422p), frame capture will often fail with
+ no end-of-frame interrupts from the IDMAC channel. To work around
+ this, use lower resolution and/or packed formats (YUYV, RGB3, etc.)
+ when 90 or 270 rotations are needed.
+
+
+File list
+---------
+
+drivers/staging/media/imx/
+include/media/imx.h
+include/linux/imx-media.h
+
+References
+----------
+
+.. [#f1] http://www.nxp.com/assets/documents/data/en/reference-manuals/IMX6DQRM.pdf
+.. [#f2] http://www.nxp.com/assets/documents/data/en/reference-manuals/IMX6SDLRM.pdf
+
+
+Authors
+-------
+Steve Longerbeam <steve_longerbeam@mentor.com>
+Philipp Zabel <kernel@pengutronix.de>
+Russell King <linux@armlinux.org.uk>
+
+Copyright (C) 2012-2017 Mentor Graphics Inc.
diff --git a/Documentation/media/v4l-drivers/index.rst b/Documentation/media/v4l-drivers/index.rst
index 90fe22a..2e24d68 100644
--- a/Documentation/media/v4l-drivers/index.rst
+++ b/Documentation/media/v4l-drivers/index.rst
@@ -42,6 +42,7 @@ For more details see the file COPYING in the source distribution of Linux.
davinci-vpbe
fimc
ivtv
+ max2175
meye
omap3isp
omap4_camera
diff --git a/Documentation/media/v4l-drivers/max2175.rst b/Documentation/media/v4l-drivers/max2175.rst
new file mode 100644
index 0000000..04478c2
--- /dev/null
+++ b/Documentation/media/v4l-drivers/max2175.rst
@@ -0,0 +1,62 @@
+Maxim Integrated MAX2175 RF to bits tuner driver
+================================================
+
+The MAX2175 driver implements the following driver-specific controls:
+
+``V4L2_CID_MAX2175_I2S_ENABLE``
+-------------------------------
+ Enable/Disable I2S output of the tuner. This is a private control
+ that can be accessed only using the subdev interface.
+ Refer to Documentation/media/kapi/v4l2-controls for more details.
+
+.. flat-table::
+ :header-rows: 0
+ :stub-columns: 0
+ :widths: 1 4
+
+ * - ``(0)``
+ - I2S output is disabled.
+ * - ``(1)``
+ - I2S output is enabled.
+
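A hedged sketch of toggling this control from userspace through the subdev
node. The ``V4L2_CID_MAX2175_I2S_ENABLE`` macro is named above; the
``linux/max2175.h`` include path and the ``/dev/v4l-subdevN`` device node are
assumptions made for the example:

.. code-block:: c

    #include <sys/ioctl.h>
    #include <linux/videodev2.h>
    #include <linux/max2175.h>      /* assumed location of the control IDs */

    /* 'fd' is an open max2175 v4l-subdev node, e.g. /dev/v4l-subdev0. */
    static int max2175_i2s_enable(int fd, int enable)
    {
        struct v4l2_control ctrl = {
            .id = V4L2_CID_MAX2175_I2S_ENABLE,
            .value = enable ? 1 : 0,
        };

        return ioctl(fd, VIDIOC_S_CTRL, &ctrl);
    }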
+``V4L2_CID_MAX2175_HSLS``
+-------------------------
+ The high-side/low-side (HSLS) control of the tuner for a given band.
+
+.. flat-table::
+ :header-rows: 0
+ :stub-columns: 0
+ :widths: 1 4
+
+ * - ``(0)``
+ - The LO frequency position is below the desired frequency.
+ * - ``(1)``
+ - The LO frequency position is above the desired frequency.
+
+``V4L2_CID_MAX2175_RX_MODE (menu)``
+-----------------------------------
+ The Rx mode controls a number of preset parameters of the tuner like
+ sample clock (sck), sampling rate etc. These multiple settings are
+ provided under one single label called Rx mode in the datasheet. The
+ list below shows the supported modes with a brief description.
+
+.. flat-table::
+ :header-rows: 0
+ :stub-columns: 0
+ :widths: 1 4
+
+ * - ``"Europe modes"``
+ * - ``"FM 1.2" (0)``
+ - This configures FM band with a sample rate of 0.512 million
+ samples/sec with a 10.24 MHz sck.
+ * - ``"DAB 1.2" (1)``
+ - This configures VHF band with a sample rate of 2.048 million
+ samples/sec with a 32.768 MHz sck.
+
+ * - ``"North America modes"``
+ * - ``"FM 1.0" (0)``
+ - This configures FM band with a sample rate of 0.7441875 million
+ samples/sec with a 14.88375 MHz sck.
+ * - ``"DAB 1.2" (1)``
+ - This configures FM band with a sample rate of 0.372 million
+ samples/sec with a 7.441875 MHz sck.