Mirror of https://github.com/AetherDroid/android_kernel_samsung_on5xelte.git
(synced 2025-09-06 08:18:05 -04:00)

Commit f6dfaef42e - "Fixed MTP to work with TWRP"
50820 changed files with 20846062 additions and 0 deletions

Documentation/sound/alsa/soc/DAI.txt (new file, 56 lines)
==========================================================

ASoC currently supports the three main Digital Audio Interfaces (DAI) found on
SoC controllers and portable audio CODECs today, namely AC97, I2S and PCM.


AC97
====

AC97 is a five wire interface commonly found on many PC sound cards. It is
now also popular in many portable devices. This DAI has a reset line and time
multiplexes its data on its SDATA_OUT (playback) and SDATA_IN (capture) lines.
The bit clock (BCLK) is always driven by the CODEC (usually 12.288MHz) and the
frame clock (FRAME) (usually 48kHz) is always driven by the controller. Each
AC97 frame is 21uS long and is divided into 13 time slots.

The AC97 specification can be found at :-
http://www.intel.com/p/en_US/business/design


I2S
===

I2S is a common 4 wire DAI used in HiFi, STB and portable devices. The Tx and
Rx lines are used for audio transmission, whilst the bit clock (BCLK) and
left/right clock (LRC) synchronise the link. I2S is flexible in that either the
controller or CODEC can drive (master) the BCLK and LRC clock lines. Bit clock
usually varies depending on the sample rate and the master system clock
(SYSCLK). LRCLK is the same as the sample rate. A few devices support separate
ADC and DAC LRCLKs; this allows for simultaneous capture and playback at
different sample rates.

I2S has several different operating modes:-

 o I2S - MSB is transmitted on the falling edge of the first BCLK after LRC
   transition.

 o Left Justified - MSB is transmitted on transition of LRC.

 o Right Justified - MSB is transmitted sample size BCLKs before LRC
   transition.


PCM
===

PCM is another 4 wire interface, very similar to I2S, which can support a more
flexible protocol. It has bit clock (BCLK) and sync (SYNC) lines that are used
to synchronise the link whilst the Tx and Rx lines are used to transmit and
receive the audio data. Bit clock usually varies depending on sample rate
whilst sync runs at the sample rate. PCM also supports Time Division
Multiplexing (TDM) in that several devices can use the bus simultaneously (this
is sometimes referred to as network mode).

Common PCM operating modes:-

 o Mode A - MSB is transmitted on falling edge of first BCLK after FRAME/SYNC.

 o Mode B - MSB is transmitted on rising edge of FRAME/SYNC.
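
Whichever interface is used, the chosen format is normally programmed into both
ends of the link by the machine driver with snd_soc_dai_set_fmt(). The
following is only a minimal sketch, not taken from a real board file; the
particular format flags (I2S, normal clock polarity, codec as bit and frame
master) are an example choice:-

static int board_hw_params(struct snd_pcm_substream *substream,
                           struct snd_pcm_hw_params *params)
{
    struct snd_soc_pcm_runtime *rtd = substream->private_data;
    unsigned int fmt = SND_SOC_DAIFMT_I2S |     /* I2S mode */
                       SND_SOC_DAIFMT_NB_NF |   /* normal BCLK/LRC polarity */
                       SND_SOC_DAIFMT_CBM_CFM;  /* codec masters BCLK/LRC */
    int ret;

    /* configure both the codec and CPU ends of the link identically */
    ret = snd_soc_dai_set_fmt(rtd->codec_dai, fmt);
    if (ret < 0)
        return ret;

    return snd_soc_dai_set_fmt(rtd->cpu_dai, fmt);
}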

Documentation/sound/alsa/soc/DPCM.txt (new file, 380 lines)
============================================================

Dynamic PCM
===========

1. Description
==============

Dynamic PCM allows an ALSA PCM device to digitally route its PCM audio to
various digital endpoints during the PCM stream runtime. e.g. PCM0 can route
digital audio to I2S DAI0, I2S DAI1 or PDM DAI2. This is useful for on-SoC DSP
drivers that expose several ALSA PCMs and can route to multiple DAIs.

The DPCM runtime routing is determined by the ALSA mixer settings in the same
way as the analog signal is routed in an ASoC codec driver. DPCM uses a DAPM
graph representing the DSP internal audio paths and uses the mixer settings to
determine the path used by each ALSA PCM.

DPCM re-uses all the existing component codec, platform and DAI drivers without
any modifications.


Phone Audio System with SoC based DSP
-------------------------------------

Consider the following phone audio subsystem. This will be used in this
document for all examples :-

| Front End PCMs    |  SoC DSP  | Back End DAIs | Audio devices |

                    *************
PCM0 <------------> *           * <----DAI0-----> Codec Headset
                    *           *
PCM1 <------------> *           * <----DAI1-----> Codec Speakers
                    *   DSP     *
PCM2 <------------> *           * <----DAI2-----> MODEM
                    *           *
PCM3 <------------> *           * <----DAI3-----> BT
                    *           *
                    *           * <----DAI4-----> DMIC
                    *           *
                    *           * <----DAI5-----> FM
                    *************

This diagram shows a simple smart phone audio subsystem. It supports Bluetooth,
FM digital radio, Speakers, Headset Jack, digital microphones and cellular
modem. This sound card exposes 4 DSP front end (FE) ALSA PCM devices and
supports 6 back end (BE) DAIs. Each FE PCM can digitally route audio data to
any of the BE DAIs. The FE PCM devices can also route audio to more than 1 BE
DAI.


Example - DPCM Switching playback from DAI0 to DAI1
---------------------------------------------------

Audio is being played to the Headset. After a while the user removes the
headset and audio continues playing on the speakers.

Playback on PCM0 to Headset would look like :-

                    *************
PCM0 <============> *           * <====DAI0=====> Codec Headset
                    *           *
PCM1 <------------> *           * <----DAI1-----> Codec Speakers
                    *   DSP     *
PCM2 <------------> *           * <----DAI2-----> MODEM
                    *           *
PCM3 <------------> *           * <----DAI3-----> BT
                    *           *
                    *           * <----DAI4-----> DMIC
                    *           *
                    *           * <----DAI5-----> FM
                    *************

The headset is removed from the jack by the user so the speakers must now be
used :-

                    *************
PCM0 <============> *           * <----DAI0-----> Codec Headset
                    *           *
PCM1 <------------> *           * <====DAI1=====> Codec Speakers
                    *   DSP     *
PCM2 <------------> *           * <----DAI2-----> MODEM
                    *           *
PCM3 <------------> *           * <----DAI3-----> BT
                    *           *
                    *           * <----DAI4-----> DMIC
                    *           *
                    *           * <----DAI5-----> FM
                    *************

The audio driver processes this as follows :-

1) Machine driver receives Jack removal event.

2) Machine driver OR audio HAL disables the Headset path.

3) DPCM runs the PCM trigger(stop), hw_free(), shutdown() operations on DAI0
   for the headset since the path is now disabled.

4) Machine driver or audio HAL enables the speaker path.

5) DPCM runs the PCM ops for startup(), hw_params(), prepare() and
   trigger(start) for DAI1 Speakers since the path is enabled.

In this example, the machine driver or userspace audio HAL can alter the
routing and then DPCM will take care of managing the DAI PCM operations to
either bring the link up or down. Audio playback does not stop during this
transition.


DPCM machine driver
===================

The DPCM enabled ASoC machine driver is similar to normal machine drivers
except that we also have to :-

1) Define the FE and BE DAI links.

2) Define any FE/BE PCM operations.

3) Define widget graph connections.


1 FE and BE DAI links
---------------------

| Front End PCMs    |  SoC DSP  | Back End DAIs | Audio devices |

                    *************
PCM0 <------------> *           * <----DAI0-----> Codec Headset
                    *           *
PCM1 <------------> *           * <----DAI1-----> Codec Speakers
                    *   DSP     *
PCM2 <------------> *           * <----DAI2-----> MODEM
                    *           *
PCM3 <------------> *           * <----DAI3-----> BT
                    *           *
                    *           * <----DAI4-----> DMIC
                    *           *
                    *           * <----DAI5-----> FM
                    *************

For the example above we have to define 4 FE DAI links and 6 BE DAI links. The
FE DAI links are defined as follows :-

static struct snd_soc_dai_link machine_dais[] = {
    {
        .name = "PCM0 System",
        .stream_name = "System Playback",
        .cpu_dai_name = "System Pin",
        .platform_name = "dsp-audio",
        .codec_name = "snd-soc-dummy",
        .codec_dai_name = "snd-soc-dummy-dai",
        .dynamic = 1,
        .trigger = {SND_SOC_DPCM_TRIGGER_POST, SND_SOC_DPCM_TRIGGER_POST},
        .dpcm_playback = 1,
    },
    .....< other FE and BE DAI links here >
};

This FE DAI link is pretty similar to a regular DAI link except that we also
set the DAI link to a DPCM FE with "dynamic = 1". The supported FE stream
directions should also be set with the "dpcm_playback" and "dpcm_capture"
flags. There is also an option to specify the ordering of the trigger call for
each FE. This allows the ASoC core to trigger the DSP before or after the other
components (as some DSPs have strong requirements for the ordering of DAI/DSP
start and stop sequences).

The FE DAI above sets the codec and codec DAIs to dummy devices since the BE is
dynamic and will change depending on runtime config.

The BE DAIs are configured as follows :-

static struct snd_soc_dai_link machine_dais[] = {
    .....< FE DAI links here >
    {
        .name = "Codec Headset",
        .cpu_dai_name = "ssp-dai.0",
        .platform_name = "snd-soc-dummy",
        .no_pcm = 1,
        .codec_name = "rt5640.0-001c",
        .codec_dai_name = "rt5640-aif1",
        .ignore_suspend = 1,
        .ignore_pmdown_time = 1,
        .be_hw_params_fixup = hswult_ssp0_fixup,
        .ops = &haswell_ops,
        .dpcm_playback = 1,
        .dpcm_capture = 1,
    },
    .....< other BE DAI links here >
};

This BE DAI link connects DAI0 to the codec (in this case RT5640 AIF1). It sets
the "no_pcm" flag to mark it as a BE and sets flags for supported stream
directions using "dpcm_playback" and "dpcm_capture" above.

The BE also has flags set for ignoring suspend and PM down time. This allows
the BE to work in a hostless mode where the host CPU is not transferring data
like a BT phone call :-

                    *************
PCM0 <------------> *           * <----DAI0-----> Codec Headset
                    *           *
PCM1 <------------> *           * <----DAI1-----> Codec Speakers
                    *   DSP     *
PCM2 <------------> *           * <====DAI2=====> MODEM
                    *           *
PCM3 <------------> *           * <====DAI3=====> BT
                    *           *
                    *           * <----DAI4-----> DMIC
                    *           *
                    *           * <----DAI5-----> FM
                    *************

This allows the host CPU to sleep whilst the DSP, MODEM DAI and the BT DAI are
still in operation.

A BE DAI link can also set the codec to a dummy device if the codec is a device
that is managed externally.

Likewise a BE DAI can also set a dummy cpu DAI if the CPU DAI is managed by the
DSP firmware.


2 FE/BE PCM operations
----------------------

The BE above also exports some PCM operations and a "fixup" callback. The fixup
callback is used by the machine driver to (re)configure the DAI based upon the
FE hw params. i.e. the DSP may perform SRC or ASRC from the FE to BE.

e.g. the DSP converts all FE hw params to run at a fixed rate of 48k, 16bit,
stereo for DAI0. This means all FE hw_params have to be fixed in the machine
driver for DAI0 so that the DAI is running at the desired configuration
regardless of the FE configuration.

static int dai0_fixup(struct snd_soc_pcm_runtime *rtd,
                      struct snd_pcm_hw_params *params)
{
    struct snd_interval *rate = hw_param_interval(params,
            SNDRV_PCM_HW_PARAM_RATE);
    struct snd_interval *channels = hw_param_interval(params,
            SNDRV_PCM_HW_PARAM_CHANNELS);

    /* The DSP will convert the FE rate to 48k, stereo */
    rate->min = rate->max = 48000;
    channels->min = channels->max = 2;

    /* set DAI0 to 16 bit */
    snd_mask_set(&params->masks[SNDRV_PCM_HW_PARAM_FORMAT -
                                SNDRV_PCM_HW_PARAM_FIRST_MASK],
                 SNDRV_PCM_FORMAT_S16_LE);
    return 0;
}

The other PCM operations are the same as for regular DAI links. Use as
necessary.


3 Widget graph connections
--------------------------

The BE DAI links will normally be connected to the graph at initialisation time
by the ASoC DAPM core. However, if the BE codec or BE DAI is a dummy then this
has to be set explicitly in the driver :-

/* BE for codec Headset - DAI0 is dummy and managed by DSP FW */
{"DAI0 CODEC IN", NULL, "AIF1 Capture"},
{"AIF1 Playback", NULL, "DAI0 CODEC OUT"},
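
These entries are DAPM routes; in a machine driver they are normally collected
into a snd_soc_dapm_route table and registered with snd_soc_dapm_add_routes().
A minimal sketch reusing the strings above (the call site and function name are
assumptions, not part of a real driver) :-

static const struct snd_soc_dapm_route dsp_fw_routes[] = {
    /* BE for codec Headset - DAI0 is dummy and managed by DSP FW */
    {"DAI0 CODEC IN", NULL, "AIF1 Capture"},
    {"AIF1 Playback", NULL, "DAI0 CODEC OUT"},
};

/* e.g. called from the machine driver's card or DAI link init callback */
static int machine_dapm_init(struct snd_soc_card *card)
{
    return snd_soc_dapm_add_routes(&card->dapm, dsp_fw_routes,
                                   ARRAY_SIZE(dsp_fw_routes));
}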

Writing a DPCM DSP driver
=========================

The DPCM DSP driver looks much like a standard platform class ASoC driver
combined with elements from a codec class driver. A DSP platform driver must
implement :-

1) Front End PCM DAIs - i.e. struct snd_soc_dai_driver.

2) DAPM graph showing DSP audio routing from FE DAIs to BEs.

3) DAPM widgets from DSP graph.

4) Mixers for gains, routing, etc.

5) DMA configuration.

6) BE AIF widgets.

Item 6 is important for routing the audio outside of the DSP. AIF widgets need
to be defined for each BE and each stream direction. e.g. for BE DAI0 above we
would have :-

SND_SOC_DAPM_AIF_IN("DAI0 RX", NULL, 0, SND_SOC_NOPM, 0, 0),
SND_SOC_DAPM_AIF_OUT("DAI0 TX", NULL, 0, SND_SOC_NOPM, 0, 0),

The BE AIFs are used to connect the DSP graph to the graphs for the other
component drivers (e.g. codec graph).


Hostless PCM streams
====================

A hostless PCM stream is a stream that is not routed through the host CPU. An
example of this would be a phone call from handset to modem.

                    *************
PCM0 <------------> *           * <----DAI0-----> Codec Headset
                    *           *
PCM1 <------------> *           * <====DAI1=====> Codec Speakers/Mic
                    *   DSP     *
PCM2 <------------> *           * <====DAI2=====> MODEM
                    *           *
PCM3 <------------> *           * <----DAI3-----> BT
                    *           *
                    *           * <----DAI4-----> DMIC
                    *           *
                    *           * <----DAI5-----> FM
                    *************

In this case the PCM data is routed via the DSP. The host CPU in this use case
is only used for control and can sleep during the runtime of the stream.

The host can control the hostless link either by :-

1) Configuring the link as a CODEC <-> CODEC style link. In this case the link
   is enabled or disabled by the state of the DAPM graph. This usually means
   there is a mixer control that can be used to connect or disconnect the path
   between both DAIs.

2) Hostless FE. This FE has a virtual connection to the BE DAI links on the
   DAPM graph. Control is then carried out by the FE as regular PCM operations.
   This method gives more control over the DAI links, but requires much more
   userspace code to control the link. It's recommended to use CODEC<->CODEC
   unless your HW needs more fine grained sequencing of the PCM ops.


CODEC <-> CODEC link
--------------------

This DAI link is enabled when DAPM detects a valid path within the DAPM graph.
The machine driver sets some additional parameters to the DAI link i.e.

static const struct snd_soc_pcm_stream dai_params = {
    .formats = SNDRV_PCM_FMTBIT_S32_LE,
    .rate_min = 8000,
    .rate_max = 8000,
    .channels_min = 2,
    .channels_max = 2,
};

static struct snd_soc_dai_link dais[] = {
    < ... more DAI links above ... >
    {
        .name = "MODEM",
        .stream_name = "MODEM",
        .cpu_dai_name = "dai2",
        .codec_dai_name = "modem-aif1",
        .codec_name = "modem",
        .dai_fmt = SND_SOC_DAIFMT_I2S | SND_SOC_DAIFMT_NB_NF
                   | SND_SOC_DAIFMT_CBM_CFM,
        .params = &dai_params,
    },
    < ... more DAI links here ... >
};

These parameters are used to configure the DAI hw_params() when DAPM detects a
valid path and then calls the PCM operations to start the link. DAPM will also
call the appropriate PCM operations to disable the DAI when the path is no
longer valid.


Hostless FE
-----------

The DAI link(s) are enabled by a FE that does not read or write any PCM data.
This means creating a new FE that is connected with a virtual path to both
DAI links. The DAI links will be started when the FE PCM is started and stopped
when the FE PCM is stopped. Note that the FE PCM cannot read or write data in
this configuration.

Documentation/sound/alsa/soc/clocking.txt (new file, 51 lines)
===============================================================

Audio Clocking
==============

This text describes the audio clocking terms in ASoC and digital audio in
general. Note: Audio clocking can be complex!


Master Clock
------------

Every audio subsystem is driven by a master clock (sometimes referred to as
MCLK or SYSCLK). This audio master clock can be derived from a number of
sources (e.g. crystal, PLL, CPU clock) and is responsible for producing the
correct audio playback and capture sample rates.

Some master clocks (e.g. PLLs and CPU based clocks) are configurable in that
their speed can be altered by software (depending on the system use and to save
power). Other master clocks are fixed at a set frequency (i.e. crystals).


DAI Clocks
----------
The Digital Audio Interface is usually driven by a Bit Clock (often referred to
as BCLK). This clock is used to drive the digital audio data across the link
between the codec and CPU.

The DAI also has a frame clock to signal the start of each audio frame. This
clock is sometimes referred to as LRC (left right clock) or FRAME. This clock
runs at exactly the sample rate (LRC = Rate).

Bit Clock can be generated as follows:-

    BCLK = MCLK / x

 or

    BCLK = LRC * x

 or

    BCLK = LRC * Channels * Word Size

This relationship depends on the codec or SoC CPU in particular. In general
it is best to configure BCLK to the lowest possible speed (depending on your
rate, number of channels and word size) to save on power.
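
As a worked example (illustrative figures only), a 48kHz, 2 channel, 16 bit
stream only needs:-

    BCLK = 48000 * 2 * 16 = 1.536MHz

The same BCLK could be derived from a 12.288MHz MCLK with a divide by 8,
whereas a divide by 4 would give 3.072MHz and simply waste power. The divisors
that are actually available depend on the codec and CPU in question.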

It is also desirable to use the codec (if possible) to drive (or master) the
audio clocks as it usually gives more accurate sample rates than the CPU.
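
Machine drivers usually tell both DAIs what master clock they are being fed at
stream setup time via snd_soc_dai_set_sysclk(). A minimal sketch; the clock id
0 and the 12.288MHz rate are assumptions, each DAI driver defines its own clock
ids and supported rates:-

static int board_clock_setup(struct snd_pcm_substream *substream,
                             struct snd_pcm_hw_params *params)
{
    struct snd_soc_pcm_runtime *rtd = substream->private_data;
    int ret;

    /* tell the codec DAI what SYSCLK/MCLK it is being fed */
    ret = snd_soc_dai_set_sysclk(rtd->codec_dai, 0, 12288000,
                                 SND_SOC_CLOCK_IN);
    if (ret < 0)
        return ret;

    /* let the CPU DAI derive BCLK from the same master clock */
    return snd_soc_dai_set_sysclk(rtd->cpu_dai, 0, 12288000,
                                  SND_SOC_CLOCK_OUT);
}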

Documentation/sound/alsa/soc/codec.txt (new file, 179 lines)
=============================================================

ASoC Codec Class Driver
=======================

The codec class driver is generic and hardware independent code that configures
the codec, FM, MODEM, BT or external DSP to provide audio capture and playback.
It should contain no code that is specific to the target platform or machine.
All platform and machine specific code should be added to the platform and
machine drivers respectively.

Each codec class driver *must* provide the following features:-

1) Codec DAI and PCM configuration
2) Codec control IO - using Regmap API
3) Mixers and audio controls
4) Codec audio operations
5) DAPM description.
6) DAPM event handler.

Optionally, codec drivers can also provide:-

7) DAC Digital mute control.

It's probably best to use this guide in conjunction with the existing codec
driver code in sound/soc/codecs/

ASoC Codec driver breakdown
===========================

1 - Codec DAI and PCM configuration
-----------------------------------
Each codec driver must have a struct snd_soc_dai_driver to define its DAI and
PCM capabilities and operations. This struct is exported so that it can be
registered with the core by your machine driver.

e.g.

static struct snd_soc_dai_ops wm8731_dai_ops = {
    .prepare      = wm8731_pcm_prepare,
    .hw_params    = wm8731_hw_params,
    .shutdown     = wm8731_shutdown,
    .digital_mute = wm8731_mute,
    .set_sysclk   = wm8731_set_dai_sysclk,
    .set_fmt      = wm8731_set_dai_fmt,
};

struct snd_soc_dai_driver wm8731_dai = {
    .name = "wm8731-hifi",
    .playback = {
        .stream_name = "Playback",
        .channels_min = 1,
        .channels_max = 2,
        .rates = WM8731_RATES,
        .formats = WM8731_FORMATS,},
    .capture = {
        .stream_name = "Capture",
        .channels_min = 1,
        .channels_max = 2,
        .rates = WM8731_RATES,
        .formats = WM8731_FORMATS,},
    .ops = &wm8731_dai_ops,
    .symmetric_rates = 1,
};


2 - Codec control IO
--------------------
The codec can usually be controlled via an I2C or SPI style interface
(AC97 combines control with data in the DAI). The codec driver should use the
Regmap API for all codec IO. Please see include/linux/regmap.h and existing
codec drivers for example regmap usage.


3 - Mixers and audio controls
-----------------------------
All the codec mixers and audio controls can be defined using the convenience
macros defined in soc.h.

    #define SOC_SINGLE(xname, reg, shift, mask, invert)

Defines a single control as follows:-

    xname = Control name e.g. "Playback Volume"
    reg = codec register
    shift = control bit(s) offset in register
    mask = control bit size(s) e.g. mask of 7 = 3 bits
    invert = the control is inverted
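
e.g. two SOC_SINGLE controls, using purely hypothetical register names
(MY_CODEC_*) for illustration; real drivers use the register definitions from
their own header:-

static const struct snd_kcontrol_new my_codec_controls[] = {
    /* single bit switch at bit 6 of a (hypothetical) DAC control register */
    SOC_SINGLE("DAC Deemphasis Switch", MY_CODEC_DAC_CTRL, 6, 1, 0),
    /* 7 bit (mask 127) volume in a (hypothetical) speaker output register */
    SOC_SINGLE("Speaker Playback Volume", MY_CODEC_SPKOUT, 0, 127, 0),
};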

Other macros include:-

    #define SOC_DOUBLE(xname, reg, shift_left, shift_right, mask, invert)

A stereo control

    #define SOC_DOUBLE_R(xname, reg_left, reg_right, shift, mask, invert)

A stereo control spanning 2 registers

    #define SOC_ENUM_SINGLE(xreg, xshift, xmask, xtexts)

Defines a single enumerated control as follows:-

    xreg = register
    xshift = control bit(s) offset in register
    xmask = control bit(s) size
    xtexts = pointer to array of strings that describe each setting

    #define SOC_ENUM_DOUBLE(xreg, xshift_l, xshift_r, xmask, xtexts)

Defines a stereo enumerated control


4 - Codec Audio Operations
--------------------------
The codec driver also supports the following ALSA PCM operations:-

/* SoC audio ops */
struct snd_soc_ops {
    int (*startup)(struct snd_pcm_substream *);
    void (*shutdown)(struct snd_pcm_substream *);
    int (*hw_params)(struct snd_pcm_substream *, struct snd_pcm_hw_params *);
    int (*hw_free)(struct snd_pcm_substream *);
    int (*prepare)(struct snd_pcm_substream *);
};

Please refer to the ALSA driver PCM documentation for details.
http://www.alsa-project.org/~iwai/writing-an-alsa-driver/
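
For illustration, the per-DAI callbacks named in the wm8731_dai_ops table above
(struct snd_soc_dai_ops) often look like the following sketch of a hw_params()
handler; the MY_CODEC_IFACE register and its bit layout are hypothetical and
only show the general shape of such a callback:-

static int my_codec_hw_params(struct snd_pcm_substream *substream,
                              struct snd_pcm_hw_params *params,
                              struct snd_soc_dai *dai)
{
    struct snd_soc_codec *codec = dai->codec;
    u16 iface = snd_soc_read(codec, MY_CODEC_IFACE) & ~0x000c;

    /* select the interface word length from the requested format */
    switch (params_format(params)) {
    case SNDRV_PCM_FORMAT_S16_LE:
        break;                   /* 00 = 16 bit (hypothetical encoding) */
    case SNDRV_PCM_FORMAT_S24_LE:
        iface |= 0x0008;         /* 10 = 24 bit (hypothetical encoding) */
        break;
    default:
        return -EINVAL;
    }

    return snd_soc_write(codec, MY_CODEC_IFACE, iface);
}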

5 - DAPM description.
---------------------
The Dynamic Audio Power Management description describes the codec power
components and their relationships and registers to the ASoC core.
Please read dapm.txt for details of building the description.

Please also see the examples in other codec drivers.


6 - DAPM event handler
----------------------
This function is a callback that handles codec domain PM calls and system
domain PM calls (e.g. suspend and resume). It is used to put the codec
to sleep when not in use.

Power states:-

    SNDRV_CTL_POWER_D0: /* full On */
    /* vref/mid, clk and osc on, active */

    SNDRV_CTL_POWER_D1: /* partial On */
    SNDRV_CTL_POWER_D2: /* partial On */

    SNDRV_CTL_POWER_D3hot: /* Off, with power */
    /* everything off except vref/vmid, inactive */

    SNDRV_CTL_POWER_D3cold: /* Everything Off, without power */


7 - Codec DAC digital mute control
----------------------------------
Most codecs have a digital mute before the DACs that can be used to
minimise any system noise. The mute stops any digital data from
entering the DAC.

A callback can be created that is called by the core for each codec DAI
when the mute is applied or freed.

i.e.

static int wm8974_mute(struct snd_soc_dai *dai, int mute)
{
    struct snd_soc_codec *codec = dai->codec;
    u16 mute_reg = snd_soc_read(codec, WM8974_DAC) & 0xffbf;

    if (mute)
        snd_soc_write(codec, WM8974_DAC, mute_reg | 0x40);
    else
        snd_soc_write(codec, WM8974_DAC, mute_reg);
    return 0;
}

Documentation/sound/alsa/soc/dapm.txt (new file, 305 lines)
============================================================

Dynamic Audio Power Management for Portable Devices
===================================================

1. Description
==============

Dynamic Audio Power Management (DAPM) is designed to allow portable
Linux devices to use the minimum amount of power within the audio
subsystem at all times. It is independent of other kernel PM and as
such, can easily co-exist with the other PM systems.

DAPM is also completely transparent to all user space applications as
all power switching is done within the ASoC core. No code changes or
recompiling are required for user space applications. DAPM makes power
switching decisions based upon any audio stream (capture/playback)
activity and audio mixer settings within the device.

DAPM spans the whole machine. It covers power control within the entire
audio subsystem, this includes internal codec power blocks and machine
level power systems.

There are 4 power domains within DAPM

   1. Codec bias domain - VREF, VMID (core codec and audio power)
      Usually controlled at codec probe/remove and suspend/resume, although
      can be set at stream time if power is not needed for sidetone, etc.

   2. Platform/Machine domain - physically connected inputs and outputs
      Is platform/machine and user action specific, is configured by the
      machine driver and responds to asynchronous events e.g. when headphones
      are inserted

   3. Path domain - audio subsystem signal paths
      Automatically set when mixer and mux settings are changed by the user.
      e.g. alsamixer, amixer.

   4. Stream domain - DACs and ADCs.
      Enabled and disabled when stream playback/capture is started and
      stopped respectively. e.g. aplay, arecord.

All DAPM power switching decisions are made automatically by consulting an
audio routing map of the whole machine. This map is specific to each machine
and consists of the interconnections between every audio component (including
internal codec components). All audio components that affect power are called
widgets hereafter.


2. DAPM Widgets
===============

Audio DAPM widgets fall into a number of types:-

 o Mixer      - Mixes several analog signals into a single analog signal.
 o Mux        - An analog switch that outputs only one of many inputs.
 o PGA        - A programmable gain amplifier or attenuation widget.
 o ADC        - Analog to Digital Converter
 o DAC        - Digital to Analog Converter
 o Switch     - An analog switch
 o Input      - A codec input pin
 o Output     - A codec output pin
 o Headphone  - Headphone (and optional Jack)
 o Mic        - Mic (and optional Jack)
 o Line       - Line Input/Output (and optional Jack)
 o Speaker    - Speaker
 o Supply     - Power or clock supply widget used by other widgets.
 o Regulator  - External regulator that supplies power to audio components.
 o Clock      - External clock that supplies clock to audio components.
 o AIF IN     - Audio Interface Input (with TDM slot mask).
 o AIF OUT    - Audio Interface Output (with TDM slot mask).
 o Siggen     - Signal Generator.
 o DAI IN     - Digital Audio Interface Input.
 o DAI OUT    - Digital Audio Interface Output.
 o DAI Link   - DAI Link between two DAI structures
 o Pre        - Special PRE widget (exec before all others)
 o Post       - Special POST widget (exec after all others)

(Widgets are defined in include/sound/soc-dapm.h)

Widgets can be added to the sound card by any of the component driver types.
There are convenience macros defined in soc-dapm.h that can be used to quickly
build a list of the codec's and machine's DAPM widgets.

Most widgets have a name, register, shift and invert. Some widgets have extra
parameters for stream name and kcontrols.


2.1 Stream Domain Widgets
-------------------------

Stream Widgets relate to the stream power domain and only consist of ADCs
(analog to digital converters), DACs (digital to analog converters),
AIF IN and AIF OUT.

Stream widgets have the following format:-

SND_SOC_DAPM_DAC(name, stream name, reg, shift, invert),
SND_SOC_DAPM_AIF_IN(name, stream, slot, reg, shift, invert)

NOTE: the stream name must match the corresponding stream name in your codec
snd_soc_codec_dai.

e.g. stream widgets for HiFi playback and capture

SND_SOC_DAPM_DAC("HiFi DAC", "HiFi Playback", REG, 3, 1),
SND_SOC_DAPM_ADC("HiFi ADC", "HiFi Capture", REG, 2, 1),

e.g. stream widgets for AIF

SND_SOC_DAPM_AIF_IN("AIF1RX", "AIF1 Playback", 0, SND_SOC_NOPM, 0, 0),
SND_SOC_DAPM_AIF_OUT("AIF1TX", "AIF1 Capture", 0, SND_SOC_NOPM, 0, 0),


2.2 Path Domain Widgets
-----------------------

Path domain widgets have the ability to control or affect the audio signal or
audio paths within the audio subsystem. They have the following form:-

SND_SOC_DAPM_PGA(name, reg, shift, invert, controls, num_controls)

Any widget kcontrols can be set using the controls and num_controls members.

e.g. Mixer widget (the kcontrols are declared first)

/* Output Mixer */
static const snd_kcontrol_new_t wm8731_output_mixer_controls[] = {
SOC_DAPM_SINGLE("Line Bypass Switch", WM8731_APANA, 3, 1, 0),
SOC_DAPM_SINGLE("Mic Sidetone Switch", WM8731_APANA, 5, 1, 0),
SOC_DAPM_SINGLE("HiFi Playback Switch", WM8731_APANA, 4, 1, 0),
};

SND_SOC_DAPM_MIXER("Output Mixer", WM8731_PWR, 4, 1, wm8731_output_mixer_controls,
    ARRAY_SIZE(wm8731_output_mixer_controls)),

If you don't want the mixer elements prefixed with the name of the mixer
widget, you can use SND_SOC_DAPM_MIXER_NAMED_CTL instead. The parameters are
the same as for SND_SOC_DAPM_MIXER.


2.3 Machine domain Widgets
--------------------------

Machine widgets are different from codec widgets in that they don't have a
codec register bit associated with them. A machine widget is assigned to each
machine audio component (non codec or DSP) that can be independently
powered. e.g.

 o Speaker Amp
 o Microphone Bias
 o Jack connectors

A machine widget can have an optional call back.

e.g. Jack connector widget for an external Mic that enables Mic Bias
when the Mic is inserted:-

static int spitz_mic_bias(struct snd_soc_dapm_widget *w, int event)
{
    gpio_set_value(SPITZ_GPIO_MIC_BIAS, SND_SOC_DAPM_EVENT_ON(event));
    return 0;
}

SND_SOC_DAPM_MIC("Mic Jack", spitz_mic_bias),


2.4 Codec (BIAS) Domain
-----------------------

The codec bias power domain has no widgets and is handled by the codec's DAPM
event handler. This handler is called when the codec powerstate is changed wrt
any stream event or by kernel PM events.


2.5 Virtual Widgets
-------------------

Sometimes widgets exist in the codec or machine audio map that don't have any
corresponding soft power control. In this case it is necessary to create
a virtual widget - a widget with no control bits e.g.

SND_SOC_DAPM_MIXER("AC97 Mixer", SND_SOC_DAPM_NOPM, 0, 0, NULL, 0),

This can be used to merge two signal paths together in software.

After all the widgets have been defined, they can then be added to the DAPM
subsystem individually with a call to snd_soc_dapm_new_control().


3. Codec/DSP Widget Interconnections
====================================

Widgets are connected to each other within the codec, platform and machine by
audio paths (called interconnections). Each interconnection must be defined in
order to create a map of all audio paths between widgets.

This is easiest with a diagram of the codec or DSP (and schematic of the
machine audio system), as it requires joining widgets together via their audio
signal paths.

e.g., from the WM8731 output mixer (wm8731.c)

The WM8731 output mixer has 3 inputs (sources)

 1. Line Bypass Input
 2. DAC (HiFi playback)
 3. Mic Sidetone Input

Each input in this example has a kcontrol associated with it (defined in the
example above) and is connected to the output mixer via its kcontrol name. We
can now connect the destination widget (wrt audio signal) with its source
widgets.

    /* output mixer */
    {"Output Mixer", "Line Bypass Switch", "Line Input"},
    {"Output Mixer", "HiFi Playback Switch", "DAC"},
    {"Output Mixer", "Mic Sidetone Switch", "Mic Bias"},

So we have :-

    Destination Widget  <===  Path Name  <===  Source Widget

Or:-

    Sink, Path, Source

Or :-

    "Output Mixer" is connected to the "DAC" via the "HiFi Playback Switch".

When there is no path name connecting widgets (e.g. a direct connection) we
pass NULL for the path name.

Interconnections are created with a call to:-

snd_soc_dapm_connect_input(codec, sink, path, source);

Finally, snd_soc_dapm_new_widgets(codec) must be called after all widgets and
interconnections have been registered with the core. This causes the core to
scan the codec and machine so that the internal DAPM state matches the
physical state of the machine.


3.1 Machine Widget Interconnections
-----------------------------------
Machine widget interconnections are created in the same way as codec ones and
directly connect the codec pins to machine level widgets.

e.g. connects the speaker out codec pins to the internal speaker.

    /* ext speaker connected to codec pins LOUT2, ROUT2  */
    {"Ext Spk", NULL , "ROUT2"},
    {"Ext Spk", NULL , "LOUT2"},

This allows the DAPM to power on and off pins that are connected (and in use)
and pins that are NC respectively.


4 Endpoint Widgets
===================
An endpoint is a start or end point (widget) of an audio signal within the
machine and includes the codec. e.g.

 o Headphone Jack
 o Internal Speaker
 o Internal Mic
 o Mic Jack
 o Codec Pins

Endpoints are added to the DAPM graph so that their usage can be determined in
order to save power. e.g. NC codec pins will be switched OFF, unconnected
jacks can also be switched OFF.


5 DAPM Widget Events
====================

Some widgets can register their interest with the DAPM core in PM events.
e.g. A Speaker with an amplifier registers a widget so the amplifier can be
powered only when the spk is in use.

/* turn speaker amplifier on/off depending on use */
static int corgi_amp_event(struct snd_soc_dapm_widget *w, int event)
{
    gpio_set_value(CORGI_GPIO_APM_ON, SND_SOC_DAPM_EVENT_ON(event));
    return 0;
}

/* corgi machine dapm widgets */
static const struct snd_soc_dapm_widget wm8731_dapm_widgets =
    SND_SOC_DAPM_SPK("Ext Spk", corgi_amp_event);

Please see soc-dapm.h for all other widgets that support events.


5.1 Event types
---------------

The following event types are supported by event widgets.

/* dapm event types */
#define SND_SOC_DAPM_PRE_PMU    0x1     /* before widget power up */
#define SND_SOC_DAPM_POST_PMU   0x2     /* after widget power up */
#define SND_SOC_DAPM_PRE_PMD    0x4     /* before widget power down */
#define SND_SOC_DAPM_POST_PMD   0x8     /* after widget power down */
#define SND_SOC_DAPM_PRE_REG    0x10    /* before audio path setup */
#define SND_SOC_DAPM_POST_REG   0x20    /* after audio path setup */

Documentation/sound/alsa/soc/jack.txt (new file, 71 lines)
===========================================================

ASoC jack detection
===================

ALSA has a standard API for representing physical jacks to user space,
the kernel side of which can be seen in include/sound/jack.h. ASoC
provides a version of this API adding two additional features:

 - It allows more than one jack detection method to work together on one
   user visible jack. In embedded systems it is common for multiple
   detection mechanisms to be present on a single jack but handled by
   separate bits of hardware.

 - Integration with DAPM, allowing DAPM endpoints to be updated
   automatically based on the detected jack status (eg, turning off the
   headphone outputs if no headphones are present).

This is done by splitting the jacks up into three things working
together: the jack itself represented by a struct snd_soc_jack, sets of
snd_soc_jack_pins representing DAPM endpoints to update and blocks of
code providing jack reporting mechanisms.

For example, a system may have a stereo headset jack with two reporting
mechanisms, one for the headphone and one for the microphone. Some
systems won't be able to use their speaker output while a headphone is
connected and so will want to make sure to update both speaker and
headphone when the headphone jack status changes.

The jack - struct snd_soc_jack
==============================

This represents a physical jack on the system and is what is visible to
user space. The jack itself is completely passive, it is set up by the
machine driver and updated by jack detection methods.

Jacks are created by the machine driver calling snd_soc_jack_new().

snd_soc_jack_pin
================

These represent a DAPM pin to update depending on some of the status
bits supported by the jack. Each snd_soc_jack has zero or more of these
which are updated automatically. They are created by the machine driver
and associated with the jack using snd_soc_jack_add_pins(). The status
of the endpoint may be configured to be the opposite of the jack status
if required (eg, enabling a built in microphone if a microphone is not
connected via a jack).

Jack detection methods
======================

Actual jack detection is done by code which is able to monitor some
input to the system and update a jack by calling snd_soc_jack_report(),
specifying a subset of bits to update. The jack detection code should
be set up by the machine driver, taking configuration for the jack to
update and the set of things to report when the jack is connected.

Often this is done based on the status of a GPIO - a handler for this is
provided by the snd_soc_jack_add_gpio() function. Other methods are
also available, for example integrated into CODECs. One example of
CODEC integrated jack detection can be seen in the WM8350 driver.

Each jack may have multiple reporting mechanisms, though it will need at
least one to be useful.

Machine drivers
===============

These are all hooked together by the machine driver depending on the
system hardware. The machine driver will set up the snd_soc_jack and
the list of pins to update then set up one or more jack detection
mechanisms to update that jack based on their current status.

Documentation/sound/alsa/soc/machine.txt (new file, 93 lines)
==============================================================

ASoC Machine Driver
===================

The ASoC machine (or board) driver is the code that glues together all the
component drivers (e.g. codecs, platforms and DAIs). It also describes the
relationships between each component which include audio paths, GPIOs,
interrupts, clocking, jacks and voltage regulators.

The machine driver can contain codec and platform specific code. It registers
the audio subsystem with the kernel as a platform device and is represented by
the following struct:-

/* SoC machine */
struct snd_soc_card {
    char *name;

    ...

    int (*probe)(struct platform_device *pdev);
    int (*remove)(struct platform_device *pdev);

    /* the pre and post PM functions are used to do any PM work before and
     * after the codec and DAIs do any PM work. */
    int (*suspend_pre)(struct platform_device *pdev, pm_message_t state);
    int (*suspend_post)(struct platform_device *pdev, pm_message_t state);
    int (*resume_pre)(struct platform_device *pdev);
    int (*resume_post)(struct platform_device *pdev);

    ...

    /* CPU <--> Codec DAI links  */
    struct snd_soc_dai_link *dai_link;
    int num_links;

    ...
};

probe()/remove()
----------------
probe/remove are optional. Do any machine specific probe here.


suspend()/resume()
------------------
The machine driver has pre and post versions of suspend and resume to take
care of any machine audio tasks that have to be done before or after the
codec, DAIs and DMA are suspended and resumed. Optional.


Machine DAI Configuration
-------------------------
The machine DAI configuration glues all the codec and CPU DAIs together. It
can also be used to set up the DAI system clock and for any machine related
DAI initialisation e.g. the machine audio map can be connected to the codec
audio map, unconnected codec pins can be set as such.

struct snd_soc_dai_link is used to set up each DAI in your machine. e.g.

/* corgi digital audio interface glue - connects codec <--> CPU */
static struct snd_soc_dai_link corgi_dai = {
    .name = "WM8731",
    .stream_name = "WM8731",
    .cpu_dai_name = "pxa-is2-dai",
    .codec_dai_name = "wm8731-hifi",
    .platform_name = "pxa-pcm-audio",
    .codec_name = "wm8713-codec.0-001a",
    .init = corgi_wm8731_init,
    .ops = &corgi_ops,
};

struct snd_soc_card then sets up the machine with its DAIs. e.g.

/* corgi audio machine driver */
static struct snd_soc_card snd_soc_corgi = {
    .name = "Corgi",
    .dai_link = &corgi_dai,
    .num_links = 1,
};


Machine Power Map
-----------------

The machine driver can optionally extend the codec power map and become an
audio power map of the audio subsystem. This allows for automatic power up/down
of speaker/HP amplifiers, etc. Codec pins can be connected to the machine's
jack sockets in the machine init function.


Machine Controls
----------------

Machine specific audio mixer controls can be added in the DAI init function,
as sketched below.
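
The init callback named in the DAI link above (corgi_wm8731_init in this
example) is where machine level widgets, routes and unused codec pins are
normally set up. A minimal sketch; the widget, route and pin names below are
assumptions rather than the real Corgi code:-

static const struct snd_soc_dapm_widget board_dapm_widgets[] = {
    SND_SOC_DAPM_SPK("Ext Spk", NULL),
};

static const struct snd_soc_dapm_route board_audio_map[] = {
    /* external speaker wired to the codec LOUT2/ROUT2 pins */
    {"Ext Spk", NULL, "LOUT2"},
    {"Ext Spk", NULL, "ROUT2"},
};

static int corgi_wm8731_init(struct snd_soc_pcm_runtime *rtd)
{
    struct snd_soc_dapm_context *dapm = &rtd->codec->dapm;
    int ret;

    /* add the machine level widgets and hook them up to codec pins */
    ret = snd_soc_dapm_new_controls(dapm, board_dapm_widgets,
                                    ARRAY_SIZE(board_dapm_widgets));
    if (ret)
        return ret;

    ret = snd_soc_dapm_add_routes(dapm, board_audio_map,
                                  ARRAY_SIZE(board_audio_map));
    if (ret)
        return ret;

    /* this (hypothetical) codec pin is not wired up on the board */
    snd_soc_dapm_nc_pin(dapm, "LLINEIN");

    return 0;
}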

Documentation/sound/alsa/soc/overview.txt (new file, 95 lines)
===============================================================

ALSA SoC Layer
==============

The overall project goal of the ALSA System on Chip (ASoC) layer is to
provide better ALSA support for embedded system-on-chip processors (e.g.
pxa2xx, au1x00, iMX, etc) and portable audio codecs. Prior to the ASoC
subsystem there was some support in the kernel for SoC audio, however it
had some limitations:-

  * Codec drivers were often tightly coupled to the underlying SoC
    CPU. This is not ideal and leads to code duplication - for example,
    Linux had different wm8731 drivers for 4 different SoC platforms.

  * There was no standard method to signal user initiated audio events (e.g.
    Headphone/Mic insertion, Headphone/Mic detection after an insertion
    event). These are quite common events on portable devices and often
    require machine specific code to re-route audio, enable amps, etc., after
    such an event.

  * Drivers tended to power up the entire codec when playing (or
    recording) audio. This is fine for a PC, but tends to waste a lot of
    power on portable devices. There was also no support for saving
    power via changing codec oversampling rates, bias currents, etc.


ASoC Design
===========

The ASoC layer is designed to address these issues and provide the following
features :-

  * Codec independence. Allows reuse of codec drivers on other platforms
    and machines.

  * Easy I2S/PCM audio interface setup between codec and SoC. Each SoC
    interface and codec registers its audio interface capabilities with the
    core and are subsequently matched and configured when the application
    hardware parameters are known.

  * Dynamic Audio Power Management (DAPM). DAPM automatically sets the codec
    to its minimum power state at all times. This includes powering up/down
    internal power blocks depending on the internal codec audio routing and
    any active streams.

  * Pop and click reduction. Pops and clicks can be reduced by powering the
    codec up/down in the correct sequence (including using digital mute).
    ASoC signals the codec when to change power states.

  * Machine specific controls: Allow machines to add controls to the sound
    card (e.g. volume control for speaker amplifier).

To achieve all this, ASoC basically splits an embedded audio system into
multiple re-usable component drivers :-

  * Codec class drivers: The codec class driver is platform independent and
    contains audio controls, audio interface capabilities, codec DAPM
    definition and codec IO functions. This class extends to BT, FM and MODEM
    ICs if required. Codec class drivers should be generic code that can run
    on any architecture and machine.

  * Platform class drivers: The platform class driver includes the audio DMA
    engine driver, digital audio interface (DAI) drivers (e.g. I2S, AC97,
    PCM) and any audio DSP drivers for that platform.

  * Machine class driver: The machine driver class acts as the glue that
    describes and binds the other component drivers together to form an ALSA
    "sound card device". It handles any machine specific controls and
    machine level audio events (e.g. turning on an amp at start of playback).


Documentation
=============

The documentation is split into the following sections:-

overview.txt: This file.

codec.txt: Codec driver internals.

DAI.txt: Description of Digital Audio Interface standards and how to configure
a DAI within your codec and CPU DAI drivers.

dapm.txt: Dynamic Audio Power Management.

platform.txt: Platform audio DMA and DAI.

machine.txt: Machine driver internals.

pops_clicks.txt: How to minimise audio artifacts.

clocking.txt: ASoC clocking for best power performance.

jack.txt: ASoC jack detection.

DPCM.txt: Dynamic PCM - Describes DPCM with DSP examples.

Documentation/sound/alsa/soc/platform.txt (new file, 79 lines)
===============================================================

ASoC Platform Driver
====================

An ASoC platform driver class can be divided into audio DMA drivers, SoC DAI
drivers and DSP drivers. The platform drivers only target the SoC CPU and must
have no board specific code.

Audio DMA
=========

The platform DMA driver optionally supports the following ALSA operations:-

/* SoC audio ops */
struct snd_soc_ops {
    int (*startup)(struct snd_pcm_substream *);
    void (*shutdown)(struct snd_pcm_substream *);
    int (*hw_params)(struct snd_pcm_substream *, struct snd_pcm_hw_params *);
    int (*hw_free)(struct snd_pcm_substream *);
    int (*prepare)(struct snd_pcm_substream *);
    int (*trigger)(struct snd_pcm_substream *, int);
};

The platform driver exports its DMA functionality via struct
snd_soc_platform_driver:-

struct snd_soc_platform_driver {
    char *name;

    int (*probe)(struct platform_device *pdev);
    int (*remove)(struct platform_device *pdev);
    int (*suspend)(struct platform_device *pdev, struct snd_soc_cpu_dai *cpu_dai);
    int (*resume)(struct platform_device *pdev, struct snd_soc_cpu_dai *cpu_dai);

    /* pcm creation and destruction */
    int (*pcm_new)(struct snd_card *, struct snd_soc_codec_dai *, struct snd_pcm *);
    void (*pcm_free)(struct snd_pcm *);

    /*
     * For platform caused delay reporting.
     * Optional.
     */
    snd_pcm_sframes_t (*delay)(struct snd_pcm_substream *,
        struct snd_soc_dai *);

    /* platform stream ops */
    struct snd_pcm_ops *pcm_ops;
};
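
The filled-in driver struct is then registered against the platform device,
typically from the probe() of an ordinary platform_driver; a minimal sketch,
where my_platform_driver is assumed to be a snd_soc_platform_driver as above:-

static int my_pcm_probe(struct platform_device *pdev)
{
    /* registers the DMA/PCM ops above with the ASoC core */
    return snd_soc_register_platform(&pdev->dev, &my_platform_driver);
}

static int my_pcm_remove(struct platform_device *pdev)
{
    snd_soc_unregister_platform(&pdev->dev);
    return 0;
}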

Please refer to the ALSA driver documentation for details of audio DMA.
http://www.alsa-project.org/~iwai/writing-an-alsa-driver/

An example DMA driver is soc/pxa/pxa2xx-pcm.c


SoC DAI Drivers
===============

Each SoC DAI driver must provide the following features:-

1) Digital audio interface (DAI) description
2) Digital audio interface configuration
3) PCM's description
4) SYSCLK configuration
5) Suspend and resume (optional)

Please see codec.txt for a description of items 1 - 4.


SoC DSP Drivers
===============

Each SoC DSP driver usually supplies the following features :-

1) DAPM graph
2) Mixer controls
3) DMA IO to/from DSP buffers (if applicable)
4) Definition of DSP front end (FE) PCM devices.

Please see DPCM.txt for a description of item 4.

Documentation/sound/alsa/soc/pops_clicks.txt (new file, 52 lines)
==================================================================

Audio Pops and Clicks
=====================

Pops and clicks are unwanted audio artifacts caused by the powering up and down
of components within the audio subsystem. This is noticeable on PCs when an
audio module is either loaded or unloaded (at module load time the sound card
is powered up and causes a popping noise on the speakers).

Pops and clicks can be more frequent on portable systems with DAPM. This is
because the components within the subsystem are being dynamically powered
depending on the audio usage and this can subsequently cause a small pop or
click every time a component power state is changed.


Minimising Playback Pops and Clicks
===================================

Playback pops in portable audio subsystems cannot be completely eliminated
currently, however future audio codec hardware will have better pop and click
suppression. Pops can be reduced within playback by powering the audio
components in a specific order. This order is different for startup and
shutdown and follows some basic rules:-

   Startup Order :- DAC --> Mixers --> Output PGA --> Digital Unmute

   Shutdown Order :- Digital Mute --> Output PGA --> Mixers --> DAC

This assumes that the codec PCM output path from the DAC is via a mixer and
then a PGA (programmable gain amplifier) before being output to the speakers.


Minimising Capture Pops and Clicks
==================================

Capture artifacts are somewhat easier to get rid of as we can delay activating
the ADC until all the pops have occurred. This follows similar power rules to
playback in that components are powered in a sequence depending upon stream
startup or shutdown.

   Startup Order - Input PGA --> Mixers --> ADC

   Shutdown Order - ADC --> Mixers --> Input PGA


Zipper Noise
============
An unwanted zipper noise can occur within the audio playback or capture stream
when a volume control is changed near its maximum gain value. The zipper noise
is heard when the gain increase or decrease changes the mean audio signal
amplitude too quickly. It can be minimised by enabling the zero cross setting
for each volume control. The ZC forces the gain change to occur when the signal
crosses the zero amplitude line.