No Audio In When Not Default Device And Sound Control Panel Closed
I am attempting to record audio from multiple soundcards at the same time using NAudio and WASAPI. If a soundcard is set as the default recording device, that capture works fine, but the captures on the other soundcards produce no audio. If I open the Sound control panel and select the Recording tab, all captures work. As soon as I select a different tab or close the control panel, the audio goes away and the peak meter values drop to 0. Any ideas? Thanks.
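For context, a minimal sketch of the kind of setup described, using NAudio's MMDeviceEnumerator to open a WasapiCapture on every active capture endpoint rather than just the default one (the device handling and logging here are illustrative assumptions, not the original code):

    using System;
    using NAudio.CoreAudioApi;
    using NAudio.Wave;

    // Enumerate every active capture endpoint, not just the default one.
    var enumerator = new MMDeviceEnumerator();
    foreach (var device in enumerator.EnumerateAudioEndPoints(DataFlow.Capture, DeviceState.Active))
    {
        var capture = new WasapiCapture(device);
        capture.DataAvailable += (s, e) =>
        {
            // e.Buffer holds e.BytesRecorded bytes of audio from this device.
            Console.WriteLine($"{device.FriendlyName}: {e.BytesRecorded} bytes");
        };
        capture.StartRecording();
    }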
See also questions close to this topic
- Android: How to get the amplitude and frequency of a sound in a .wav file at a particular second
I am working on an Android project in which, let's say, I have an audio file in .wav format.
I want to create an array with three fields:
1. The time (in seconds or milliseconds) within the audio file.
2. The amplitude at that particular second.
3. The frequency at that particular second.
I am thinking I can use a for or while loop to build the array, getting the amplitude and frequency at each second and adding them to my array list.
But I don't know how to get the value of the amplitude and frequency at a particular second of my audio file.
I have never worked with audio before, so I don't know whether there is a method to get these values.
I am thinking that, since amplitude can be plotted on a graph against time, it can be represented as an integer or a long (I actually don't know).
Is this possible? If yes, how can I achieve it?
Thank you in advance :-)
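Though the question targets Android, the idea is the same in any language. A rough sketch in C# with NAudio (in keeping with the rest of this page), computing the peak amplitude per one-second block; a single "frequency" per second is not well-defined for real signals, so a full solution would run an FFT per block and pick the dominant bin. The file name is a placeholder:

    using System;
    using NAudio.Wave;

    // "input.wav" is a hypothetical input file.
    using var reader = new AudioFileReader("input.wav");
    int samplesPerSecond = reader.WaveFormat.SampleRate * reader.WaveFormat.Channels;
    var block = new float[samplesPerSecond];
    int second = 0, read;
    while ((read = reader.Read(block, 0, block.Length)) > 0)
    {
        float peak = 0f;
        for (int i = 0; i < read; i++)
            peak = Math.Max(peak, Math.Abs(block[i]));
        // Frequency would come from an FFT of this block (e.g. NAudio.Dsp.FastFourierTransform).
        Console.WriteLine($"second {second++}: peak amplitude {peak:F3}");
    }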
- How to play a beep in Safari 11
I have a text reader/picture player with text-to-speech playing underneath. This works fine, but when a new picture loads, I want it to "beep" so the user knows when to look at the screen.
    <img key="pic0" src={this.state.images[0].src} className="pic" onLoad={this.beep} />
    <audio id="beep" autoPlay={false}>
      <source src="images/beep-07.mp3" type="audio/mpeg" />
    </audio>

    function beep() {
      var sound = document.getElementById("beep");
      sound.play();
    }
This used to work until Safari 11! It still works in Chrome. What has changed?
- Adjust sound settings with a simple script or tool
I want a script that helps me adjust my sound settings. The adjustments are:
1. Set the default device to the USB speakers I am currently connecting to the PC (I found an open-source tool called NirCmd that helps with this).
2. Disable sidetone under the speaker's Properties > Levels and set the receiver volume to the highest value.
3. Enable Loudness Equalization.
4. Set the recording device to 70%.
Is there any script or code that can help with this, or a one-click tool that applies these settings after being configured once? A sketch of the scriptable part is below.
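A minimal C# sketch of the parts NirCmd actually exposes (the default device and the master volume); the device name is a placeholder, and sidetone and Loudness Equalization are per-driver endpoint settings NirCmd has no verb for, so those would need registry tweaks or vendor tools:

    using System.Diagnostics;

    // "USB Speakers" is a placeholder; NirCmd matches the friendly name shown
    // in the Sound control panel. The trailing 1 selects the Multimedia role.
    Process.Start("nircmd.exe", "setdefaultsounddevice \"USB Speakers\" 1")?.WaitForExit();

    // NirCmd volumes run 0..65535, so 65535 is maximum.
    Process.Start("nircmd.exe", "setsysvolume 65535")?.WaitForExit();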
- Using C# NAudio to open an audio stream and save that audio back in small chunks
I have an audio stream available in 4 formats (.asx, .m3u, .mp3 and .pls), and I would like to read that stream with C# NAudio and split the audio into small 15-second chunks.
    String url = "";
    DateTime start = DateTime.Now;
    using (Stream ms = new MemoryStream())
    {
        using (Stream stream = WebRequest.Create(url).GetResponse().GetResponseStream())
        {
            byte[] buffer = new byte[32768];
            int read;
            WaveFormat waveFormat = new WaveFormat(8000, 8, 2);
            using (WaveFileWriter writer = new WaveFileWriter(@"C:\Users\Public\Music\Sample Music\Test.wav", waveFormat))
            {
                while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    ms.Write(buffer, 0, read);
                    writer.Write(buffer, 0, read);
                    var a = DateTime.Now - start;
                    if (a.TotalSeconds >= 15)
                    {
                        break;
                    }
                }
            }
        }
    }
Each piece of audio created by the code above comes back as static.
- How should I set up the WaveFormat?
- Why is the output pure static? (A guess is sketched below.)
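A plausible cause, offered as a guess: the bytes coming off the HTTP response are compressed MP3 (or playlist text), but they are written to the WaveFileWriter as if they were already 8 kHz 8-bit PCM, so the result is noise. A minimal sketch of decoding first and then cutting 15-second chunks, assuming the URL points at an actual MP3 stream (the .asx/.m3u/.pls variants are playlists whose inner URL would need extracting first) and using placeholder paths:

    using System;
    using NAudio.Wave;

    var url = "http://example.com/stream.mp3"; // placeholder
    using var reader = new MediaFoundationReader(url); // decodes the stream to PCM
    int bytesPerChunk = reader.WaveFormat.AverageBytesPerSecond * 15;
    var buffer = new byte[16384];
    int chunk = 0;

    // Loop until the stream ends (a live stream will keep producing chunks).
    while (true)
    {
        using var writer = new WaveFileWriter($@"C:\temp\chunk{chunk++}.wav", reader.WaveFormat);
        int written = 0, read;
        while (written < bytesPerChunk &&
               (read = reader.Read(buffer, 0, Math.Min(buffer.Length, bytesPerChunk - written))) > 0)
        {
            writer.Write(buffer, 0, read);
            written += read;
        }
        if (written == 0) break; // stream ended
    }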
- C#: Using NAudio to implement STFT
This is the first time I have tried to analyse an audio stream in C#. I have no experience with this, so I would be grateful if you could share your solution. :)
Using NAudio (an open-source C# library for dealing with audio), I am trying to compare two audio streams in the following stages:
1. Read in a 16-bit .wav file and store it in a float[]:

    audio = new AudioFileReader("test.wav");
    // AudioFileReader.Length is in bytes; each float sample is 4 bytes.
    _buffer = new float[(int)(audio.Length / 4)];
    audio.Read(_buffer, 0, _buffer.Length);
2. Pass the float[] to SmbPitchShifter(), setting PitchShift = 1 (so the pitch is not changed) and osamp as the hop size, and pass in the _buffer:

    sps = new SmbPitchShifter();
    sps.PitchShift(1, _buffer.Length, 2048, 32, 44100, _buffer);
3. Finally, compare the two STFT float[] arrays using the DTW algorithm.
Now I am stuck at stage 2, and have some questions:
1. I do not know exactly what my sample rate is, so I just set it to 44100 as a default...
2. After the above code I get a new _buffer float[], with the data after stage 2 shown here: https://imgur.com/a/VLmHJ. As far as I know, the data after an STFT should be in [-1.0, 1.0). What step did I miss, or am I completely wrong from the beginning?
3. I have no idea which window function NAudio uses (see NAudio.Dsp.SmbPitchShifter.cs on GitHub; a direct STFT sketch is below).
These have bothered me for a long time. As you can tell, I am new to audio analysis, and I really appreciate your advice. Have a good day.
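As an aside, SmbPitchShifter is a pitch shifter rather than an STFT API, so its output is time-domain audio in [-1, 1], not spectra. A rough sketch of computing an STFT directly with NAudio's FastFourierTransform, using an assumed 2048-sample Hann-windowed frame with 75% overlap (the file name and parameters are placeholders):

    using System;
    using System.Collections.Generic;
    using NAudio.Dsp;
    using NAudio.Wave;

    using var reader = new AudioFileReader("test.wav");
    var samples = new List<float>();
    var buf = new float[reader.WaveFormat.SampleRate];
    int read;
    while ((read = reader.Read(buf, 0, buf.Length)) > 0)
        for (int i = 0; i < read; i++) samples.Add(buf[i]);

    int fftSize = 2048;             // must be a power of two
    int hop = fftSize / 4;          // 75% overlap
    int m = (int)Math.Log(fftSize, 2.0);
    var frame = new Complex[fftSize];

    for (int start = 0; start + fftSize <= samples.Count; start += hop)
    {
        for (int i = 0; i < fftSize; i++)
        {
            frame[i].X = samples[start + i] * (float)FastFourierTransform.HannWindow(i, fftSize);
            frame[i].Y = 0;
        }
        FastFourierTransform.FFT(true, m, frame);
        // Magnitude of bin k is sqrt(X*X + Y*Y); these spectra are what DTW would compare.
    }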
- How do I transpose the pitch of a .wav in .NET C#
I'm trying to create a small utility application (console app or WinForms) in C# that will load a single-cycle waveform. The user can then enter a chord, and the app will generate a "single-cycle chord", i.e. a perfectly loopable version of the chord. A perfect fifth (not a chord, but two notes), for instance, would be the original single cycle looped twice, mixed with a second copy of the single cycle transposed 7 semitones up and looped three times in the same timeframe.
What I can't find is how to transpose the wave simply by playing it faster, just like most basic samplers do. What would be a good way to do this in C#?
I've tried NAudio and CSCore, but can't find how to play the wave at a different pitch by playing it faster. The pitch-shift and varispeed examples are not what I'm looking for, because they either keep the length the same or keep the pitch the same.
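One approach, sketched under assumptions (NAudio, a 16-bit PCM source file with the placeholder name cycle.wav): relabel the same PCM samples with a higher sample rate, which is exactly the "play it faster" trick a basic sampler uses, then resample back to the original rate so the result can be mixed with the untransposed copy:

    using System;
    using System.IO;
    using NAudio.Wave;
    using NAudio.Wave.SampleProviders;

    // Load the single-cycle wave's raw PCM into memory.
    byte[] pcm;
    WaveFormat srcFormat;
    using (var reader = new WaveFileReader("cycle.wav"))
    {
        srcFormat = reader.WaveFormat;
        using var ms = new MemoryStream();
        reader.CopyTo(ms);
        pcm = ms.ToArray();
    }

    // "Play it faster": declare the same samples at a higher rate to raise the pitch.
    int semitones = 7; // a perfect fifth up
    int fasterRate = (int)Math.Round(srcFormat.SampleRate * Math.Pow(2, semitones / 12.0));
    var transposed = new RawSourceWaveStream(new MemoryStream(pcm),
        new WaveFormat(fasterRate, srcFormat.BitsPerSample, srcFormat.Channels));

    // Resample back to the original rate so it can be mixed with the root note.
    var resampled = new WdlResamplingSampleProvider(transposed.ToSampleProvider(), srcFormat.SampleRate);

Mixing the loops could then go through a MixingSampleProvider; note the transposed copy ends up slightly detuned from a perfect 3:2 ratio because the new sample rate is rounded to an integer.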
- How to do a WASAPI render using captured data
Hi, I just ran into the thread below, and that is exactly my question. Could anyone please help me with the MyAudioClass class and how to use it to render audio that has already been captured using WASAPI capture? Thanks.
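Without the referenced thread, here is a generic sketch of a capture-to-render loop in NAudio: push each captured buffer into a BufferedWaveProvider and let WasapiOut pull from it (the device choices and latency are assumptions):

    using NAudio.CoreAudioApi;
    using NAudio.Wave;

    var capture = new WasapiCapture(); // default capture endpoint
    var buffer = new BufferedWaveProvider(capture.WaveFormat)
    {
        DiscardOnBufferOverflow = true // drop data rather than throw if render falls behind
    };

    capture.DataAvailable += (s, e) => buffer.AddSamples(e.Buffer, 0, e.BytesRecorded);

    var render = new WasapiOut(AudioClientShareMode.Shared, 100); // default render endpoint
    render.Init(buffer);

    capture.StartRecording();
    render.Play();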
- Is the event handler in WPF run on the same thread as the event?
I'm still trying to get a good grasp on multithreading in WPF. When an event handler is attached to an event, is the handler run on the same thread that raises the event?
For example, in NAudio, it is my understanding that WasapiCapture works directly with the microphone device driver (code taken from GitHub):
    capture = new WasapiCapture(SelectedDevice);
    capture.ShareMode = ShareModeIndex == 0 ? AudioClientShareMode.Shared : AudioClientShareMode.Exclusive;
    capture.WaveFormat = SampleTypeIndex == 0
        ? WaveFormat.CreateIeeeFloatWaveFormat(sampleRate, channelCount)
        : new WaveFormat(sampleRate, bitDepth, channelCount);
    currentFileName = String.Format("NAudioDemo {0:yyy-MM-dd HH-mm-ss}.wav", DateTime.Now);
    RecordLevel = SelectedDevice.AudioEndpointVolume.MasterVolumeLevelScalar;
    // In NAudio, StartRecording will start capturing audio from an input device.
    // This gets audio samples from the capturing device. It does not record to an audio file.
    capture.StartRecording();
    capture.RecordingStopped += OnRecordingStopped;
    capture.DataAvailable += CaptureOnDataAvailable;
When the data buffers are full (?), WasapiCapture raises the DataAvailable event, to which the CaptureOnDataAvailable event handler is attached. In order to have the event handler update the UI, I've had to use the SynchronizationContext:
synchronizationContext = SynchronizationContext.Current;
and then:
    private void CaptureOnDataAvailable(object sender, WaveInEventArgs waveInEventArgs)
    {
        // I'M GUESSING I AM IN A BACKGROUND THREAD HERE?
        // Here, I am scheduling a UI update on the UI thread.
        synchronizationContext.Post(s => UpdateGraph(waveInEventArgs), null);
    }
So, am I correct to assume that an event handler runs on the same thread that raises the event it is attached to?
TIA
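For anyone wanting to check this empirically, a small sketch (assumptions: default capture device, a debugger attached): log Thread.CurrentThread.ManagedThreadId where the handler is attached and again inside the handler. DataAvailable handlers run on whatever thread WasapiCapture raises the event from, not on the thread that attached them:

    using System.Threading;
    using NAudio.Wave;

    // Record the id of the thread that attaches the handler (e.g. the UI thread).
    int attachingThreadId = Thread.CurrentThread.ManagedThreadId;

    var capture = new WasapiCapture();
    capture.DataAvailable += (s, e) =>
    {
        int handlerThreadId = Thread.CurrentThread.ManagedThreadId;
        // Typically prints two different ids: the handler runs on NAudio's capture thread.
        System.Diagnostics.Debug.WriteLine($"attached on {attachingThreadId}, raised on {handlerThreadId}");
    };
    capture.StartRecording();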