Update 3:55 pm ET: We've added a comment from a Microsoft spokesperson.
While some recordings were initiated by "Xbox" or "Hey Cortana" voice commands, others were captured by mistake, according to a former contractor.
"Most of the Xbox related stuff I can recall doing was obviously unintentional activations with people telling Cortana 'No' as they were obviously in the middle of a game and doing normal game chat," a current contractor said to Motherboard.
This might remind you of the recent revelations about Apple's Siri. After several short, private audio snippets were picked up by what were likely unintended triggers, an anonymous Apple contractor came forward with concerns about the lack of disclosure.
In both cases (and the cases of companies like Google and Amazon, too) these contractors were hired to rate and fine-tune the voice assistants' command features and accuracy at recognizing and understanding human speech.
"As time went on, we got less apparently accidental stuff as the feature improved," a former contractor said.
Microsoft also offers a way for users to delete any audio recordings saved from their devices.
When we asked Microsoft about the report, this was its response:
"We always get customer permission before collecting voice data, we take steps to de-identify voice snippets being reviewed to protect people’s privacy, and we require that handling of this data be held to the highest privacy standards in the law. At the same time, we’re actively working on additional steps we can take to give customers more transparency and more control over how their data is used to improve products.”