I am several hundred opossums in a trench coat

  • 1 Post
  • 35 Comments
Joined 1 year ago
Cake day: July 1st, 2023



  • How much computing power do you think it takes to approximately recognise a predefined word or phrase? They do that locally, on device, and then stream whatever audio follows to more powerful computers in AWS (the cloud). To get ahead of whatever conspiratorial crap you’re about to say next, Alexa devices are not powerful enough to transcribe arbitrary speech.
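    The on-device step really is that cheap: it is closer to a template match than to transcription. A toy illustration in Python of the idea (the feature vectors, template, and threshold here are all invented for illustration; real devices use a small trained keyword-spotting model, not cosine similarity):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical stored wake-word acoustic template (e.g. averaged
# spectral frames); real devices ship a small trained model instead.
TEMPLATE = [0.9, 0.1, 0.8, 0.2]
THRESHOLD = 0.95  # invented value

def heard_wake_word(frame):
    """Cheap local check: does this audio frame resemble the template?
    Only when this fires would audio start streaming to the cloud."""
    return cosine(frame, TEMPLATE) >= THRESHOLD

print(heard_wake_word([0.88, 0.12, 0.79, 0.21]))  # close match -> True
print(heard_wake_word([0.1, 0.9, 0.2, 0.8]))      # dissimilar audio -> False
```

    Matching one fixed pattern like this takes a few multiplications per frame; transcribing arbitrary speech takes orders of magnitude more compute, which is exactly why it happens in AWS and not on the device.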

    Again, to repeat: people smarter than you and me have analysed the network traffic from Alexa devices and independently verified that they do not stream audio (or transcripts) unless they have heard something close enough to the wake word to trigger the fairly primitive on-device matching (primitive because it’s cheap, not for conspiracy reasons). I have also observed this, albeit with less rigorous methodology. You can check this yourself, so why don’t you, and verify for yourself whether this conspiracy holds up?
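    The check described above boils down to watching the device's upstream traffic: audio streaming shows up as sustained throughput, while an idle device only sends small periodic keepalives. A minimal sketch, assuming you have already captured (timestamp, bytes) pairs for the device's outbound packets (the window and threshold values are invented; the published analyses use proper capture tooling like tcpdump on a mirrored port):

```python
def is_streaming(packets, window=1.0, threshold=50_000):
    """Flag sustained upstream throughput that would indicate audio
    streaming, as opposed to small periodic keepalive traffic.

    packets: list of (timestamp_seconds, payload_bytes) tuples.
    Returns True if any `window`-second span carries more than
    `threshold` bytes (both values invented for illustration).
    """
    packets = sorted(packets)
    start = 0
    total = 0
    for end in range(len(packets)):
        total += packets[end][1]
        # Shrink the window from the left until it spans <= `window` seconds.
        while packets[end][0] - packets[start][0] > window:
            total -= packets[start][1]
            start += 1
        if total > threshold:
            return True
    return False

# Idle device: a tiny keepalive every 30 s, nowhere near streaming rates.
idle = [(t * 30.0, 200) for t in range(10)]
# After a wake word: ~16 kB of audio every 0.1 s for a few seconds.
active = [(t * 0.1, 16_000) for t in range(30)]

print(is_streaming(idle))    # False
print(is_streaming(active))  # True
```

    An always-listening device would look like `active` all the time; the observed behaviour is `idle` except immediately after a wake word.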

  • Can you explain to me exactly how moving where profit is recorded from one division to another in the same organization reduces their tax burden? Because, excuse me, I know I only did a year or two of accounting courses before dropping the degree, but that’s not how I understand taxes to work.

    Also, to be turning a profit by “doing well collecting data”, the open-market value of the data Alexa alone generates annually would need to be around 8% of the entire global data market. If you can justify how millions of instances of “Alexa, set a timer for 10 minutes”, “Alexa, what is the weather?”, or “Alexa, play Despacito” generate that much value, maybe you have a point.
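    For anyone who wants the back-of-envelope behind that 8%: take the roughly $25 billion the division would need to recoup annually, and divide by the size of the global data market, which I'm assuming here to be around $300 billion (my assumption; published estimates vary widely):

```python
alexa_required_value = 25e9   # ~$25B: what the data would need to be worth per year
global_data_market = 300e9    # rough assumption; published estimates vary widely

share = alexa_required_value / global_data_market
print(f"{share:.1%}")  # -> 8.3%
```

    Even generous assumptions about the market size leave Alexa needing to single-handedly account for a wildly implausible slice of it.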



  • having an always on listening device in someone’s home

    They very explicitly do not collect audio when you haven’t used a wake word or activated it some other way. They will not “know what is discussed within the house for data on ad penetration/reach” (which is pretty much the only valuable data you’ve mentioned here), nor will they “have a backchannel to television viewing and music listening patterns” unless you actively discuss it with your device.

    I’m not going to put words in your mouth, but if whoever reads this is thinking of replying “are you going to trust that”, etc., then yes, I am. We can track which data an Alexa transmits in real time and directly verify that this “always listening” isn’t happening. Even if we couldn’t independently verify that this is the case, and let’s say they contradict their privacy policy and public statements and do it anyway, that’s a crazy liability nightmare. Amazon has more than enough lawyers to know that recording someone without consent and using that data is very illegal in most places, and it would open them up to so many lawsuits if they accidentally leaked or mishandled the data. Take the conspiracy hat off and put your thinking cap on.

    Send it to cheap overseas transcribers, use it to train and improve voice recognition and automatic transcription.

    Bad for privacy, but also not a $25 billion source of revenue.

    Alexa, Google Home, and Siri devices are not good sources of data. If they were, why would Google, king of kings when it comes to data collection, be cutting their Assistant teams so much?

  • Ok, so they do that. Here are some things that can plausibly go wrong:

    • Are the people posting the story to the funding platform anonymous? If they are, no one will fund a story based on a one-line description with no details. If the authors are known, any company engaging in the practice will be watching them like a hawk (essentially making investigation impossible)
    • The company engaging in the practice assumes the investigation is aimed at them and temporarily stops double billing until the journalists run out of budget and everything blows over. They then resume double billing.
    • The company engaging in the practice assumes the investigation is aimed at them and consequently intimidates would-be whistleblowers into staying silent, basically preventing any progress
    • The company intentionally floods “Kickstarter for News” with spurious stories to drown out the item about them
    • The story isn’t funded because it doesn’t agree with the preconceived notions of enough users, who are only willing to fund content matching their own worldview
    • The story isn’t funded because, while people found it important, more attention was placed on a story that agreed with the preconceived notions of enough users
    • The stories that do get funded skew heavily towards the material conditions of the wealthy (more so than now), since the wealthy are the only ones with enough disposable income to fund content. Content focused on the conditions of the poor and marginalised is therefore, ironically, marginalised
    • Unable to be subsidized by less prestigious entertainment content (like traditional investigative journalism was), the required upfront cost for stories balloons to a size not feasibly collected by donations
    • The wider population becomes apathetic to the platform as a whole (people have actual jobs and lives, and may not have the time to trawl through potential stories for something they want to fund), leaving only the extremely wealthy/powerful to fund stories. As a consequence the media is even more controlled by the elite than it is currently
    • It turns out there was never a story, and those that donated feel burned and are less likely to donate in the future
    • It turns out there was never a story, and, feeling pressure to produce something, the journalists intentionally misconstrue the truth

    I think a crowdsourced approach is a great idea, but only in the sense that my tax dollars go to independent news organisations.




  • Likewise, an open source project can absolutely die if it refuses to engage with the needs of its users. The lack of moderation and content-management tools has been a longstanding criticism of Lemmy, and instances will migrate to alternatives that address these concerns. It is a genuine legal liability for instance operators if they are unable to promptly delete CSAM/illegal content or comply with EU regulations.