Ed Santo
I've been holding off, like handing him the kind of magical keys to the Hyundai.
That's fair.
Yeah, yeah.
But let's say I give him the key and he does what he will do, which is immediately crash it.
Is that Joseph's fault or is it my fault?
I mean, clearly that's my fault.
That's my responsibility.
And so the first answer to your question is that companies who choose to use AI that's opaque, that hasn't been tested properly, all of that sort of thing, that is on them.
It's their responsibility to make sure that the technology is safe before they use it on the community.
But there is that second point, right?
So if the technology itself that is developed by those big tech companies, mostly overseas, is itself defective or somehow kind of negligently created or something like that, then of course that is on them.
And the beauty, I say this very much as a lawyer, the beauty of liability is that it's something that can be shared.
Yeah, there's a huge rise in employers monitoring and in some cases surveilling their workers.
Now, we think about that primarily in very bespoke ways.
Like maybe you're a truck driver and you have this kind of camera that's watching you to determine whether you're showing signs of fatigue.
Actually, the real phenomenon is different.
So even some piece of technology like Microsoft Teams, which is almost ubiquitous, that actually has a whole bunch of settings built in that the employer can operate to actually start to monitor how you're working.
And what we're saying is that there are good, totally legitimate forms of monitoring workers, and we support that.