@brianonbarrington @Wolven Ah, OK. An *attempt* to cut out the human, initially forgetting that you don't actually have any non-human data to fit to.
Is the problem of time discussed much here, by the way? If you're fitting to 30-year-old data, you're fitting to a different society. Thirty years ago is not a very reliable guide to what's going on in any specific neighbourhood (at least not here in Helsinki).
@TorbjornBjorkman @brianonbarrington @Wolven How else should an AI/algorithm be trained? If not on existing datasets (the larger the better), then how?
The cost of orienting a tool like this to an otherwise human task would likely be enormous, no? And it likely wouldn't solve the problem anyway.
The real discrepancy is knowledge and understanding of human bias, which requires awareness and acknowledgement of that bias.
How easy is it to assemble a team of engineers versed in such things? A team skilled in countering them might well need fundamentally diverse backgrounds and experience, but how does that kind of approach square with a typical management team or, indeed, with the culture more broadly?
It seems very much like a paradigm shift is required.