Catherine Berry

@J12t

So an extension of robots.txt, essentially. "Only read this if your planned use of the data conforms with <license>." And crawlers could be made aware of what licenses they should not index.

Of course, this relies on compliance under threat of a lawsuit if a crawler operator is caught misusing your data. My sense is that this would scare off only small companies. Large companies have enough good lawyers to reliably dodge lawsuits, while for state actors, lawsuits aren't a threat at all.
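A sketch of what such an extension might look like, purely for illustration; the `License` and `Disallow-unless-licensed` directives below are invented for this example and are not part of the robots.txt standard or any existing proposal:

```
# robots.txt -- hypothetical license-aware extension (illustrative only)
User-agent: *
# Invented directive: the data may only be used in ways that
# conform with the named license.
License: CC-BY-NC-4.0
# Invented directive: crawlers whose planned use does not conform
# with the license above must not read these paths.
Disallow-unless-licensed: /posts/
```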

Johannes Ernst

@isomeme robots.txt++++. Much more detailed, and on a per-post basis, not site-wide.

And to avoid a DNT-style let's-just-ignore-it disaster, ideally the whole thing would be subject to a Ricardian contract framework that anchors the post markup in contract law.
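On a per-post basis, that markup might look something like the following sketch; the property names and the contract URL are invented for illustration and do not correspond to any existing vocabulary:

```html
<!-- Hypothetical per-post crawl-license markup (invented properties, illustrative only) -->
<article>
  <!-- License the crawler's use must conform with -->
  <meta itemprop="crawl-license" content="CC-BY-NC-4.0">
  <!-- Machine- and human-readable contract the markup is anchored to -->
  <link itemprop="crawl-contract" href="https://example.com/contracts/crawl-terms.json">
  <p>Post content…</p>
</article>
```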
