That's fair, though it also highlights another issue: since these foundation models are trained on human-generated content, the training data itself is also littered with these "ambiguous" problem statements, which can bias the results. Most of the time it's a "feature, not a bug," since people say things that are technically wrong but semantically understandable to other humans. Foundation models are something of a mixed beast.
I wouldn't overindex on the increasing / decreasing issue, as that's a little ambiguous.
I'd use language that's the opposite of yours to describe changing an H2 to an H1 (increasing the level despite decreasing the number). According to the HTML spec (https://html.spec.whatwg.org/multipage/sections.html#headings-and-outlines), I'm technically wrong, but wrong in a conceivably understandable way: I think of levels as a nesting, with the top of the nesting being the highest level (albeit the one with the smallest number).
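To make that concrete, here's a toy sketch of how I picture it (my own example, not something from the spec): promoting the h2 below to an h1 would move it up the outline, i.e. "increase" its level in my sense, even though the number goes down.

```html
<!-- My own toy example, not from the spec: the "nesting" reading of heading levels. -->
<body>
  <h1>Document title: top of the outline (level 1, smallest number)</h1>
  <section>
    <h2>Subsection, nested one level down (level 2)</h2>
    <section>
      <h3>Sub-subsection, nested two levels down (level 3)</h3>
    </section>
  </section>
</body>
```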
hey! our startup has also pivoted into this space recently. shoot me an email: chris@lunasec.io or you can join our discord https://discord.gg/znyraHeTBt
i think we're already connected :)
meeting up with free later this week
oh kevin! i didn’t even realize hahaha
Super interesting! First example I've seen of the complete process. Inspiring.