I’ve written about how Large Language Model AI is great at generating text but not necessarily accurate when it comes to facts, because it’s essentially just stringing together grammatically plausible words. When people don’t understand this, there can be embarrassing failures, like lawyers citing case numbers that don’t exist.
A scarier prospect is students being expelled for plagiarism because an AI was asked to be the judge and falsely claimed to have written their work.
Here’s a report of a teacher who seems to think that running portions of a student’s essay through ChatGPT will let ChatGPT tell them whether the AI wrote the paper. ChatGPT doesn’t know; it will simply give you a confident answer that may not be true!
https://x.com/paleofuture/status/1658227685548785665
Students need to be aware that if they use AI, then teachers can too.
Recently, we were using a third-party API and realised it was accepting a message it should have rejected, because the content wasn’t signed with the correct digital signature. We had previously been using the Sandbox environment, which we assumed performed essentially no validation, but we were now sending the same message to the Integration environment and expected a Rejected status.
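To make the expectation concrete, here is a minimal sketch of the kind of signature check we assumed the Integration environment would perform. The HMAC-SHA256 scheme, the key, and all the names here are my own illustration; the vendor’s actual signing algorithm and API aren’t documented in this post.

```python
import hashlib
import hmac

# Placeholder shared secret for illustration only -- not a real credential.
SECRET_KEY = b"shared-secret"

def verify_signature(body: bytes, signature_hex: str) -> bool:
    """Return True only if the signature matches the message body."""
    expected = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(expected, signature_hex)

body = b'{"order": 123}'
good = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()

print(verify_signature(body, good))        # valid signature: accepted
print(verify_signature(body, "deadbeef"))  # bad signature: should be Rejected
```

A Sandbox that skips this check entirely is plausible for easy testing, but an Integration environment that accepts a wrong signature is exactly the behaviour we wanted the vendor to confirm or deny.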
So we logged a helpdesk ticket with them and quickly got a reply. It was a long-winded answer that basically agreed with us that this was incorrect behaviour. The wording was clearly AI-generated, but at least they admitted they had just used AI.
It might have been the case that their technical staff were unavailable so a manager made an attempt to deal with it without actually reading or understanding the problem at all.
> I have just ran this through the chatbot and have had the following response:
>
> The details provided do not specifically mention the behavior of the Order endpoint in relation to the signature from the example request, especially in the context of different environments such as INT. However, it is common in software development to have different configurations for different environments such as development, testing (often referred to as int for integration), staging, and production.
>
> In many cases, the integration or testing environment is configured to be more permissive to facilitate testing and debugging, which might include accepting a static example signature for all requests. This would not be the case in a production environment, where security and proper verification of digital signatures are critical for ensuring non-repudiation and the integrity of the order.
>
> Based on standard practices, if the order endpoint is accepting a specific example signature in an integration environment, it could be by design for testing purposes. However, for a production environment, this would be a significant security risk and would not be by design. Since the provided documentation does not explicitly confirm the intentional behavior for the order endpoint in the int environment, it is advisable to refer to the specific implementation details or consult with the development team responsible for the API to understand the intended behavior.
>
> Can you confirm if this answers your query, please?