I've implemented a few of these and that's about the most lazy implementation possible. That system prompt must be 4 words and a crayon drawing. No jailbreak protection, no conversation alignment, no blocking of conversation atypical requests? Amateur hour, but I bet someone got paid.
(Assuming US jurisdiction) Because you don't want to be the first test case under the Computer Fraud and Abuse Act where the prosecutor argues that circumventing restrictions on a company's AI assistant constitutes
ntentionally ... Exceed[ing] authorized access, and thereby ... obtain[ing] information from any protected computer
Granted, the odds are low YOU will be the test case, but that case is coming.