I think you have a mistaken understanding of software auditing. Software can be closed source while third-party auditors assess whether it has sound privacy and security implementations.
Being closed source doesn't necessarily mean it's bad (for privacy/security).
But then you have to trust: 1) the auditors (I assume by your comment you mean people given closed-door access to the code, who review it and then publish a statement saying the vendor's claims are valid, that kind of third-party auditing?); 2) that the code disclosed to the auditors is the actual, complete codebase; 3) that nothing fishy was added between the audited version and the next; and last but not least, 4) that the binaries they give you are actually built from that codebase and nothing else, since you can't build it yourself if you're really that worried.
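For contrast, point 4 is exactly what reproducible builds solve when the source *is* available: build the binary yourself and compare hashes with what the vendor ships. A minimal sketch of that comparison (the file names here are placeholders, not a real project):

```shell
# Stand-ins for the two artifacts being compared; in practice my_build
# would come from compiling the audited source tree deterministically,
# and vendor_bin would be the binary the vendor distributes.
printf 'binary' > my_build
printf 'binary' > vendor_bin

# With a deterministic build, identical inputs must yield identical hashes.
if [ "$(sha256sum < my_build)" = "$(sha256sum < vendor_bin)" ]; then
  echo "MATCH: shipped binary corresponds to the audited source"
else
  echo "MISMATCH: shipped binary cannot be tied to the audited source"
fi
```

With closed source you simply can't run this check, which is why point 4 stays a matter of trust rather than verification.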
I don't fully disagree that a third-party app can be private and secure, sure it can, but I'd argue there are some really big hurdles and you can never have 100% trust in it. Whether those hurdles are dealbreakers depends on your own values, opinions, and threat model, of course.