I find that an unfortunate choice of words. For instance, one thing this technology emphatically does not do is protect against the use of imagery from such a camera in AI training sets, where it could contribute to future generations of AI imagery. It is still possible to strip or simply ignore the embedded metadata and use and modify the images however one likes. Doing so will of course break the CAI certificate, but if the metadata is stripped from the image altogether, people will not even realize it was there to begin with.
This does make it somewhat easier for photographers to prove that an unaltered (or barely altered) image was created by them, and when, although it still requires actively hunting down illicit use or misuse.
I expect this sort of technology to become common among photojournalists, with other brands besides Leica jumping on the bandwagon, but overall it will not change much about how the general public sees and interprets photos, nor about how AI imagery continues to develop.
I regard this as the photographic equivalent of an ISO certification: as an integral part of a larger effort to guard the veracity of images it is somewhat useful, but by itself the feature doesn't do much. It relies on checks and balances elsewhere in the media chain to actually achieve anything.