I understand that as a website/domain owner (e.g. example.com), we can prevent images on our website from being indexed by Google (Googlebot / Googlebot-Image).
We could add a meta tag to the page, but that blocks indexing of the entire page, which is not what we want:
<meta name="robots" content="noindex" />
Or we could add a rule to robots.txt to block just the images:
User-agent: Googlebot-Image
Disallow: /images/
However, the image I'd like to block is hosted on S3 / Cloudinary. Unlike the examples given by Google, the path to the image does not belong to my domain (example.com); instead, it is a Cloudinary URL.
For example, I have a page at example.com/cute-cats. It contains an image of a cat that belongs to me, but the image is uploaded to and hosted on Cloudinary (e.g. https://res.cloudinary.com/xyz123/image/upload/xxxyyyzzz111222/cute-cat.jpg).
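To make the setup concrete, the page markup is essentially just a normal img tag pointing at the Cloudinary URL (a simplified, illustrative sketch; the real page has more around it):

```html
<!-- example.com/cute-cats (simplified, illustrative) -->
<!DOCTYPE html>
<html>
  <head>
    <title>Cute Cats</title>
  </head>
  <body>
    <!-- The image itself is served from Cloudinary's domain, not example.com,
         so example.com's robots.txt never applies to the image URL -->
    <img src="https://res.cloudinary.com/xyz123/image/upload/xxxyyyzzz111222/cute-cat.jpg"
         alt="A cute cat" />
  </body>
</html>
```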
How do I remove such an image if it has already been indexed and shows up in search results? The image is currently used as a thumbnail on Google Images that links back to example.com/cute-cats.
How do I prevent these images from being indexed again in the future?