I would like to set up my own cloud at home using Nextcloud. Maybe later on my own Minecraft server, etc.
To get started, I could have my FRITZ!Box forward the ports and then reach the services via my MyFRITZ! address.
But that would be quite ugly and not very flexible. Therefore, I would like to put a reverse proxy (probably Nginx) in front of it and use DynDNS to come up with addresses via which I can then easily reach the services.
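For HTTP services, such a setup could look roughly like the following Nginx sketch. The hostnames, internal IPs and ports are all made-up examples, not anything from this thread — adjust them to your own DynDNS names and LAN layout:

```nginx
# Minimal reverse-proxy sketch: two name-based virtual hosts behind one
# public IP, each forwarding to a different machine on the LAN.
# All names and addresses below are hypothetical placeholders.
server {
    listen 80;
    server_name nextcloud.example.dyndns.org;   # hypothetical DynDNS name

    location / {
        proxy_pass http://192.168.178.20:8080;  # assumed internal Nextcloud host
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}

server {
    listen 80;
    server_name wiki.example.dyndns.org;        # another hypothetical name

    location / {
        proxy_pass http://192.168.178.21:8081;  # assumed second internal service
    }
}
```

Note that this only works for web services: the Minecraft game protocol itself is not HTTP, so the game server would still need plain port forwarding (or Nginx's separate stream module).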
It is clear that anyone who knows the addresses can then access them. But my question is more whether the addresses I set up can somehow be found, for example via Google? So could someone discover e.g. "http://www.MeineNextcloud.dyn.dns.com" without having to guess it?
Yes, assume that they are then public.
Some information does leak via DNS. Typically you secure your cloud services with HTTPS, and if you use proper certificates, e.g. from Let's Encrypt, then they are published for Certificate Transparency. You can query them here, for example: https://crt.sh/
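As a small illustration, this is roughly how you would build a crt.sh lookup for a hostname. The domain here is a hypothetical placeholder, not a real address from this thread:

```shell
# Certificate Transparency lookup via crt.sh.
# DOMAIN is a made-up example -- substitute the hostname you want to check.
DOMAIN="meinenextcloud.dyn.dns.com"
echo "https://crt.sh/?q=${DOMAIN}"
# With network access you could fetch the matching log entries as JSON:
# curl -s "https://crt.sh/?q=${DOMAIN}&output=json"
```

So every certificate issued for a subdomain ends up in a public, searchable log — the name is effectively published the moment you get a certificate for it.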
So clearly, regardless of whether the services run at home or in a data center:
offer them properly over HTTPS…
and secure them sensibly with a username, password and 2FA.
But that would be quite ugly and not very flexible. Therefore, I would like to put a reverse proxy (probably Nginx) in front of it and use DynDNS to come up with addresses via which I can then easily reach the services.
You don't need a reverse proxy for this. You already have port forwarding. Your DynDNS address points to your current public IP address, and the Fritzbox passes requests through to your server according to the port forwarding rules. This has worked for me for years.
It is clear that anyone who knows the addresses can then access them.
Logical. That is why Nextcloud naturally has authentication and brute force protection.
But my question is more whether the addresses I set up can somehow be found, for example via Google?
Google hardly finds anything on its own; it finds most pages through links. So if your Nextcloud isn't linked anywhere, Google won't find it anytime soon. I use a robots.txt to prevent indexing by search engines, because my Nextcloud shouldn't appear on Google at all — it's not a public service. Google and other search engines respect robots.txt. Simply put a file named robots.txt in the root directory of your web server with the following content:
User-agent: *
Disallow: /
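To make the step above concrete, here is a one-liner that creates exactly that file. It writes to the current directory; on a real server you would write it to the web root instead (e.g. /var/www/html — that path is an assumption, yours may differ):

```shell
# Create the robots.txt described above and show its contents.
# On a real server, write it to the web server's document root instead
# of the current directory (the root path depends on your setup).
printf 'User-agent: *\nDisallow: /\n' > robots.txt
cat robots.txt
```

Keep in mind that robots.txt only asks well-behaved crawlers to stay away; it is not an access control and does not replace the HTTPS-plus-authentication advice above.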