I had a website set up purely for my own personal use, and according to the logs the only activity I ever saw was my own. It does involve a compromise, though: obscurity at the cost of accessibility and convenience.
First, when I set up my SSL cert, I chose a wildcard certificate. That way I could use a random subdomain name and it wouldn't show up in the certificate transparency logs at https://crt.sh/ (an individual subdomain cert would).
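The random-subdomain idea can be sketched like this. This is just an illustration of my approach, not a specific tool; `domainname.com` stands in for the real domain. A wildcard cert for `*.domainname.com` covers any label, so the label itself never appears in certificate transparency logs.

```python
import secrets

def random_subdomain(domain: str = "domainname.com", nbytes: int = 8) -> str:
    # 8 random bytes -> 16 hex characters; effectively unguessable
    # by subdomain enumeration or dictionary wordlists.
    label = secrets.token_hex(nbytes)
    return f"{label}.{domain}"

print(random_subdomain())  # e.g. 3f9c0a1b2d4e5f67.domainname.com
```

Point the DNS record at that name once and bookmark it; nothing public ever links to it.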
Second, I use an uncommon port. My needs are very low, so I don't need to access my site all the time; it's just a fun little hobby for myself. That also means I'm not worried about reaching the site from places or businesses that block uncommon ports.
Accessing my site through a browser looks like: https://randomsubdomain.domainname.com:4444/
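On the server side, the whole setup boils down to a config fragment like this minimal nginx sketch (the cert paths, port, and subdomain are placeholders, not my actual values):

```nginx
server {
    # Listen only on the uncommon port, TLS only.
    listen 4444 ssl;

    # Answer only for the random subdomain.
    server_name randomsubdomain.domainname.com;

    # Wildcard cert for *.domainname.com (hypothetical paths).
    ssl_certificate     /etc/ssl/domainname.com/fullchain.pem;
    ssl_certificate_key /etc/ssl/domainname.com/privkey.pem;

    root /var/www/site;
}
```

Nothing listens on 80 or 443 at all, so there is nothing for a common-port scan to connect to.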
I'm going on the assumption that scrapers and crawlers search common ports to maximize the number of sites they can reach, rather than wasting time probing uncommon ones.
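That assumption can be made concrete with a toy model. The port list below is illustrative, not any real scanner's list, but the logic is the same: a scanner that probes only popular ports simply never connects to 4444.

```python
# Illustrative subset of commonly scanned ports (not a real scanner's list).
COMMON_PORTS = {21, 22, 25, 53, 80, 110, 143, 443, 993, 995, 3306, 8080, 8443}

def scanner_would_probe(port: int) -> bool:
    # A breadth-first scanner checks each host only on its fixed port list.
    return port in COMMON_PORTS

print(scanner_would_probe(443))   # True
print(scanner_would_probe(4444))  # False
```

A full 0-65535 sweep of every host would find the site, but that costs a scanner thousands of times more probes per host, which is exactly the economics this setup bets on.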
If you're hosting on the common ports (80, 443), this approach won't help at all; you'd likely need some sort of third-party service to manage scrapers and crawlers. For me, I get to enjoy my tiny corner of the internet with minimal effort and worry. (Except that my hard drive died recently, so I'll pick the site back up in January when I'm not focused on other projects.)
I'm sure that, given enough time, something will find my site. The game I'm playing is seeing how long that takes.