Hiding my website from Google search results

Hello everyone,

I have a website that I don't want to show up in Google searches.
I only want people with the link to be able to access it. I know there is a way to password-protect the website, but that doesn't look good.

I did a quick Google search and the suggestion was to create a robots.txt file and upload it to the root directory of my project.

Is this the right way to do what I want? I didn't understand what it meant by removing a page from the index.

Has anyone done this? If there is another way of doing it, please let me know!!

Thank you for your time!

Here is my site Read-Only: LINK

Yes, you can use a robots.txt file to ask search engines not to crawl your website. This file tells search engine bots which pages they may crawl and which ones to ignore.

To block search engines from crawling your entire website, you can add the following lines to your robots.txt file:

User-agent: *
Disallow: /

This tells all user agents (i.e., search engine bots) to stay away from every page on your website. However, it's worth noting that while this keeps compliant crawlers out of your site, it won't prevent people from accessing it if they have the direct URL.
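If you want to sanity-check how a well-behaved crawler interprets those two lines, Python's standard library ships a robots.txt parser. This is just an illustration (the example.com URLs are placeholders), but it shows that the rules above block every path for every bot:

```python
from urllib.robotparser import RobotFileParser

# Parse the same rules shown above, the way a compliant crawler would
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# With "Disallow: /", every path is off-limits to every honoring bot
print(rp.can_fetch("Googlebot", "https://example.com/secret-page"))  # False
print(rp.can_fetch("*", "https://example.com/"))  # False
```

You can also point `RobotFileParser` at a live file with `set_url(...)` and `read()` to check your deployed robots.txt the same way.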

One important caveat: robots.txt only tells a bot what you would prefer not to be crawled; it's up to each bot to honor your exclusion(s). The big search engines do, but some smaller, nefarious ones don't. Also note that robots.txt controls crawling, not indexing: if another site links to one of your pages, Google can still list the bare URL in results even though it never crawled the page. And none of this stops someone from scraping your page(s) and republishing them elsewhere, where they may get indexed, but that's a separate issue.
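If the goal is to keep pages out of Google's index even when other sites link to them, Google also honors a noindex signal, either a meta tag in the page head or an X-Robots-Tag response header (the page must remain crawlable for Google to see it). Here's a minimal sketch using Python's built-in http.server; the handler name, page body, and port are just for illustration, and a real site would set this header in its web server or framework config instead:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class NoIndexHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>Link-only page</body></html>"
        self.send_response(200)
        # Ask search engines not to index this page or follow its links
        self.send_header("X-Robots-Tag", "noindex, nofollow")
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To try it locally:
# HTTPServer(("localhost", 8000), NoIndexHandler).serve_forever()
```

The equivalent in-page version is a meta tag like <meta name="robots" content="noindex"> in each page's head, which is often easier if you can't change server headers.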