This is the main layout of a sitemap.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9 http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd">
  <url>
    <loc>https://yoursitedomain/url</loc>
    <lastmod>2020-08-25T00:44:15+00:00</lastmod>
    <priority>1.00</priority>
  </url>
  <url>
    <loc>https://yoursitedomain/...</loc>
    <lastmod>2020-08-25T00:44:15+00:00</lastmod>
    <priority>1.00</priority>
  </url>
</urlset>
Repeat the <url> block for every page of your site. The best way is to let a cronjob regenerate the sitemap file
every X days, listing all available pages referring to content you want indexed. The <priority> tag is mostly ignored by
crawlers. The <lastmod> element tells a crawler the last time a page was edited. The following code could be the layout of your robots.txt file linking to the sitemap. This robots file is configured to allow crawlers to crawl all pages on your site:
Sitemap: https://yoursitedomain/sitemap.xml
User-Agent: *
Allow: /
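The cronjob-generated sitemap mentioned above could be sketched as a small Python script. This is a minimal example, not a definitive implementation: the PAGES list, the build_sitemap function name, and the output path are all assumptions you would replace with a crawl of your own site or a query against your CMS.

```python
from datetime import datetime, timezone

# Hypothetical page list -- in practice, fill this from your CMS
# database or by crawling your own site.
PAGES = [
    "https://yoursitedomain/",
    "https://yoursitedomain/about",
]

def build_sitemap(urls):
    """Return sitemap.xml content for the given list of page URLs."""
    # Use the current UTC time as <lastmod> for every entry; a real
    # generator would track per-page modification times instead.
    lastmod = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S+00:00")
    entries = "\n".join(
        "  <url>\n"
        f"    <loc>{url}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        "    <priority>1.00</priority>\n"
        "  </url>"
        for url in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

if __name__ == "__main__":
    # Write the file where the web server serves it, e.g. the docroot.
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(build_sitemap(PAGES))
```

To run it "every X days", a crontab entry such as `0 3 */3 * * python3 /path/to/generate_sitemap.py` (every third day at 03:00; path and schedule are placeholders) would keep the file fresh without manual edits.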