Stop struggling with Google Search Console errors. Get your sitemap indexed properly and watch your organic traffic grow.
Try IndexJump Free
Explore our expertise in solving sitemap and indexing issues
Sitemaps play a crucial role in the structure and visibility of a website. They provide search engines with a roadmap to all relevant pages, ensuring that content is indexed efficiently and accurately. Without a properly functioning sitemap, search engines may miss important pages or experience difficulties in crawling a website's structure.
When a sitemap cannot be read, it can lead to reduced visibility in search results, delays in indexing new content, and even errors in search engine diagnostics. Addressing this issue promptly is essential for maintaining and improving a website's SEO performance.
One of the most frequent reasons for this error is an improperly formatted sitemap. Sitemaps need to adhere to XML standards, including proper syntax, encoding, and structure. If the sitemap contains syntax errors or incompatible elements, search engines may fail to parse or read it.
Server outages, misconfigured permissions, or network issues can prevent search engines from accessing the sitemap file. If the sitemap URL is inaccessible due to these issues, the error will occur.
The URL provided for the sitemap might be incorrect or outdated. If the specified location does not exist or the link is broken, search engines cannot retrieve it.
Incorrect permissions, such as restrictive access controls or firewall settings, can block search engines from downloading the sitemap file. Ensuring that the sitemap is publicly accessible is essential.
Large sitemaps exceeding search engine limits or not properly compressed may cause read errors. It is important to segment extensive sitemaps into smaller ones and use gzip compression where appropriate.
The first step is to verify that the sitemap is XML-compliant and free of syntax errors. Online tools such as XML validators can assist in checking the correctness of the file before submission.
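As a minimal sketch, the same well-formedness check can be done locally with Python's standard `xml.etree.ElementTree` parser before you ever submit the file (the sample sitemap content here is a placeholder):

```python
import xml.etree.ElementTree as ET

def check_sitemap_syntax(xml_text):
    """Return None if the sitemap parses cleanly, else the parser's error message."""
    try:
        ET.fromstring(xml_text)
        return None
    except ET.ParseError as err:
        return str(err)

good = ('<?xml version="1.0" encoding="UTF-8"?>'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        '<url><loc>https://example.com/</loc></url></urlset>')
bad = good.replace("</urlset>", "")  # simulate an unclosed root tag

print(check_sitemap_syntax(good))  # None: well-formed
print(check_sitemap_syntax(bad))   # a parse-error message pointing at the problem
```

A parser error here is the same class of problem that makes a search engine report the sitemap as unreadable, so fixing it locally is faster than resubmitting blind.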
Ensure the URL provided for the sitemap is correct and accessible via a web browser. Confirm there are no typos and that the server responds with a 200 OK status code.
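A quick programmatic version of this check might look like the following sketch, which fetches the URL and reports whatever status code the server returns (the User-Agent string is an arbitrary label of our own, not anything search engines require):

```python
import urllib.error
import urllib.request

def sitemap_status(url, timeout=10):
    """Fetch the sitemap URL and return the HTTP status code (200 means OK)."""
    req = urllib.request.Request(url, headers={"User-Agent": "sitemap-check/1.0"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 404 or 500; URLError is raised for unreachable hosts

# Example: sitemap_status("https://example.com/sitemap.xml") should return 200
```

Anything other than 200 at this stage explains the read error on its own, and server-side fixes come before anything else.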
Make sure the sitemap file is stored in a publicly accessible directory and that permissions are configured correctly. Avoid restrictions that could block search engine crawlers.
Utilize tools like Google Search Console to submit the sitemap and receive detailed reports. If errors persist, the Search Console often provides specific guidance on fixing issues.
If the sitemap is large, split it into smaller, manageable files and update the sitemap index accordingly. This improves the likelihood of successful reading and indexing.
Keep your sitemap updated with new content and remove outdated pages. Regular monitoring ensures it remains accurate and accessible.
Stick to the official XML Sitemap protocol and always gzip compress large files to enhance performance and readability.
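For the compression step, Python's standard `gzip` module is enough to produce a `sitemap.xml.gz` that search engines can fetch (the sitemap content below is a placeholder; the uncompressed file must still respect the size limits):

```python
import gzip

sitemap_xml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    '  <url><loc>https://example.com/</loc></url>\n'
    '</urlset>\n'
)

# Write the compressed copy that you would then reference in robots.txt
# or submit in Search Console.
with gzip.open("sitemap.xml.gz", "wt", encoding="utf-8") as f:
    f.write(sitemap_xml)

# Verify the round trip before submitting
with gzip.open("sitemap.xml.gz", "rt", encoding="utf-8") as f:
    assert f.read() == sitemap_xml
```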
Set up alerts for server issues or errors related to sitemap accessibility to respond swiftly and minimize impact.
Addressing a "sitemap could not be read" error involves checking file format, verifying URL correctness, ensuring accessibility, and maintaining server configurations. By following best practices, website owners can ensure their sitemaps serve as effective tools for search engine indexing and overall site health.
Google Search Console (GSC) is an essential tool for website owners and SEO professionals. It provides insights into how Google indexes your site, including the status of your sitemaps. Occasionally, users encounter a message indicating that their sitemap "could not be read." This notification can be concerning, but understanding the causes and solutions can help restore proper sitemap submission and ensure your website's information is accurately indexed.
A sitemap is an XML file that lists all the important pages on your website. It acts as a roadmap for search engines, helping them discover and crawl content more effectively. Sitemaps improve the visibility of new or updated pages and can include metadata such as publication dates and priority levels. Properly submitted sitemaps facilitate better SEO performance and ensure comprehensive indexing.
The sitemap file must conform to the XML standards specified by Google. If the format is invalid or contains errors, GSC cannot read it. Common issues include syntax errors, missing required tags, or improper nesting.
The sitemap file must be accessible by Googlebot. If the file is blocked by robots.txt, password protected, or hosted on a server with restrictions, GSC will be unable to read it.
The URL submitted in GSC might be mistyped, outdated, or pointing to a non-existent file, resulting in errors when Google attempts to retrieve the sitemap.
If your server returns errors such as 404 (Not Found) or 500 (Server Error), Google cannot access the sitemap during its crawls.
Sitemaps exceeding Google’s limits (50,000 URLs or 50MB uncompressed) require splitting into multiple sitemaps or using a sitemap index. Failure to do so can cause reading errors.
The server should serve the sitemap with the correct Content-Type header, either 'application/xml' or 'text/xml'. A mismatched MIME type can lead to read errors.
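A small helper can check whether a Content-Type header value looks acceptable for a sitemap; the accepted set below is an assumption covering plain and gzipped XML, and the charset parameter is deliberately ignored:

```python
ACCEPTED = {"application/xml", "text/xml", "application/gzip", "application/x-gzip"}

def content_type_ok(header_value):
    """True if the Content-Type header announces an XML (or gzipped) sitemap."""
    if not header_value:
        return False
    mime = header_value.split(";")[0].strip().lower()  # drop "; charset=..." parameters
    return mime in ACCEPTED

print(content_type_ok("text/xml; charset=UTF-8"))  # True
print(content_type_ok("text/html"))                # False: a common misconfiguration
```

Serving the sitemap as `text/html` (for example, when an error page or the homepage is returned at the sitemap URL) is one of the quieter causes of this error, so it is worth checking explicitly.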
When submitting your sitemap, ensure that the URL is typed exactly, the file is publicly accessible to Googlebot, and the server returns it with a valid XML content type.
Regular monitoring and maintenance of your sitemaps can prevent "could not be read" errors and improve your website’s SEO performance.
Having a sitemap is an essential aspect of website SEO, allowing search engines like Google to crawl and index your site efficiently. However, webmasters often encounter issues when submitting their sitemap, one of the most common being the "Google sitemap could not be read" error. This article aims to provide a comprehensive overview of this problem, its causes, and practical solutions to help you resolve it effectively.
A sitemap is a file that provides information about the pages, videos, and other files on your website, and the relationships between them. Search engines use sitemaps to understand your website’s structure, ensuring that all pages are discoverable and properly indexed. An up-to-date sitemap can significantly improve your site's SEO performance by making sure that new or updated content is recognized quickly.
Google supports several sitemap formats, but the most common is the XML sitemap, which must adhere to specific standards to be valid and readable by Google’s crawler. When there are issues with the sitemap file format or accessibility, Google may display errors such as "sitemap could not be read".
If the XML sitemap contains syntax errors, such as mismatched tags, incorrect encoding, or non-compliant structure, Google cannot parse the file. Validation errors are a frequent source of this problem.
The URL submitted to Google Search Console must be accurate and accessible. Typos, incorrect paths, or referencing a sitemap that no longer exists will result in read errors.
If your server is configured to block certain bots or IP ranges, or if there are server errors (e.g., 403 Forbidden or 500 Internal Server Error), Google cannot fetch your sitemap. Additionally, issues like server downtime or firewall restrictions can prevent access.
Sitemaps exceeding the size limits (50 MB uncompressed or 50,000 URLs) may cause Google to have trouble reading them, especially if they are not split into multiple smaller sitemaps using a sitemap index file.
If the sitemap file contains improper encoding or unsupported characters, parsing errors can occur. It’s best to use UTF-8 encoding without BOM (Byte Order Mark).
Use online XML validation tools or the error reports in Google Search Console to check for syntax errors. These tools highlight issues that prevent Google from reading the sitemap properly.
Enter the sitemap URL in a browser to see if it loads correctly. Look for server errors, access issues, or unexpected redirects.
Server logs can reveal whether Googlebot has successfully attempted to fetch the sitemap and whether it encountered errors during the process.
The Search Console provides detailed error reports regarding sitemap submissions, including reasons for failure and recommendations for fixes.
Ensure your sitemap is well-formed XML, with all tags correctly closed, and that it follows the sitemap protocol standards. Use XML schema validation tools to confirm compliance.
Double-check the URL for accuracy. Make sure it’s accessible via your browser and free of typos. If changes are needed, update the URL in Google Search Console.
If your website has thousands of URLs, break the sitemap into multiple smaller files, and create a sitemap index to list all of them. This adheres to Google’s size limits and improves readability.
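The split-plus-index pattern can be sketched with the standard library alone. The file names, base URL, and chunk size below are illustrative choices, not requirements of the protocol:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def write_split_sitemaps(urls, base="https://example.com", chunk=50000):
    """Split `urls` into sitemap-N.xml files and write a sitemap.xml index for them."""
    files = []
    for i in range(0, len(urls), chunk):
        urlset = ET.Element("urlset", xmlns=NS)
        for u in urls[i:i + chunk]:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
        name = f"sitemap-{i // chunk + 1}.xml"
        ET.ElementTree(urlset).write(name, encoding="utf-8", xml_declaration=True)
        files.append(name)
    # The index lists every chunk so only one URL needs to be submitted.
    index = ET.Element("sitemapindex", xmlns=NS)
    for name in files:
        ET.SubElement(ET.SubElement(index, "sitemap"), "loc").text = f"{base}/{name}"
    ET.ElementTree(index).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
    return files

# Tiny demo: 5 URLs split into chunks of 2 produce three sitemap files plus the index.
files = write_split_sitemaps(
    [f"https://example.com/page-{i}" for i in range(5)], chunk=2)
print(files)
```

You then submit only the index file's URL; Google follows it to each child sitemap.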
Ensure the sitemap uses UTF-8 encoding without BOM and that all URLs and content within the sitemap are properly encoded.
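A byte-order mark is invisible in most editors, so a byte-level check is the reliable way to find and remove it. A minimal sketch:

```python
import codecs

def strip_utf8_bom(data):
    """Remove a UTF-8 byte-order mark if present; XML parsers may reject it."""
    if data.startswith(codecs.BOM_UTF8):
        return data[len(codecs.BOM_UTF8):]
    return data

with_bom = codecs.BOM_UTF8 + b'<?xml version="1.0" encoding="UTF-8"?><urlset/>'
clean = strip_utf8_bom(with_bom)
print(clean[:5])  # b'<?xml': the declaration is now the very first bytes of the file
```

Reading the sitemap in binary mode (`open(path, "rb")`), passing the bytes through this function, and writing them back is usually enough to clear BOM-related parsing errors.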
If you’ve tried all the troubleshooting steps and still encounter issues, consider consulting an SEO expert or web developer. Complex server configurations, advanced security settings, or custom CMS integrations may require specialized knowledge to resolve.
Managing a website's visibility on search engines is crucial for attracting visitors and improving ranking. Google Search Console offers valuable insights into how your site is performing in search results, especially through the analysis of sitemaps. However, website owners often encounter issues such as the "search console sitemap could not be read" error. This problem can prevent Google from indexing your pages properly, impacting your website's search visibility.
A sitemap is a file that lists all the important pages of your website, helping search engines understand the site structure. It facilitates better crawling and indexing, ensuring that new or updated content appears promptly in search results. An XML sitemap is the most widely used format, and it typically resides at a standard location such as example.com/sitemap.xml.
This error appears in Google Search Console when Googlebot attempts to fetch and analyze your sitemap but fails to do so successfully. Common reasons include an incorrect sitemap URL, temporary server downtime, malformed XML, robots.txt rules that block the crawler, and restrictive file permissions.
Recognizing these causes helps in troubleshooting and fixing the issue effectively.
Ensure that the URL submitted in Google Search Console matches the actual location of your sitemap file. Verify the URL by entering it directly into a browser. If the file does not load, update the URL in Search Console with the correct path.
If your server is temporarily down or experiencing high traffic, Googlebot may not be able to access the sitemap. Check your server status and ensure it's online and capable of handling requests.
A malformed XML file can cause reading errors. Use an XML validator to verify your sitemap's syntax and fix any errors identified.
Your robots.txt file may block Googlebot from accessing the sitemap. Review the robots.txt directives to ensure that the sitemap URL is not disallowed.
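Python's standard `urllib.robotparser` can confirm whether a given robots.txt would let Googlebot reach the sitemap. Normally you would point it at the live file with `set_url(...)` and `read()`; here we parse an inline sample so the check is self-contained:

```python
from urllib import robotparser

robots_txt = """\
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# The sitemap URL itself must be fetchable by the crawler
print(rp.can_fetch("Googlebot", "https://example.com/sitemap.xml"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/admin/"))       # False
print(rp.site_maps())  # the Sitemap: declarations found in robots.txt
```

If `can_fetch` returns False for the sitemap URL, the fix is in robots.txt, not in the sitemap file.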
The sitemap file must be publicly accessible. Ensure it has the correct permissions set on your server to allow Googlebot to read it.
Use standard XML format compliant with Google's guidelines. Avoid using unsupported formats such as RSS, Atom, or HTML for sitemaps.
Leverage the latest sitemap protocols and tools to generate and maintain your sitemap. Consider XML Sitemap Index files if managing large websites with multiple sitemaps.
Integrate sitemap generation within your content management system or use plugins that automatically update the sitemap upon content changes.
Regularly review crawl statistics in Google Search Console to understand how search engines are interacting with your site and identify potential issues early.
Limit the size of individual sitemaps to 50MB and 50,000 URLs to ensure efficient crawling.
Encountering the "search console sitemap could not be read" error can be frustrating, but understanding the common causes and remedies can help you resolve the issue promptly. Regular maintenance, validation, and monitoring of your sitemap are essential practices to ensure smooth crawling and indexing by search engines. By following the troubleshooting steps outlined above, you can improve your site's visibility and ensure that your content reaches your audience effectively.
For Shopify store owners and developers, ensuring proper sitemap functionality is essential for effective search engine optimization (SEO) and site visibility. Occasionally, users may encounter the error message indicating that the Shopify sitemap could not be read. This issue can hinder your website's indexing and impact your online presence. This article provides a comprehensive overview of the causes, troubleshooting steps, and best practices to resolve this common issue.
A sitemap is a file that lists all important pages of your website, allowing search engines to crawl and index your content efficiently. Shopify automatically generates a sitemap.xml file for your store, which includes links to products, collections, blog posts, and other pages. Search engines, such as Google, use this file to understand the structure of your site and prioritize crawling.
This error typically indicates that the search engine or third-party tools are unable to access or read your sitemap.xml file correctly. The underlying causes can include an incorrect sitemap URL, robots.txt restrictions, and server downtime or misconfiguration.
Ensure that the URL of your sitemap is correct. For Shopify, the default sitemap URL is typically https://yourstorename.com/sitemap.xml. Confirm that this URL is accessible by entering it directly into your browser.
Use online tools such as Google Search Console or third-party sitemap validators to check if your sitemap can be fetched without errors. These tools can identify issues like server errors, redirects, or blocked files.
Your robots.txt file can restrict search engines from accessing certain files, including your sitemap. Check that the sitemap URL is not disallowed in this file.
If your site is experiencing server issues or maintenance, access to your sitemap may be temporarily unavailable. Confirm that your server is operational and configured correctly.
Log in to your Shopify admin panel and double-check that your sitemap link is correct. Shopify usually manages this automatically, but if you've customized URLs or are using third-party apps, ensure the link points to https://yourstorename.com/sitemap.xml.
Open the sitemap URL in a browser. If it loads properly, the issue may be with search engine access or external tools. If not, resolve server or URL issues accordingly.
In Shopify, robots.txt files are generated automatically and are limited in customization. However, if you've added custom code or apps, ensure they do not block the sitemap URL.
Sometimes, a simple cache clear or browser refresh can resolve temporary access issues. Also, re-submit your sitemap in Google Search Console and other search engines' webmaster tools after fixing issues.
Log in to Google Search Console, navigate to the Sitemaps section, and resubmit your sitemap URL. This prompts Google to re-crawl and index your site content.
If your server is experiencing downtime or misconfigurations, work with your hosting provider to restore proper functionality. Consider upgrading hosting plans if necessary to ensure stable access.
If issues persist despite following troubleshooting steps, consider consulting with a Shopify expert or SEO specialist. They can diagnose complex server or configuration issues and help implement advanced solutions.
The inability to read a Shopify sitemap can significantly impact your store’s SEO performance. By systematically verifying the sitemap URL, ensuring accessibility, reviewing server settings, and making necessary adjustments, you can resolve this issue effectively. Regular monitoring and maintenance will help prevent future occurrences, keeping your site optimized for search engines and improving your online visibility.
In the world of website management and search engine optimization (SEO), an accurate and accessible sitemap is essential. It helps search engines crawl and index website content effectively. However, website owners sometimes encounter issues when their sitemap cannot be read, resulting in an error message such as "sitemap could not be read" or "couldn't fetch." This article provides a comprehensive overview of this problem, its common causes, and practical solutions to resolve it.
The error indicating that the sitemap could not be read typically appears in webmaster tools or in server logs. It signifies that search engines or tools attempting to access the sitemap file fail to retrieve or parse it successfully. The standard reasons include network issues, incorrect file locations, permissions problems, or formatting errors within the sitemap file itself.
One of the most frequent issues arises from an incorrect URL or path to the sitemap file. If the URL specified in the robots.txt or submitted directly to search engines is wrong or outdated, the crawler cannot locate the sitemap.
If the server hosting the sitemap is temporarily down, overloaded, or experiencing connectivity problems, search engines cannot fetch the file. Additionally, server configurations that block user agents or restrict access can lead to fetch failures.
File permissions may restrict access to the sitemap. If the permissions prevent web crawlers from reading the file or if security plugins or firewall rules block access, the sitemap fetch will fail.
The sitemap itself may contain syntax errors, invalid XML formatting, or unsupported tags. These issues prevent search engines from parsing it properly, leading to fetch errors.
Network issues, such as firewall rules or IP blocking, can inhibit access to the sitemap. Sometimes, hosting environments restrict external requests, causing fetch failures.
Ensure that the URL provided to search engines is accurate and accessible. Manually enter the URL in your browser; the sitemap should load without errors. Confirm it matches the location in your robots.txt file and in your webmaster tools accounts.
Use online tools or command-line utilities to verify server availability. For example, run a curl or wget command to fetch the sitemap URL. If the server returns an error or times out, troubleshoot server or hosting issues first.
Verify that the sitemap file permissions are set to allow public read access. Usually, permissions such as 644 are suitable. Ensure no security plugins or firewall rules are blocking access to the sitemap URL.
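On a POSIX server, the permission check can be scripted. This sketch creates a stand-in file, applies mode 644, and confirms the world-readable bit that a web server needs in order to serve the file to crawlers:

```python
import os
import stat
import tempfile

# Stand-in for the real sitemap file on the server
fd, path = tempfile.mkstemp(suffix=".xml")
os.close(fd)

os.chmod(path, 0o644)  # owner read/write, group and others read-only

mode = stat.S_IMODE(os.stat(path).st_mode)
world_readable = bool(mode & stat.S_IROTH)
print(oct(mode), "world-readable:", world_readable)
```

Running the same `stat` check against the actual sitemap path on the server tells you immediately whether permissions are the culprit.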
Use online XML validators or sitemap testing tools to check whether the sitemap conforms to XML standards. Correct any syntax errors, unsupported tags, or invalid characters.
Examine web server logs for fetch attempts and errors. Look for 404, 403, or 500 status codes related to the sitemap URL. This insight helps pinpoint server or permissions issues.
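A few lines of Python can pull sitemap fetches out of an access log. The log lines below are hypothetical samples in common log format; point the same pattern at your real log file:

```python
import re

log = """\
66.249.66.1 - - [10/May/2024:06:25:24 +0000] "GET /sitemap.xml HTTP/1.1" 200 5123 "-" "Googlebot/2.1"
66.249.66.1 - - [11/May/2024:06:25:24 +0000] "GET /sitemap.xml HTTP/1.1" 404 153 "-" "Googlebot/2.1"
10.0.0.5 - - [11/May/2024:07:00:00 +0000] "GET /index.html HTTP/1.1" 200 912 "-" "Mozilla/5.0"
"""

# Match GET requests for any /sitemap* path and capture the status code
pattern = re.compile(r'"GET (/sitemap[^" ]*) HTTP/[\d.]+" (\d{3})')
fetches = [(m.group(1), int(m.group(2))) for m in pattern.finditer(log)]

for path, status in fetches:
    print(path, status)  # 200 is healthy; 403/404/500 point at the problem
```

Seeing a 404 or 500 next to a Googlebot user agent confirms the failure happened server-side rather than in the sitemap's contents.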
If the error persists despite troubleshooting, consult your hosting provider about crawler access and server configuration, or bring in an SEO specialist for a deeper diagnosis.
Understanding and resolving the "sitemap could not be read" or "couldn't fetch" errors will enhance your website's SEO performance and ensure search engines can efficiently crawl your content. Regular monitoring and maintenance of your sitemap and server environment are essential practices for successful website management.
Sitemaps play a crucial role in the process of website indexing and ranking. They serve as a roadmap for search engines, providing information about the structure and content of a website. Properly configured sitemaps help search engines discover and crawl all relevant pages, ensuring your website appears accurately in search results.
When using search engine optimization (SEO) tools and webmaster tools such as Google Search Console, website owners often rely on sitemap files to facilitate efficient crawling. However, it is not uncommon to encounter issues where Google reports that the sitemap could not be read. Understanding the causes and solutions for this problem is essential for maintaining good search presence.
One of the primary reasons Google may not be able to read a sitemap is due to incorrect formatting. Sitemaps must adhere to the standard XML sitemap protocol. Any deviations—such as malformed XML, missing tags, or unsupported file types—can lead to errors.
If the sitemap file is not accessible publicly or is restricted by robots.txt rules, Googlebot will be unable to fetch it. Similarly, server errors, such as 404 or 500 status codes, prevent Google from retrieving the sitemap.
The sitemap URL must be correct and reachable. Sometimes, the sitemap link provided in Google Search Console points to a non-existent or moved location. Using HTTP instead of HTTPS, or outdated URLs, can also cause read errors.
Sitemaps exceeding 50,000 URLs or larger than 50MB may cause problems. While Google supports large sitemaps, improperly segmented or malformed large sitemaps might trigger read issues.
Incorrect encoding in the sitemap file, such as special characters that are not properly encoded, can cause parsing errors that prevent Google from reading the file.
Google Search Console is a vital tool for reviewing sitemap submission statuses. When Google reports that the sitemap could not be read, it provides specific error messages that can guide the troubleshooting process.
To diagnose, start from the exact error message that Search Console reports.
Common messages include "URL not found," "Parsing error," or "Unsupported format," each indicating a different cause.
Use online XML validators or sitemap testing tools to ensure your sitemap is well-formed. Ensure all URLs are properly formatted, and there are no syntax errors.
Make sure the sitemap is hosted in a publicly accessible location. Check your server configuration and robots.txt file to ensure the sitemap is not blocked.
Update the sitemap URL in Google Search Console if it points to the wrong location or uses an outdated link. Use HTTPS if your website supports it.
If your sitemap is large, split it into multiple smaller sitemaps and create a sitemap index file referencing all of them. This approach helps Google crawl your content more efficiently.
Ensure that your sitemap uses UTF-8 encoding and that special characters are properly encoded to prevent parsing errors.
After addressing issues, resubmit your sitemap through Google Search Console. Monitor the status for any further errors and keep your sitemap updated regularly.
The XML sitemap protocol is a standardized format that search engines understand. It contains a list of URLs alongside optional metadata like last modification date, change frequency, and priority.
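Generating an entry that follows the protocol, including the optional metadata, can be sketched with the standard library; the URL and dates below are placeholders:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

urlset = ET.Element("urlset", xmlns=NS)
url = ET.SubElement(urlset, "url")
ET.SubElement(url, "loc").text = "https://example.com/products/widget"  # absolute URL
ET.SubElement(url, "lastmod").text = "2024-05-10"  # W3C date format
ET.SubElement(url, "changefreq").text = "weekly"   # optional hint
ET.SubElement(url, "priority").text = "0.8"        # optional, 0.0 to 1.0

xml_bytes = ET.tostring(urlset, encoding="utf-8", xml_declaration=True)
print(xml_bytes.decode("utf-8"))
```

Only `loc` is required; the other three tags are hints that crawlers may use or ignore.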
URLs in sitemaps must be absolute, fully qualified, and canonical, reflecting the actual location of each page; the protocol does not permit relative URLs.
Ensure your sitemap does not conflict with robot directives that disallow or restrict crawling, which could prevent Google from reading your sitemap correctly.
Encountering a message that the sitemap could not be read by Google can be frustrating, but it often has straightforward causes and solutions. Proper validation, correct configuration, accessible hosting, and regular maintenance are key to ensuring that Google's crawlers can read your sitemap and index your website efficiently. By systematically diagnosing and fixing issues, you enhance your site's visibility and search performance.
A sitemap is a file that provides information about the pages, videos, and other files on a website, and the relationships between them. It helps search engines like Google discover, crawl, and index website content more efficiently. Proper sitemap submission improves site visibility on search results. However, website owners often encounter issues such as the error message: "sitemap could not be read" in Google Search Console.
The most frequent reason for this error is an incorrect URL. If the sitemap URL is mistyped or points to the wrong location, Google Search Console cannot access it.
When the sitemap file is missing, deleted, or moved to a different directory without updating the URL, Google Search Console cannot read the file.
Invalid XML syntax, unsupported formats, or corrupt files can prevent Google from parsing the sitemap successfully.
Problems such as server downtime, misconfigured servers, or restrictive firewall rules may block access to the sitemap file, resulting in read errors.
If a sitemap exceeds the size limits set by Google or contains too many URLs, it might cause reading issues. Google recommends a maximum of 50,000 URLs and a file size of 50MB.
If the URL is incorrect, update it in Google Search Console with the accurate link. Ensure the URL is accessible publicly without restrictions.
Adjust server permissions to make the sitemap publicly accessible. Avoid restrictive robots.txt rules that block Google's crawler from accessing the sitemap.
Use XML validators to identify syntax errors. Correct any issues and re-upload the sitemap. If the file is too large, split it into multiple smaller sitemaps and create a sitemap index file referencing them.
Configure the server to support GET requests, ensure SSL certificates are valid if using HTTPS, and prevent firewalls from blocking crawler access.
Ensure that the sitemap complies with Google's specifications regarding format, size, and URL structure. Use a sitemap generator tool if necessary.
Implementing systematic updates, validations, and server optimizations ensures that sitemap issues are minimized. Always monitor the status reports in Google Search Console and address warnings promptly to maintain effective site indexing.
For websites with frequently changing content or extensive catalogs, dynamic sitemap creation and management become crucial. Use automation tools and sitemap generator plugins compatible with your CMS to keep sitemaps updated and conformant with search engine guidelines.
If you are managing an online store on Shopify and encounter the message "sitemap could not be read," it can be frustrating and confusing. The sitemap is a vital part of your website's SEO strategy, helping search engines crawl and index your pages efficiently. When Shopify cannot read your sitemap, it may lead to indexing issues, affecting your store's visibility on search engine results pages.
Typically, Shopify generates a default sitemap file located at /sitemap.xml. However, various factors can cause this file to become inaccessible or unreadable. Understanding these causes is the first step towards resolving the problem effectively.
If you or a third-party app modify the sitemap URL or submit an incorrect sitemap link to search engines, Shopify might not be able to locate or read the sitemap file. Double-check that the sitemap URL is correct and points to https://yourstore.myshopify.com/sitemap.xml.
While Shopify manages hosting, occasional server-related issues or downtime might temporarily prevent access to your sitemap. Monitor Shopify's status updates and ensure there are no ongoing outages.
The default sitemap should always be present. If a custom sitemap is used or an app modifies it, the file might be missing or corrupted, leading to read errors.
Some third-party apps or custom code snippets can interfere with sitemap accessibility. They may block the sitemap URL or generate invalid sitemap files.
If you submitted an outdated or incorrect sitemap URL via search engine tools like Google Search Console, it could result in read errors on their end.
Ensure that the URL https://yourstore.myshopify.com/sitemap.xml is accessible in your browser. If you receive a 404 or other error, the sitemap is not available, and further investigation is needed.
Use online tools or browsers to verify if the sitemap file loads correctly. Confirm that there are no restrictions, such as IP blocks or firewall rules, preventing access.
Look into recent changes made via apps or custom code that might impact the sitemap. Disable or remove conflicting integrations and regenerate the sitemap if necessary.
Log into Google Search Console or Bing Webmaster Tools and ensure the correct sitemap URL is submitted. Remove any outdated submissions.
If the problem persists despite troubleshooting, reach out to Shopify's support team. They can check for backend issues or server anomalies that may cause read failures.
To assist in diagnosing sitemap issues, consider using tools such as Google Search Console, Bing Webmaster Tools, and online XML sitemap validators.
For more technical insights, refer to Shopify’s official documentation regarding sitemap generation and SEO best practices.
Proactively managing your Shopify store’s SEO files can prevent future problems. Schedule regular checks of your sitemap, especially after theme updates, app installations, or custom code modifications. Maintain communication with your developers or SEO experts to ensure your site structure remains optimized and accessible.
Monitoring the health of your sitemap is essential for maintaining good SEO standing. A healthy, accessible sitemap ensures that search engines can crawl and index your entire site efficiently, contributing to better search rankings and increased visibility.
The "Sitemap could not be read" error prevents Google from properly crawling your website
When Google displays this error in Search Console, it means their crawlers failed to process your sitemap file. This critical issue prevents Google from discovering and indexing your pages, effectively making your content invisible in search results.
This problem is especially common for large websites with thousands or millions of URLs, where Google's crawl budget becomes exhausted before processing your entire sitemap.
Understanding the root causes helps you find the right solution
Too many URLs for Google to process efficiently within their crawl limits
Googlebot has quotas per site that can be quickly exhausted on large sites
Timeouts, slow responses, or server errors when Google tries to fetch your sitemap
Invalid XML structure, encoding issues, or exceeding the 50MB/50K URL limits
Save your time by avoiding these common dead ends
Google rarely provides direct support for individual sitemap issues, and responses can take weeks with no resolution.
Hours spent on Stack Overflow, Reddit, and SEO forums yield outdated advice that rarely addresses the core problem.
Unlike temporary crawl errors, this issue persists indefinitely without intervention.
Many indexing services make promises but lack the technology to actually solve crawl budget limitations.
IndexJump uses advanced technology to ensure your pages get indexed
A simple four-step process to get your pages indexed
Add your sitemap URL to IndexJump's dashboard
Our system processes your URLs efficiently
Pages get discovered and indexed by Google
Track your indexing progress in real-time
Real results from real website owners
"After months of struggling with sitemap errors, IndexJump fixed everything in days. Our traffic increased by 40%."
"We have over 2 million pages. No other service could handle it. IndexJump did."
"The free trial convinced me. I could see GoogleBot hitting our pages in real-time."
Everything you need to know about fixing sitemap errors
This error occurs when Google's crawlers cannot properly read or process your sitemap file. Common causes include large website size exceeding crawl budget, server timeouts during sitemap fetching, XML formatting issues, or the sitemap exceeding Google's 50MB/50,000 URL limits.
Google prioritizes crawl resources across billions of websites. Individual site issues, especially for non-enterprise sites, don't receive dedicated support. Their automated systems have fixed budgets per domain, and there's no manual override available through standard support channels.
IndexJump uses advanced technology to ensure your URLs are properly discovered and indexed by Google, bypassing the limitations of traditional sitemap submission. Our system strategically presents your URLs to Google's crawlers in a way that maximizes indexing success.
Yes! IndexJump offers a free trial for up to 100 URLs. This allows you to verify that the technology works by checking your server logs for Googlebot visits. No credit card is required to start the trial.
Most users see Googlebot activity in their server logs within 24-48 hours. Full indexing typically occurs within 1-2 weeks, depending on the number of URLs and your site's existing authority. You can monitor progress in real-time through the IndexJump dashboard.
Start with a free trial - no credit card required