I'm attempting to retrieve a list of 300 sites under our tenant using the approach outlined below, as recommended in the official documentation:
var searchResults = new List<Dictionary<string, object>>();
var paging = true;
var startRow = 0;
const int ResultsPerPage = 50;

while (paging)
{
    var searchOptions = new SearchOptions("WebTemplate:GROUP")
    {
        StartRow = startRow,
        TrimDuplicates = false,
        RowsPerPage = ResultsPerPage,
        SelectProperties = new List<string>() { "Path", "Title" },
    };

    // Issue the search query
    var searchResult = await context.Web.SearchAsync(searchOptions);

    // Add the returned page of results to our search results list
    searchResults.AddRange(searchResult.Rows);

    // If we're not done yet, update the start row and issue a query to retrieve the next page
    if (searchResults.Count < searchResult.TotalRows)
    {
        startRow = searchResults.Count;
    }
    else
    {
        // We're done!
        paging = false;
    }
}

// Process the total search result set
return searchResults.ToDictionary(x => $"{x["Path"]}", x => $"{x["Title"]}");
However, I don't get all 300 sites back; the loop only returns around 250. Strangely, if I increase ResultsPerPage from 50 to 300, I receive all 300 sites as expected. On top of that, with ResultsPerPage set to 50 I also get duplicate entries in the results.
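For what it's worth, this is the kind of check I added inside the while loop, right after the SearchAsync call, to see where the duplicates come from (just a diagnostic sketch; seenPaths and the logging are my own additions, not part of the documented sample):

// Declared once, before the while loop:
// var seenPaths = new HashSet<string>(StringComparer.OrdinalIgnoreCase);

// Count how many rows on this page have a Path we haven't seen before
var newRows = searchResult.Rows.Count(row => seenPaths.Add($"{row["Path"]}"));

Console.WriteLine($"StartRow={startRow}: returned {searchResult.Rows.Count} rows, " +
                  $"{newRows} of them new, TotalRows reported as {searchResult.TotalRows}");

That makes it easy to see at which StartRow the overlap begins and whether TotalRows stays stable from one request to the next.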
I suspect there might be something I'm missing in how pagination works with the search API. Can anyone provide insights or guidance on this issue?
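In the meantime, since ToDictionary would throw if the same Path showed up twice in searchResults, I'm collapsing duplicates before building the dictionary, roughly like this (only a workaround that hides the symptom, not a fix for the paging itself):

// Stopgap: keep the first row per Path so duplicate entries don't break ToDictionary
return searchResults
    .GroupBy(x => $"{x["Path"]}")
    .ToDictionary(g => g.Key, g => $"{g.First()["Title"]}");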