In addition, I get critical errors in Yandex Webmaster indexing, and it warns me that the site produces a lot of duplicate content :/
Critical
Duplicate pages with GET parameters were found
Some pages with GET parameters in the URL duplicate the contents of other pages (without GET parameters). For example, https://example.com/tovary?from=mainpage duplicates https://example.com/tovary. Because both pages are crawled, it might take longer for information about important pages to be added to the search database. This may affect the site's search status.
If there are duplicates in the search results because of GET parameters, we recommend using the Clean-param directive in robots.txt so that the robot ignores insignificant GET parameters and combines signals from identical pages onto the main one. Once the robot learns about the changes, pages with insignificant GET parameters will disappear from the search results.
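From what I understand, that would look something like this in robots.txt (just a sketch based on their example URL, assuming 'from' is the only insignificant parameter and the /tovary path; our real parameter names and paths may differ):

User-agent: Yandex
Clean-param: from /tovary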
The only thing I can come up with that might actually be the issue is in the settings of our dashboard. Instead of it having to decide, every time there's a trailing slash, whether the path is a search, a profile, or an album, maybe we could just use a default: any time there's a forward slash after the main domain name, it should look for a user profile. Then an exception could be made so the normal pages still function, sort of like a forum URL?
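Something like this rough sketch is what I'm picturing, in an Express-style app (the reserved list and the findUserBySlug helper are made up for illustration, not our actual code):

import express from "express";

const app = express();

// Exception list: paths that should keep working as normal pages, never treated as profiles
const reserved = new Set(["search", "albums", "settings"]);

// Hypothetical lookup; in reality this would query our database
async function findUserBySlug(slug: string): Promise<{ name: string } | null> {
  return slug === "example-user" ? { name: "Example User" } : null;
}

// Default rule: any single path segment after the domain is treated as a user profile
app.get("/:slug", async (req, res, next) => {
  if (reserved.has(req.params.slug)) return next(); // let the normal routes handle these
  const user = await findUserBySlug(req.params.slug);
  if (!user) return res.status(404).send("Not found");
  res.send("Profile page for " + user.name);
});

app.listen(3000);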