Auth Tokens



  • I am unable to log in to the public server from Chrome. When I try to log in through Steam I get the following error:

    TypeError: Cannot read property '_id' of undefined
        at tokenAuthPromise.then (/opt/backend/src/app/auth.js:349:41)
        at _fulfilled (/opt/backend/node_modules/q/q.js:834:54)
        at self.promiseDispatch.done (/opt/backend/node_modules/q/q.js:863:30)
        at Promise.promise.promiseDispatch (/opt/backend/node_modules/q/q.js:796:13)
        at /opt/backend/node_modules/q/q.js:556:49
        at runSingle (/opt/backend/node_modules/q/q.js:137:13)
        at flush (/opt/backend/node_modules/q/q.js:125:13)
        at _combinedTickCallback (internal/process/next_tick.js:131:7)
        at process._tickDomainCallback (internal/process/next_tick.js:218:9)



  • Will these be available on private servers?



  • I'm also a little concerned about the rate limiting. I play Screeps by coding on the live server, I use multiple tabs in Chrome, and I only use the Steam client for private servers. I feel like this will negatively impact my ability to play the game as I usually do. When opening multiple tabs in Chrome, would each tab have its own limits, or would the limits apply globally?


  • Culture

    Yeah these rate limits are pretty, well, limiting 🙂

    The rate limit on reading segments will really hurt the Screeps stats programs, which currently store stats once per tick. This will be even worse for people who are on multiple shards. Even at three-second ticks, users are only going to be able to get less than a third of their statistics with this system. Combine it with the rate limit on reading memory (only once per minute) and statistics programs are effectively dead.

    Other thoughts:

    • I think the rate limit on uploading code should be 240 per day, rather than 10 per hour. This would result in the same effective rate limit but would let people handle debugging a lot more easily. I imagine there will be a lot of salt if people upload a bug but can't work around it due to the upload limit.

    • A new endpoint that allowed us to pull multiple segments at once would alleviate a lot of the pain for the stats programs. With this we could grab all the statistics segments in one go, making it so each stat read only costs 1 memory read, 1 segment read, and 1 console call regardless of how many ticks are being processed.

    • Even with the above, I think the memory-reading rate limit should be doubled to 120 or 180. This would allow people to run multiple programs (screeps-stats, screeps-notify) without worrying about hitting the limit.

    • It would be nice if we could request an exemption, or at least higher limits, for some third-party tools. Specifically, I would like to request a higher limit for the League of Automated Nations website and account (which is only used for completely public information). Otherwise it's going to take a pretty massive rewrite (which I will not have time for in January due to work and travel) to get it to fit into the limits.
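    A batched endpoint like the one suggested above doesn't exist today, but a client-side sketch shows how cheap it would make stats reads. The response shape below is invented purely for illustration:

```javascript
// Hypothetical handling of a batched-segment response. Assumes a body
// of the form { data: { "<segmentId>": "<raw segment string>" } };
// neither the endpoint nor this shape is part of the current API.
function parseBatchedSegments(response) {
  const stats = {};
  for (const [id, raw] of Object.entries(response.data)) {
    // Empty segments come back as empty strings; treat them as absent.
    stats[id] = raw ? JSON.parse(raw) : null;
  }
  return stats;
}
```

    One such call per polling interval would replace up to ten individual segment reads.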



  • I don't think there should be a limit on uploading code; with that rule it feels like you are discouraging people from actually playing the game. It will put more pressure on people, who will have to make sure each code upload is valid. Also, imagine your empire is under attack and you keep tweaking your defenses: 10 uploads can run out pretty quickly, and a message saying "please try again in 1 hour" would be really discouraging in such a high-stress situation.


  • Culture

    I am also concerned about the rate limits. One of the biggest things I was looking forward to with auth tokens is being able to collect stats on behalf of ScreepsPlus users, removing the requirement for them to run their own agent. After looking at these rate limits, I'm not sure that's viable without spreading the requests over several IPs to counteract the rate limiting, which would be a headache to manage. Another impact: currently most users request stats every 15 seconds, and these limits effectively reduce that to once per minute when pulling from Memory, making stats useless for monitoring anything other than long averages. I personally use the websocket and console to collect my stats, but a majority use Memory or segments AFAIK.

    EDIT: I also agree with @tedivm, a daily limit on uploads would be nice. I sometimes upload several times within a couple of minutes when working out bugs; that would easily burn through my 10 uploads for the hour and make it hard to work on code.
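    One workaround that fits a once-per-minute Memory limit is to buffer per-tick stats in-game and drain the buffer from the agent. A sketch of the idea (the statsBuffer layout is my own invention, not part of ScreepsPlus or any existing tool):

```javascript
// The in-game script appends one stats entry per tick under its tick
// number; the external agent reads the whole buffer once per minute and
// extracts the ticks it hasn't seen yet, keeping per-tick resolution
// despite the 1/minute read limit.
function extractNewTicks(statsBuffer, lastSeenTick) {
  return Object.keys(statsBuffer)
    .map(Number)
    .filter((tick) => tick > lastSeenTick)
    .sort((a, b) => a - b)
    .map((tick) => ({ tick, stats: statsBuffer[tick] }));
}
```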


  • Culture

    @ags131 I'm not seeing anything about the rate limits being IP-address based. I just assumed they were tied to the accounts themselves, so switching IP addresses wouldn't matter (nor would having multiple people on the same IP address).


  • Culture

    Are there any plans to add additional endpoints into the token system? Specifically, I think it would be useful to add the "my orders" and "wallet" endpoints so that people can still collect stats about them without having to give out a full-access token.



  • I too am concerned about the rate limit of 10 per hour on uploading code. It's way too restrictive.

    Thinking back to when I started, I remember several days of furious coding in which I committed far more than 10 times per hour after ditching the in-game editor. I believe it's the way most people learn a new language: change one line, commit, see if it works, rinse and repeat. It will also interfere with "printf" debugging.

    I understand the devs might want to prevent excessive use (such as an external bot continuously changing code), but that should be addressed differently (increase the CPU cost of uploading, use more elaborate limits, or just ignore it for now and monitor and warn the players who do that).



  • The per-day limits would help mitigate the issue, if they still serve the same purpose.

    • 10 per hour -> 240 per day
    • 60 per hour -> 1440 per day

    To keep players from being locked out of using the game for a whole day because of abnormal use or some bug, perhaps a limited-use reset system would help.

    In addition to that, if it were possible to raise some of the limits that are easier to hit that would probably solve a lot of the issues at least with my project.

    I'd be interested to hear about the purpose of the limits; I had assumed it was so that players cannot bypass the CPU limitations with third-party tools.


  • Culture

    @vrs Considering that uploading code also resets the global for the user, there's already a pretty steep CPU penalty for uploading.


  • Dev Team

    I'd like to make a quick clarification here since it already causes a lot of confusion:

    Regular requests made by the browser and Steam client don't involve Auth Tokens and thus are NOT rate limited at all. They will work as before, without any limits, including code uploads. The documentation article has been updated to indicate that.

    I'll answer the other comments and suggestions on Monday. The rate limit values will most probably be changed; they are not final in any sense.


  • Dev Team

    @stevetrov

    I am unable to log onto the public server from Chrome. When I try to login thru steam I get the following error:

    TypeError: Cannot read property '_id' of undefined at tokenAuthPromise.then (/opt/backend/src/app/auth.js:349:41)

    Must be fixed now, please confirm.


  • Culture

    Any chance we can get an endpoint to query a token's access? Being able to determine what a token can access would be helpful in cases where, for example, a user selects an option to pull from segment 5 but has only granted the token access to segment 10.
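    If such an endpoint existed, a client could check access up front instead of failing on the read. A sketch of the client side (the tokenInfo fields below are invented; no such endpoint or schema exists yet):

```javascript
// Hypothetical token-introspection result of the form
// { fullAccess: boolean, memorySegments: number[] }. A tool could
// validate a user's segment choice against it before polling.
function canReadSegment(tokenInfo, segmentId) {
  if (tokenInfo.fullAccess) return true;
  return Array.isArray(tokenInfo.memorySegments) &&
         tokenInfo.memorySegments.includes(segmentId);
}
```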



  • @artch said in Auth Tokens:

    Regular requests made by browser and Steam client don't involve Auth Tokens and thus are NOT rate limited at all. They will work as before, without any limits, including code uploads. The documentation article is updated to indicate that. I'll answer to other comments and suggestions on Monday. Rate limits values will be most probably changed, they are not final in any sense.

    Will this replace basic access authentication as well come February when the auth tokens replace the current system?


  • Culture

    I have another request that I think will be super helpful. Right now the choices are Full Access or a selection of individual options. I think a Read Only option would be extremely useful, and since all of the write operations are POST requests, it should be easy to restrict a read-only token to just the GET requests.

    This would allow third-party developers to build really informative applications. Pretty much all of the League stuff can be handled with a read-only token (with the exception of populating the public segments, but that's just one of roughly three systems the League site uses). The Screeps Dashboard used by Quorum is also read only. The backup tool could also be set up with a read-only key.
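    The GET/POST split makes the server-side check almost trivial. A sketch of how it might look (assuming an Express-style middleware; these names are illustrative, not taken from the actual Screeps backend):

```javascript
// Reject any non-GET request made with a read-only token. Because all
// mutating endpoints in the API are POSTs, allowing only GET is enough
// to make a token effectively read only.
function readOnlyTokenGuard(req, res, next) {
  if (req.token && req.token.readOnly && req.method !== 'GET') {
    return res.status(403).json({ error: 'token is read-only' });
  }
  next();
}
```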



  • As someone who enjoys making tooling for the screeps ecosystem, I'm pretty excited about this new feature, but wanted to come in to express my concerns about the rate limits. Since you've already said that the values will likely be changed, I'll just ask one question: Why do the rate limits exist? Here are my thoughts on potential answers to this:

    The rate limits are intended to reduce demand on Screeps infrastructure.

    In this case the limits should very likely be set so high that only problematic scripts would ever trigger them. For example, requesting a memory segment (100 KB) from each shard (3) once per tick (~0.3 Hz) would round out to about 100 KB/s of bandwidth. If supporting that is tenable, the limit should be 0.3 Hz (or 1080 / hour).

    For an even more stark example, the code upload limit should likely be closer to 720 / hour or more, given that the "baseline" of a user editing code in the online editor might be a save every 5 seconds during active development, and we know the infrastructure can support this.
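    For what it's worth, the arithmetic behind both estimates above can be written out (assumed inputs: 100 KB segments, 3 shards, ~0.3 Hz ticks, one save every 5 seconds):

```javascript
// Bandwidth for per-tick segment polling across all shards, and the
// request rates implied by the two examples in the text.
const segmentKB = 100;
const shards = 3;
const tickHz = 0.3;                               // ~3-second ticks
const kbPerSecond = segmentKB * shards * tickHz;  // ~90, i.e. roughly 100 KB/s
const segmentReqsPerHour = tickHz * 3600;         // 1080 per hour, per shard
const uploadsPerHour = 3600 / 5;                  // one save every 5 s = 720 per hour
```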

    The rate limits are intended to increase the challenge of the game.

    This seems less likely to me, but if this is the case then browser and Steam clients should be rate limited as well. If you don't rate limit that authentication mechanism, then external tooling will just find ways to use it to bypass the rate limits. For example, instead of the tool saying "go here to get an API token", it would say "go here to log in, then run this user script to produce a cookie you can use to log in". It's also worth pointing out that the API method of accessing memory/market/map can be emulated using the console API.

    Regardless of the motivation for rate limiting, I'd like to request that a few specific ones be increased to specific values:

    • POST /api/user/code should have a rate limit of at least 12 / minute = 720 / hour. This allows updating code every 5 seconds, which I'd bet is the pace an active coder on the site reaches during active development.
    • GET /api/user/memory-segment should have a rate limit of at least 1080 / hour. This will allow a script to collect per-tick stats from each shard in realtime.
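    Whatever values the devs settle on, a tool can also pace itself client-side rather than running into hard rejections. A minimal token-bucket sketch (the per-hour budget in the usage example is just an illustration, not an official number):

```javascript
// Token-bucket limiter: the bucket refills continuously at the allowed
// hourly rate and each request spends one token. Starting full lets a
// tool burst up to the hourly budget, then settle into the steady rate.
class RateLimiter {
  constructor(perHour) {
    this.capacity = perHour;
    this.tokens = perHour;
    this.refillPerMs = perHour / 3600000; // tokens regained per millisecond
    this.last = Date.now();
  }
  tryAcquire(now = Date.now()) {
    this.tokens = Math.min(
      this.capacity,
      this.tokens + (now - this.last) * this.refillPerMs
    );
    this.last = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}
```

    A stats agent on a 60 / hour budget would construct `new RateLimiter(60)` and skip (or queue) any poll for which `tryAcquire()` returns false.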

  • YP

    100 KB/s would be 0.8 Mbit/s per user (or 8.2 GB / day) just for stats... how do you think that would be tenable if you want to support that for every active user?

    For an even more stark example, the code upload limit should likely be closer to 720 / hour or more, given that the "baseline" is users editing code in the online editor might save every 5 seconds during active development, and we know the infrastructure can support this.

    I would really like to see someone coding for an hour with an average save frequency of 5 seconds 🙂 That's like saving and uploading code every second tick.

    If you don't rate limit that authentication mechanism, then the external tooling will just find ways to use it so it can bypass the rate limits.

    That would only work if your script solves captchas. And if you do that actively to circumvent limits set by the game I would expect your account to get banned.


  • Culture

    100 KB/s would be 0.8 Mbit/s per user (or 8.2 GB / day) just for stats... how do you think that would be tenable if you want to support that for every active user?

    That's definitely not the case; it's a worst-case scenario that isn't likely. Assuming one segment per tick per shard, and three-second ticks with a player spread across three shards, the segment would have to be completely full of completely random data to hit that target. If the data isn't random, the compression used by the API would drop the number significantly.

    Even without compression the segments are not likely to be completely full: saving that much data (and thus paying for the JSON.stringify call) would use up a lot of CPU, so people have an incentive to only store what they are using. I'm a fairly high-GCL player who collects a lot of stats, and my segments tend to average around 50 KB for stats, which turns into 7 KB when compressed (which I just tested using real statistics segments).


  • Dev Team

    Alright, now after reading some of the comments here, I'd like to make another clarification.

    The tokens' purpose is to regulate automated use of API endpoints. Automated here means human-less: such use may involve automated stats gathering or other automated actions during long (more than an hour) sessions. This explains the low limits on some endpoints, since they are not supposed to be automated in general.

    However, if you use tokens in some third-party client or other software which involves human presence, then rate limiting shouldn't apply at all, just like in the official client. For that purpose we should probably develop a method to reset all token timers at any time in the official client. It would look like a "Reset" button in the "Auth Tokens" section with a reCAPTCHA attached to it. If you (not your automated software) have hit some rate limit and it blocks you (not your automated software), then you can easily press that button and continue. We could even develop a UI-less page containing that reCAPTCHA that your client can embed in an <iframe> to handle this scenario easily.

    Now to specific questions.
