Extreme increase in CPU usage with the same script


  • Culture

    Something changed today that caused a massive CPU increase. In the last 2 restarts something broke:

    http://i.imgur.com/Ydy5Ue1.png

    The two red lines indicate something wrong with the server (flatlines); right after them, the CPU bucket starts plummeting and my script reacts accordingly.

    If you compare this to earlier:

    http://i.imgur.com/viqPwxr.png


    You can clearly see the difference.


  • Culture

    Not sure if it's related, but I've been getting some erratic CPU usage lately as well.

    * Checking CPU used at the beginning of my main loop usually reported < 0.3, or at least it used to. Now it sometimes does, but it's frequently around 7, and occasionally spikes up to 50+. I don't know of anything going on during this time other than loading code.

    * I run this block of code near the beginning of my tick:

    CreepHandler.prototype.clearCachedRoomRequirements = function() {
        // Reset the per-tick caches stored in Memory.
        Memory.cachedRoomRequirements = {};
        Memory.cachedMaxCreeps = {};
        Memory.cachedNextRoomNameAssignments = {};
    };

    This clears out a fairly consistent amount of data each tick, but it will take anywhere from 0.1 CPU to upward of 7 CPU to execute.
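    Spikes like this can be narrowed down by timing individual blocks. A minimal sketch: `Game.cpu.getUsed()` is the real Screeps API call, but it is stubbed here with `process.hrtime` (and the `profile` helper is a hypothetical addition) so the snippet runs in plain Node:

```javascript
// Stub of Game.cpu.getUsed() so this runs outside the game; wall-clock
// milliseconds stand in for CPU here.
const Game = {
    cpu: {
        getUsed: () => Number(process.hrtime.bigint()) / 1e6,
    },
};

// Hypothetical helper: report how much "CPU" a block of code consumed.
function profile(label, fn) {
    const start = Game.cpu.getUsed();
    fn();
    const cost = Game.cpu.getUsed() - start;
    console.log(`${label}: ${cost.toFixed(3)}`);
    return cost;
}

// Timing the cache-clearing block from the post:
const Memory = {};
profile('clearCachedRoomRequirements', () => {
    Memory.cachedRoomRequirements = {};
    Memory.cachedMaxCreeps = {};
    Memory.cachedNextRoomNameAssignments = {};
});
```

    Wrapping suspect blocks like this makes it obvious whether the cost is in your own code or in something the server does around it.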

    I'm not nearly as fastidious about tracking my CPU usage as Dissi (Hello, world!), but it doesn't seem like I've made any changes that would impact this. I get 40 CPU per tick right now 🙂 These spikes contribute a lot to emptying my bucket.


  • Culture

    Yes, server restarts impact the bucket quite a bit:

    http://i.imgur.com/n3PW6I6.png

    And re-caching of scripts (a compile) can take some extra time as well (7 CPU here, too).

    But for the last 2 hours or so it hasn't normalized:

    http://i.imgur.com/WcPiEDq.png


    This causes less-than-optimal running conditions. My script does adjust, but I think it's more than noteworthy.
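    That kind of adjustment can be as simple as gating expensive work on the bucket level. A minimal sketch: `Game.cpu.bucket` is the real Screeps field, but the stubbed value, the `shouldRunExpensiveTasks` helper, and its threshold are assumptions for illustration:

```javascript
// Stub of the game object so this runs in plain Node; in-game you would
// read Game.cpu.bucket directly.
const Game = { cpu: { bucket: 3400 } };

// Hypothetical helper: only run low-priority, expensive tasks while the
// bucket is comfortably full. The 5000 threshold is an arbitrary example.
function shouldRunExpensiveTasks(bucket, threshold = 5000) {
    return bucket >= threshold;
}

console.log(shouldRunExpensiveTasks(Game.cpu.bucket)); // → false while the bucket is drained
```

    Skipping deferrable work while the bucket is low lets it refill instead of flatlining.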


  • Culture

    Yep, my problems have nothing to do with what's going on server-side. Just going to post the explanations here, since I contaminated Dissi's thread. Thanks to Dissi and anisoptera for their help.

    * First access to the Memory object comes at a CPU cost to deserialize all the data stored in Memory. That's what I was experiencing.
    * Code loading time is substantially higher right after a code change, and takes a few CPU periodically, apparently every 50 ticks or so.

    Neither of these has anything to do with Dissi's observation.
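    The first point can be sketched in plain Node. This assumes Memory is kept as a serialized JSON string and deserialized lazily on first access each tick; `ctx`, `raw`, and the stored data are hypothetical stand-ins, not the actual server internals:

```javascript
// Hypothetical stand-in for the serialized Memory blob.
const raw = JSON.stringify({ rooms: { W1N1: { level: 8 } }, creeps: {} });

let parsed = null;
let parseCount = 0;
const ctx = {};
Object.defineProperty(ctx, 'Memory', {
    get() {
        if (parsed === null) {
            parseCount += 1;          // the whole blob is parsed exactly once,
            parsed = JSON.parse(raw); // on the first access -- that is where
        }                             // the CPU cost lands
        return parsed;
    },
});

ctx.Memory.rooms;  // first access: pays the JSON.parse cost
ctx.Memory.creeps; // later accesses reuse the cached object
console.log(parseCount); // → 1
```

    This is why the CPU check at the top of the loop jumps: the tick's first touch of Memory pays for deserializing everything stored in it, however large.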