• (nodebb)

    Just nest a simple if/else statement (branching again in both the if and the else) 40 levels deep, and you're already there. I can easily see how that number could be correct.

  • (nodebb)

    836081572200 is probably a bug, but it isn't an obvious bug without some internal details of the analysis tool. In hex, it's 0xC2AA585968, which looks nothing like any of the common "uninitialised or deinitialised memory" patterns.

    But still, the fact that the tool allows the number of paths to go high enough to need 64-bit counters, rather than just throwing up its hands in horror at 4 gibipaths and giving up on the function, is a slight WTF in its own right.
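
    For anyone who wants to double-check the figures, a quick throwaway Java check (the literal is just the number from the article):

    public class PathCountCheck {
        public static void main(String[] args) {
            long paths = 836_081_572_200L;
            // Hex form of the reported path count (prints c2aa585968).
            System.out.println(Long.toHexString(paths));
            // How many times over the 32-bit "4 gibipath" limit it is (prints 194).
            System.out.println(paths / (1L << 32));
        }
    }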

  • Gavin (unregistered) in reply to Steve_The_Cynic

    The irony is that this would require a test, increasing the complexity of the analyser itself.

  • (nodebb)

    @Steve_The_Cynic ref

    But still, the fact that the tool allows the number of paths to go high enough to need 64-bit counters, rather than just throwing up its hands in horror at 4 gibipaths and giving up on the function, is a slight WTF in its own right.

    A good tool writer knows both their audience and the materials the tool will be used on. Seems to me in this case the problem isn't the tool, or the decisions that went into writing the tool, but rather the world's codebase. That's the WTF.

  • (nodebb) in reply to Domin Abbus

    Actually, 40 consecutive if ... else ... statements with independent conditions will get you there. They don't need to be nested.
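
    A minimal sketch of the shape (made-up names): each independent if/else doubles the number of paths through the method, and NPath multiplies the path counts of consecutive statements, so 40 of them in a row gives 2^40, roughly 1.1 trillion, comfortably past the reported figure.

    // With all 40 if/else blocks written out, NPath = 2^40 = 1,099,511,627,776.
    void fortyFlags(boolean[] flag, int[] counter) {
        if (flag[0])  { counter[0]++;  } else { counter[0]--;  }
        if (flag[1])  { counter[1]++;  } else { counter[1]--;  }
        // ... 37 more of the same ...
        if (flag[39]) { counter[39]++; } else { counter[39]--; }
    }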

  • TheCPUWizard (unregistered)

    I strongly prefer Cyclomatic Complexity [number of decision points] vs. NPath.... A case statement with simple cases can grow quite long without becoming problematic. With cyclomatic complexity (done correctly; there are many tools that do it wrong), the "switch" counts as 1.

  • TheCPUWizard (unregistered)

    @Steve - let's look at assembly language... Jump to "contents of register" -- with a 64-bit register, BOOM, the number of possible paths blows right past 4G.

  • (nodebb)

    I have seen the result of fingers in a saw (in this case, a table saw), and I wouldn't recommend using that as a metaphor.

  • Jason Stringify (unregistered)

    *its

  • (nodebb) in reply to WTFGuy

    @WTFGuy You're not wrong. (But I did say "a slight WTF", not "TRWTF".)

  • Tinkle (unregistered)

    In a multi-thousand-line method I could see 836081572200 being correct, or even being a truncated metric.

    NPath complexity multiplies together the branch counts of every consecutive branching operation.

    This metric might just be saying that this method is way too long. Unlikely, though.

  • (nodebb) in reply to jeremypnet

    The prime factorization works out to 2^3 * 3 * 5^2 * 7 * 331 * 601411, so I figure either there are some dependencies between the branches, or there is indeed a bug in the tool.
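
    For the curious, a quick arithmetic check of that factorization:

    public class FactorCheck {
        public static void main(String[] args) {
            // 2^3 * 3 * 5^2 * 7 * 331 * 601411
            long product = 8L * 3 * 25 * 7 * 331 * 601411;
            System.out.println(product);                      // 836081572200
            System.out.println(product == 836_081_572_200L);  // true
        }
    }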

  • Tinkle (unregistered) in reply to emurphy

    I am no expert on N-Path complexity, but here is my take on it:

    If there were a 4-way case statement and one of those cases had two consecutive if statements, there would be 7 paths: 3 from the cases with no if statements and 4 (2 * 2) from the case with the two if statements.
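
    A minimal sketch of that shape (hypothetical method and names), for anyone counting along:

    // Three arms contribute 1 path each; the arm with two consecutive ifs
    // contributes 2 * 2 = 4. The switch as a whole: 1 + 1 + 1 + 4 = 7 paths.
    int describe(int kind, boolean a, boolean b) {
        int result = 0;
        switch (kind) {
            case 0: result = 1; break;
            case 1: result = 2; break;
            case 2: result = 3; break;
            default:
                if (a) { result += 10; }
                if (b) { result += 20; }
                break;
        }
        return result;
    }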

  • Loren Pechtel (unregistered)

    Given emurphy's factorization, I think there must be a bug of some kind. I can see something with an NPath of 331. I've never counted, but I've probably done it with dispatchers of external data. But how do you code anything whose NPath has a six-digit prime factor??

    An idea on what might be going on here--you sometimes choose a large prime to minimize collisions with hashes and related data. I'm thinking that's the size of an array where you look at the linked list at RecordID % TableSize to find your item--and if said item contains a method that's being executed, it might confuse the analyzer. Very low-cost inserts, and faster searches than a binary search.

    And put me in the camp that doesn't care about NPath, just cyclomatic complexity.

  • Loren Pechtel (unregistered)

    And to take another swipe at what might be going on:

    We insert 331 fixed items into an array. An array size of 601411 has been determined experimentally to cause no collisions. Lookup consists of seeing whether the item in the array slot at RecordID % TableSize is the desired record. Lookup is O(1). You could fill the rest with a never-matching item to skip the null check, but that causes an additional memory read that's probably not worth it.
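
    Something like this sketch, I imagine (every name here is hypothetical; only the two numbers come from the factorization):

    // Fixed-key lookup table: 331 known records dropped into an array of
    // 601411 slots, with the table size chosen so id % TABLE_SIZE never collides.
    final class RecordTable {
        record Entry(long id, String payload) {}

        private static final int TABLE_SIZE = 601_411;
        private final Entry[] slots = new Entry[TABLE_SIZE];

        void insert(Entry e) {
            // Assumes non-negative ids; collisions were ruled out experimentally.
            slots[(int) (e.id() % TABLE_SIZE)] = e;
        }

        Entry lookup(long recordId) {
            Entry candidate = slots[(int) (recordId % TABLE_SIZE)];
            // Null check covers empty slots; the id check covers ids that land
            // on an occupied slot but were never inserted.
            return (candidate != null && candidate.id() == recordId) ? candidate : null;
        }
    }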

  • Vilx- (unregistered)

    A "SetStatus()" method several thousand lines long with complexity beyond Ludicrous? Yep, that sounds plausible. I bet "status" is the status of some kind of process, and there can be like a dozen different statuses, and there are spaghettified business rules about the transitions between them, with convoluted side effects, and it's all written in this one function.

  • Naomi (unregistered)

    This reminds me of a one-method horror story from my first job. The majority of the method was a while (true) loop, and the majority of the loop was a switch statement over a local holding an enum. Why? Because assigning to that variable and breaking out of the switch allowed it to act like an improvised state machine. I'd have probably found it endearingly kludgy if it weren't over 3000 lines long with dozens of local variables and a half-dozen parameters, all of which were assigned to in dozens of places... and also recursive.
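
    For anyone who hasn't run into the pattern, a tame, miniature version of the shape described above (the real thing was three orders of magnitude bigger, and everything here is made up):

    public class ImprovisedStateMachine {
        enum Phase { READ, VALIDATE, APPLY, DONE }

        public static void main(String[] args) {
            process("  hello  ");
        }

        // A while (true) loop around a switch on a local enum; each arm assigns
        // the next state and breaks out of the switch to go around again.
        static void process(String input) {
            Phase phase = Phase.READ;
            String value = null;
            while (true) {
                switch (phase) {
                    case READ:
                        value = input.trim();
                        phase = Phase.VALIDATE;
                        break;
                    case VALIDATE:
                        phase = value.isEmpty() ? Phase.DONE : Phase.APPLY;
                        break;
                    case APPLY:
                        System.out.println("applying: " + value);
                        phase = Phase.DONE;
                        break;
                    case DONE:
                        return;
                }
            }
        }
    }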

  • (nodebb)

    Could the number be a rounded value, e.g. if the complexity were being measured with a single-precision float? (There's a quick check at the end of this comment.)

    And if this is something central to a complex process, as Vilx- suggests, and like Naomi's state machine (and I've seen stuff like that hundreds of times, to varying degrees of readability), would it necessarily be more readable or better designed if you split it up into several methods? Without knowing what's happening in there, sure, there could be a number of places where you could split it up. But if they're all unrelated checks, methodically laid out one after the other in an easy-to-read fashion, it may be better to leave it as is.

    Excessively nested logic that goes in and out of decision levels like an accordion should be fixed though.
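
    As for the single-precision idea, it's easy enough to test, for what it's worth (this assumes the tool would have converted the float straight back to an integer, which is pure speculation):

    public class FloatRoundTrip {
        public static void main(String[] args) {
            long reported = 836_081_572_200L;
            // If the count had passed through a single-precision float, the
            // reported value would have to survive this round trip unchanged.
            float asFloat = (float) reported;
            System.out.println((long) asFloat);
            System.out.println((long) asFloat == reported);
        }
    }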

  • Your Name (unregistered)

    These arbitrary measurements are the wrong focus. I don't care if a function is several hundred lines long, as long as the code inside is clear and concise. I'd rather have real issues detected than some theoretical crap raised to "major" bugs. 10 IFs are NOT a major bug (looking at you, SonarQube: raising hell over spelling and "complexity" "errors" while silently letting an obvious SQL injection slip by).

  • Korrat (unregistered) in reply to TheCPUWizard

    Cyclomatic complexity is not simply the number of decision points, though. In the original definition it depends on the number of edges and nodes (basic blocks) in the control flow graph: M = E - N + 2P. So for a switch with 4 cases/arms, it would still be 5 (or even 6).
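
    A sketch of that with the formula M = E - N + 2P (hypothetical method; the counts are done by hand in the comments):

    // Control flow graph: entry block {result = 0; switch}, four case arms,
    // and the return block -> N = 6 nodes. Edges: four from the switch to the
    // arms, one implicit "no match" edge to the return, and four from the arms
    // to the return -> E = 9. So M = E - N + 2P = 9 - 6 + 2 = 5.
    int classify(int code) {
        int result = 0;
        switch (code) {
            case 1: result = 10; break;
            case 2: result = 20; break;
            case 3: result = 30; break;
            case 4: result = 40; break;
        }
        return result;
    }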

  • (nodebb)

    Ideally, small enough that you can count it on your fingers after attempting to drunkenly operate a bandsaw with no training.

    Instructions unclear, hands now have six fingers each.

  • (nodebb) in reply to emurphy

    NPath complexity can also have additive components. E.g. consider an 'if' statement where one branch contains another 'if': you end up with 2 + 1 = 3.
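
    A tiny sketch of that case (hypothetical code):

    // The then-branch contains a nested if with 2 paths, the else-branch has 1,
    // so the outer statement contributes 2 + 1 = 3 to the NPath count.
    int sample(boolean outer, boolean inner) {
        int x = 0;
        if (outer) {
            if (inner) {
                x = 1;
            }
        } else {
            x = 2;
        }
        return x;
    }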
