Always plan on something going wrong…

Anyone who has sat through the process of managing a project knows, wholeheartedly, that no matter how much something is planned and scheduled, something else will always go wrong. Believing this as fact, one should never, ever, base a timeline or effort estimate on everything working as planned. "Planned" and "working" are so far removed from each other that the relationship is vague at best. No matter how many times we build the same CRUD application, data pipeline, or CI/CD process, always leave yourself some buffer time to correct unforeseen issues.

As a general guideline, the less I know about a requested change, the higher my effort estimate; it grows almost geometrically. As an example:

  • 80 hours to write a single data object CRUD UI
  • 40 hours to write a single data object CRUD UI with defined fields & validation
  • 20 hours to write a single data object CRUD UI with defined fields, validation, and user story documentation
  • 8 hours to write a single data object CRUD UI with defined fields, validation, user story documentation, and visual mock-ups of the expected UX

Now, at some point I hit a floor where additional information simply cannot keep up with the continued reduction in effort. But at that low an effort estimate we are basically wizards doing magic as far as non-code people are concerned.

Some people will respond with "but estimates that high scare away clients." This is very true. However, my response is this: do you really want to work with a client who has no clear answer or concept of what the results of the effort should be? Think `moving goal posts`, but without having a goal post to begin with.

TL;DR: always buffer your estimates, even if you know everything that needs to be done.

Let's Encrypt on Amazon Linux…

So after switching some domain names around, I wanted to add a Let's Encrypt SSL cert to the blog here. Simple enough, right? Log into the box, follow the instructions (https://coderwall.com/p/e7gzbq/https-with-certbot-for-nginx-on-amazon-linux), and that should be it? Nope; as always, an error occurred when running the

certbot-auto certonly --standalone -d davidjeddy.com

command. Turns out Amazon Linux does NOT add `/usr/local/bin` to the $PATH. So I instead moved the binary to `/usr/sbin` and all was well with the world.
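A lighter-touch alternative (a sketch; moving the binary works just as well) is to leave the file where it is and put the directory on the $PATH instead:

```shell
# certbot-auto installs itself into /usr/local/bin, which Amazon Linux
# leaves off the default $PATH. Instead of relocating the binary,
# add the directory to $PATH for the current session:
export PATH="$PATH:/usr/local/bin"

# To make it permanent, append that same export line to ~/.bash_profile.
```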

A couple of minutes later I was in the nginx config adding the cert; a quick restart and away we went into the great beyond of encrypted awesomeness.

Why basic data structure knowledge is important.

Today, while working through an issue ticket, we came across a situation where console.log() was printing a rounded version of an integer. The server response payload was correct, and the client was receiving the correct payload; but when the data was processed, a rounded version of the data point was returned.


After a short amount of digging we found that my initial hunch was close: it is a numeric precision limit of the language. JavaScript stores every number as an IEEE-754 double, so integers above Number.MAX_SAFE_INTEGER (2^53 − 1) are silently rounded to the nearest representable value. Basic data structure and computer science knowledge is in fact useful to be familiar with.
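A quick sketch of the behavior we hit (the string-ID workaround at the end is one common fix, not necessarily the one we shipped):

```javascript
// JavaScript stores every number as an IEEE-754 double, so integers
// above Number.MAX_SAFE_INTEGER (2^53 - 1) silently lose precision:
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991
console.log(9007199254740993);        // prints 9007199254740992 - rounded!

// A common workaround: have the server send large IDs as strings,
// which survive JSON.parse() untouched:
const payload = JSON.parse('{"id":"9007199254740993"}');
console.log(payload.id);              // "9007199254740993" - exact
```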

References:

http://stackoverflow.com/questions/1379934/large-numbers-erroneously-rounded-in-javascript

http://www.ecma-international.org/ecma-262/5.1/#sec-8.5

Refactoring array-index magic numbers with class constants.

During a code review my peer Michal Mazur turned me onto the following pattern, and I have to say I am really digging it: the array index as a class constant.

In my specific case here is the usage:

if (!empty($paramData[19]) && is_string($paramData[19])) {
    $paramData = $paramData[19];
} elseif (empty($paramData[19]) && !empty(isset($paramData[17])) && isset($paramData[17])) {
    $paramData = $paramData[17];
}

…and the refactored logic:

if (!empty($rowData[$this::FULL_SOURCE]) && is_string($rowData[$this::FULL_SOURCE])) {
    $imageSource = $rowData[$this::FULL_SOURCE];
} elseif (empty($rowData[$this::FULL_SOURCE]) && isset($rowData[$this::SQUARE_SOURCE])) {
    $imageSource = $rowData[$this::SQUARE_SOURCE];
}

While a bit more verbose, the logic is much easier to read using the contextually named constants. (The redundant `!empty(isset(...))` from the original also gets cleaned up; `isset()` already returns a boolean, so wrapping it in `!empty()` changes nothing.)
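For completeness, here is a minimal sketch of where those constants could live. The class name, method, and column positions are assumptions for illustration, not the actual production code:

```php
<?php

// Hypothetical mapper class showing the named-index pattern.
class ImageRowMapper
{
    const SQUARE_SOURCE = 17; // assumed column holding the square-cropped image
    const FULL_SOURCE   = 19; // assumed column holding the full-size image

    public function imageSource(array $rowData)
    {
        if (!empty($rowData[self::FULL_SOURCE]) && is_string($rowData[self::FULL_SOURCE])) {
            return $rowData[self::FULL_SOURCE];
        }

        if (empty($rowData[self::FULL_SOURCE]) && isset($rowData[self::SQUARE_SOURCE])) {
            return $rowData[self::SQUARE_SOURCE];
        }

        return null;
    }
}
```

Anyone reading `self::FULL_SOURCE` immediately knows which column is being pulled, no spreadsheet of magic numbers required.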

PHP and large CSVs…

After looking around a bit, I have yet to find a way to read a specific line from a file without doing the following:

  1. fopen()
  2. looping every line until reaching the desired line

Desired functionality:

  1. fopen($file)
  2. an imaginary function call `fgetcsvline($lineNumber)` that returns the contents of line $lineNumber

 

Anyone know a solution for this?

 

Update: Derp, how could I forget about ‘The League’? http://csv.thephpleague.com/
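For the standard library's part, SPL gets close to the imagined call: SplFileObject::seek() jumps to a zero-based line number, and the READ_CSV flag parses it for you. A sketch (note that seek() still walks the file internally, so this is API convenience, not O(1) access):

```php
<?php

// Approximation of the imagined fgetcsvline() using SplFileObject.
function fgetcsvline($path, $lineNumber)
{
    $file = new SplFileObject($path);
    $file->setFlags(SplFileObject::READ_CSV);
    $file->seek($lineNumber); // zero-based line index

    $row = $file->current();  // parsed CSV fields for that line

    return is_array($row) ? $row : null;
}
```

So `fgetcsvline('data.csv', 2)` hands back the third line as an array of fields.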

PHPStorm 2017.1 released…

I’m sure many of you know this already, but JetBrains has released PHPStorm version 2017.1. The improvement list looks nice and I’m eager to try ’em out.
Get the full details over at the official release page: https://blog.jetbrains.com/phpstorm/2017/03/phpstorm-2017-1-is-now-released/

Of special interest is the Codeception and PHPUnit 6 support, as I am a big personal supporter of automated testing.

Quick little thing.

Was sad to see that Yii2’s getOldAttributes() does not have the ability to limit its result based on a provided array, whereas getAttributes() does take an array of names to limit the returned attributes. So I whipped this up right quick:

/**
 * @param array $names attribute names to limit the result to
 *
 * @return array
 */
public function getOldAttributes($names = [])
{
    $returnData = parent::getOldAttributes();

    if (!empty($names)) {
        // Keep only the old attributes whose names were requested,
        // mirroring how getAttributes() limits by name. Old attributes
        // are keyed by name, so filter on keys, not values.
        $returnData = array_intersect_key($returnData, array_flip($names));
    }

    return $returnData;
}

Simple little thing but super handy.

emoji as variable names…

…during a short discussion today with a peer we found this little tool: https://mothereff.in/js-variables . I was all like, AHA! I have an idea! Sadly, though, it turns out `(╯°□°)╯︵ ┻━┻` is not made of valid Unicode identifier characters.
:sad-face:. Then we got onto the "wouldn't it be cool" conversation of having emoji as variable names.

Imagine opening PHPStorm, etc., and seeing birthday-cake variable names! So I mocked something up real quick to pass around the office. Enjoy!
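For the curious, the flip-table fails because box-drawing characters like ╯ and ┻ lack the Unicode ID_Start/ID_Continue properties that JavaScript identifiers require; plenty of exotic letters do have them, though. A quick sketch of what is legal today (emoji, sadly, still excluded):

```javascript
// Unicode letters with the ID_Start property are legal identifier
// characters, so these are perfectly valid JavaScript:
var π = Math.PI;
var ಠ_ಠ = 'disapproval';

console.log(π);   // 3.141592653589793
console.log(ಠ_ಠ); // "disapproval"

// var 🎂 = 'cake'; // SyntaxError - emoji are not identifier characters
```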