The Importance of Interpersonal Communication (re-posting)

(This is a re-post from my blog post for Sourcetoad back in March ’17.)

As we tend to spend roughly 30% of our lives in close proximity to other human beings, it is important that we keep not just our professional skills sharp but also our communication skills. Being able to articulate what we mean clearly and completely is a cornerstone of good teamwork. Without good communication, no team can succeed.

When speaking with fellow team members, many factors come into play. On the speaking side, word choice, tone of speech, body language, and even facial expression can influence the receiver’s perception of what is being stated. Let’s take the example phrase ‘I am doing great today.’ Without context the phrase is simply a statement; say this phrase with a frown and it becomes sarcasm. Say the same phrase in a slow, tired voice and it could easily be read as irony. While these two examples are extremes, they help prove the point that it is not always what is being stated but how it is being stated.

Expounding on how a given statement is positioned: imagine that every day you arrive at your place of employment, every day you ask the same peer ‘how is your day going’, and every day that peer answers with ‘as good as it can’. While the statement itself is not very clear, the tone of voice, the energy used to state the phrase, and the body language of the peer are all context clues as to its true meaning.

As we become familiar with our peers we gain insight into the person: mannerisms, typical behaviors, and average attitude. Thus, when communicating with our peers, we can read the meaning behind their phrasing. However, this path of contextual knowledge and interpretation can take a large amount of both time and effort.

In my opinion, needing to gain intimate knowledge of an individual’s particulars is, in general, a waste of effort. I am not saying do not get to know your peers; the opposite, in fact. But when it comes to explaining abstract ideas, concepts outside our areas of expertise, or even humor, being precise and clear about the stated subject matter eliminates misunderstandings. I would like to believe I am not the only one who has made a statement only to have it misinterpreted and cause issues down the road. Being clear prevents repeated effort and the need to start over on whatever the given subject or discussion is.

In the end, it is important to remember that when we communicate with the people we spend upwards of ⅓ of our daily lives with, we need to try our best to be clear, precise, and as well-mannered as possible.

Going in circles…

…ever try to write something when your brain is not firing on all 8? I recently had all of my wisdom teeth pulled during the same visit to the dentist. During the recovery I tried to bang out some code. Needless to say, if you are not able to focus and concentrate, very little progress is made…

Day’s tip: Do not get overly dedicated to a task if your head is not in the game.

Yii2 AR SQL Verbs; Y-U-SO-WEIRD?

Yii2 being an active-record-style DB abstraction AND needing to support a wide range of database technologies meant building the standard insert / select / update / delete functionality into the Active Record model layer. While most of the implementation makes sense, some of it is not intuitive. Herein is a TL;DR of the 4 major commands and the *All() version where applicable.

 

select / selectAll (in Yii2 this is termed ‘find’)

\Class::find()->where({array of criteria})->one(); OR \Class::findOne({array of criteria});

\Class::find()->where({array of criteria})->all(); OR \Class::findAll({array of criteria});

 

insert / insertAll (termed ‘save’)

$model = new \Class({column values as array});

$model->save();

No saveAll() implementation by default; bulk inserts go through the query builder’s batchInsert() instead.

 

update / updateAll

$model = \Class::findOne({array of criteria});

$model->setAttributes({array of column => value pairs});

$model->save();

 

\Class::updateAll({array of column => value pairs for SQL SET}, {array of criteria}); note that the SET values come first and the criteria second.

 

delete / deleteAll

$model = \Class::findOne({array of criteria});

$model->delete();

Note: delete() is an instance method and takes no arguments.

\Class::deleteAll({array of criteria})

 

Bonus: Yii2 also has the ability to create database commands and bypass the Active Record abstraction altogether: Yii::$app->db->createCommand({string of SQL command})->execute().

 

Now, let’s look at the MySQL / MariaDB / MSSQL / PostgreSQL default implementation of the same actions:

select/all

SELECT {string of columns} FROM {string of source} WHERE {string of keyed values}

 

insert/All

INSERT INTO {string of source} VALUES {string of keyed values}

 

update/All

UPDATE {string of source} SET {string of keyed values} WHERE {string of keyed values}

 

delete/All

DELETE FROM {string of source} WHERE {string of keyed values}

 

See a pattern there? A string source and key/value data sets. And, minus SELECT, all follow the order of command, data source, then the key/value data. Even SELECT makes sense when you treat the field names as a string substitution: `select {needle(s)} from {source}`.

 

It is not that AR is bad, nor that Yii2 lacked effort. Trying to support the feature set of multiple database technologies, active record, the query builder, best practices, security, and community requests is daunting, hands down. Maybe I’ll write a package to normalize the base 4 verbs for MySQL/MariaDB…

 

 

Shameless Self Promotion: Presenting during Tampa Bay PHP’s May meetup!

Using Codeception for Acceptance testing

Tuesday, May 30, 2017, 6:30 PM

Sourcetoad’s new location
2901 W Busch Blvd #1018 Tampa, FL


David Eddy will present “Using Codeception for Acceptance testing: the crash course.”


 

Update: Video is up on YouTube at https://youtu.be/QXSP0bEpF4Y.

Always plan on something going wrong…

Anyone who has sat through the process of managing a project knows, wholeheartedly and as fact, that no matter how much something is planned and scheduled, something else will always go wrong. Believing this, one should never, ever, plan a timeline or effort estimation around everything working as planned. Planned and working are so far removed from each other that the relationship is vague at best. No matter how many times we build the same CRUD application, or data pipeline, or CI/CD process: always leave yourself some buffer time to correct unforeseen issues.

As a general guideline, the less I know about a requested change, the higher my effort estimation; almost geometrically. As an example:

  • 80 hours to write a single data object CRUD UI
  • 40 hours to write a single data object CRUD UI with defined fields & validation
  • 20 hours to write a single data object CRUD UI with defined fields, validation, and user story documentation
  • 8 hours to write a single data object CRUD UI with defined fields, validation, user story documentation, and visual mock-ups of the expected UX

Now, at some point I hit a floor where effectiveness simply cannot keep up with the continued reduction in effort. But at that low an effort estimate we are basically wizards doing magic as far as non-code people are concerned.

Some people will respond with ‘but estimates that high scare away clients’. This is very true. However, my response is this: do you really want to work with a client who has no clear answer or concept of what the result of the effort should be? Think `moving goal posts`, but without having a goal post to begin with.

TL;DR: Always buffer your estimates, even if you know everything that needs to be done.

Let’s Encrypt on Amazon Linux

So after switching some domain names around I wanted to add a Let’s Encrypt SSL cert to the blog here. Simple enough, right? Log into the box, follow the instructions (https://coderwall.com/p/e7gzbq/https-with-certbot-for-nginx-on-amazon-linux) and that should be it? Nope; as always, an error occurred when running the

certbot-auto certonly --standalone -d davidjeddy.com

command. Turns out Amazon Linux does NOT add `/usr/local/bin` to the $PATH. So I instead moved the binary to `/usr/sbin` and all was well with the world.
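An alternative to relocating the binary is putting `/usr/local/bin` back on the search path. A sketch; persisting it via `~/.bash_profile` is an assumption about the login shell:

```shell
# Extend PATH for the current session instead of moving certbot-auto.
# To persist, append the same two lines to ~/.bash_profile (assumed Bash).
PATH="$PATH:/usr/local/bin"
export PATH

# Verify the directory is now searched.
case ":$PATH:" in
    *":/usr/local/bin:"*) echo "PATH ok" ;;
    *)                    echo "PATH still missing /usr/local/bin" >&2 ;;
esac
```

Either approach works; extending $PATH just avoids touching the binary itself.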

A couple of minutes later I was in the nginx config adding the cert; a quick restart and away we went into the great beyond of encrypted awesomeness.

Why basic data structure knowledge is important.

Today while working through an issue ticket we came across a situation where console.log() was printing out a rounded variation of an integer. The server response payload was correct and the client was receiving the correct payload. But when the data was processed, a rounded variation of the data point was returned.


After a short amount of digging we found that my initial hunch was close: JavaScript stores all numbers as IEEE 754 double-precision floats, so integers larger than 2^53 - 1 (Number.MAX_SAFE_INTEGER) silently lose precision. Basic data structure and computer science knowledge is, in fact, useful to be familiar with.
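The cutoff is easy to demonstrate in a Node or browser console; a quick sketch (the oversized literal is only an illustration):

```javascript
// JavaScript numbers are IEEE 754 doubles: integers are exact only up to
// 2^53 - 1, which the language exposes as Number.MAX_SAFE_INTEGER.
const safe = Number.MAX_SAFE_INTEGER;       // 9007199254740991

// One past 2^53 cannot be represented; the literal is rounded at parse time.
const tooBig = 9007199254740993;
console.log(Number.isSafeInteger(safe));    // true
console.log(tooBig === 9007199254740992);   // true: silently rounded

// Common workaround: have the server send large numeric IDs as strings.
const id = "9007199254740993";
console.log(id);                            // full value preserved
```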

References:

http://stackoverflow.com/questions/1379934/large-numbers-erroneously-rounded-in-javascript

http://www.ecma-international.org/ecma-262/5.1/#sec-8.5

Refactoring array index magic numbers with class constants.

During a code review my peer Michal Mazur turned me onto the following pattern, and I have to say I am really digging it: array index as constant.

In my specific case here is the usage:

if (!empty($paramData[19]) && is_string($paramData[19])) {
    $paramData = $paramData[19];
} elseif (empty($paramData[19]) && !empty(isset($paramData[17])) && isset($paramData[17])) {
    $paramData = $paramData[17];
}

…and the refactored logic:

if (!empty($rowData[$this::FULL_SOURCE]) && is_string($rowData[$this::FULL_SOURCE])) {
    $imageSource = $rowData[$this::FULL_SOURCE];
} elseif (empty($rowData[$this::FULL_SOURCE]) && isset($rowData[$this::SQUARE_SOURCE])) {
    $imageSource = $rowData[$this::SQUARE_SOURCE];
}

While a bit more verbose, the logic is much easier to read using the contextually named constants.
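For reference, a minimal sketch of the class side of this refactor. The class and method names are hypothetical; the index values 19 and 17 come from the original snippet:

```php
<?php

// Hypothetical class illustrating the "array index as constant" pattern.
class ImageRowMapper
{
    // Named indexes into the raw row, replacing the magic numbers 19 and 17.
    const FULL_SOURCE   = 19;
    const SQUARE_SOURCE = 17;

    // Returns the best available image source from the row, or null.
    public function imageSource(array $rowData)
    {
        if (!empty($rowData[self::FULL_SOURCE]) && is_string($rowData[self::FULL_SOURCE])) {
            return $rowData[self::FULL_SOURCE];
        }
        if (empty($rowData[self::FULL_SOURCE]) && isset($rowData[self::SQUARE_SOURCE])) {
            return $rowData[self::SQUARE_SOURCE];
        }
        return null;
    }
}
```

Renaming a constant is now a single-line change, and every usage site stays readable.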

PHP and large CSV’s…

After looking around a bit I have yet to find a way to read a specified line from a file without doing one of the following:

  1. fopen()
  2. looping every line until reaching the desired line

Desired functionality:

  1. fopen($file)
  2. an imaginary function call `fgetcsvline($lineNumber)` that returns the contents of line $lineNumber

 

Anyone know a solution for this?
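One stdlib-only approach is SplFileObject, which can seek() to a zero-based line number; with the READ_CSV flag set, current() hands back that line already parsed. A sketch (the sample file is made up for the example):

```php
<?php

// Write a small sample CSV so the sketch is self-contained.
$path = tempnam(sys_get_temp_dir(), 'csv');
file_put_contents($path, "id,name\n1,alpha\n2,beta\n");

$csv = new SplFileObject($path);
$csv->setFlags(SplFileObject::READ_CSV);

// Roughly the imagined fgetcsvline(): jump to a line, get it parsed.
function csvLine(SplFileObject $file, int $lineNumber): array
{
    $file->seek($lineNumber); // zero-based; hides the loop from the caller
    return $file->current();
}

print_r(csvLine($csv, 2)); // the third line, parsed into an array
```

Worth noting: seek() still walks the file internally, so this only hides the loop rather than eliminating it; for true random access you would need a line-offset index.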

 

Update: Derp, how could I forget about ‘The League’? http://csv.thephpleague.com/

PHPStorm 2017.1 released…

I’m sure many of you know this already, but JetBrains has released PHPStorm version 2017.1. The improvement list looks nice and I’m eager to try ’em out.
Get the full details over at the official release page: https://blog.jetbrains.com/phpstorm/2017/03/phpstorm-2017-1-is-now-released/

Of special interest is the Codeception and PHPUnit 6 support, as I am a big personal supporter of automated testing processes.