Featured

Some backstory and Care Plans

The Short Backstory…

from DOS to (An)Droid

I’ve been plugging away at computers since Microsoft introduced MS-DOS. During the DOS days I worked through technologies supporting Novell networks and Paradox databases. Along the way, Windows 3.11 came on the scene and killed Paradox: Paradox for Windows was very slow, MS Access had matured enough, and I made the switch.

Software and coffee…

A new gig with a crazy programmer from Quebec.  I learned that F***-All was a useful expression when dealing with ObjectStar, especially after several Quad Espressos.

Thankfully the gig was translating ObjectStar into DTS packages.  The transition to Microsoft Server, SQL Server and DTS was a marked improvement and called for more Espressos.

After Y2K…

I thought my COBOL days were done and I could focus on moving MS Access databases to SQL, along with VB and VBA. When the choice to move to .NET presented itself, VB was traded in for C#, HTML and JavaScript.

DOT.Bust happened.

Alcatel decided they didn’t need any more fiber optic cable from Oregon. Great job, great boss, and then the dotcom bubble went bust.

The core of the job was a deep dive into enhancing the performance of large stored procedures handling large amounts of data. Submerging fiber optic cables required mapping the construction of the cable.

The makeup of the cable, where the outer armor changed or splices occurred, had to be precisely recorded. This data was generated by several machines within the assembly line. The records had to be accurate.

The data was imported into several SQL databases through DTS packages. The data had to be processed and staged into the reporting database. While not quite a data warehouse, it was about three-quarters of the way there.

If an error occurred in the circuit, the ship going out to repair it needed accurate information.

Most of the improvements, code-wise, came from reducing and de-cursor-fying lengthy stored procedures. The procedures tended to fill the transaction log, so the changes improved both performance and transaction log management.
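The shape of that fix is worth a sketch. This is not the original production code, just a minimal illustration with hypothetical table and column names: swap a row-by-row cursor for a set-based update done in batches, so each batch commits on its own and the transaction log gets room to breathe between batches.

```sql
-- Hypothetical sketch: batch a set-based update instead of cursoring
-- row by row. Each iteration is its own transaction, so the log can
-- be reused between batches instead of growing through one huge one.
DECLARE @BatchSize int = 50000;

WHILE 1 = 1
BEGIN
    UPDATE TOP (@BatchSize) seg
    SET    seg.Status = 'PROCESSED'
    FROM   dbo.CableSegment AS seg   -- hypothetical staging table
    WHERE  seg.Status = 'PENDING';

    IF @@ROWCOUNT = 0 BREAK;         -- nothing left to update
END;
```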

DB2 to Data Warehouse in three (easy?) steps

The dot-com bust left the employment landscape fairly barren for new web technology. COBOL came back.

The web was dead, long live the web

Business stopped focusing on the web front, and some looked to modernizing their DB2 data into SQL databases. While the web market may have taken a hit, the supporting technologies were still being deployed, and data warehouse technology was rapidly improving.

A medical company needed to convert COBOL reports against DB2 to Crystal Reports and SQL Server. It involved ETL from DB2 to SQL, stored procedure development, and designing Crystal Reports with in-line SQL.

COBOL formed the bridge into building a prototype data warehouse for the Oregon Department of Transportation using the “new” Microsoft server technologies: SSRS, SSIS and SSAS.

Who would have guessed that using Lotus 1-2-3 to generate business reports would lead to a data warehouse. Along the journey, the need to understand and use web technology increased. Most of the projects had a web application component. The new frameworks and responsive design have made the web interface more flexible than ever. The border between application and website has blurred considerably. More components…more devices.

The Web of Devices

The smartphone has brought both the web and instant communication to a huge audience. For many, the only access to the web they have is through their phone or tablet. This landscape continues to grow. Your smart watch records your biometrics, which talk to your phone. Your phone then talks to the “application” in the cloud. From work, you log into the secure website.

Technology continues to change rapidly…

Perhaps your site uses a widget that gathers user data. Besides needing to look good on a phone or a monitor, the site now includes external code. Managing websites means looking after all the parts and pieces. Technology continues to change rapidly, and that pace drives the pace of updates to the components that make up your site.

That requires a care plan…

What better way to go through a care plan than to take a basic site and add all the pieces that create a solid framework to support it. A large portion of websites are powered by WordPress, so that was the choice for this site.

Setting up the site

Behind the scenes, this site is a WordPress installation hosted on GoDaddy.com. The following focuses on individual site management. WordPress sites are built from retail and custom themes, and each theme can employ any number of widgets. The point is, all of these “parts” need to be maintained and backed up.

Step One – Create and update (repeat weekly)

WordPress publishes updates frequently. The Updates screen on the administrator’s dashboard shows the latest release notifications.

Administrators Desktop WordPress Update Status

This screen will be visited a lot throughout the life of a site.  Staying current keeps your site more secure.

Next…Plugins

Plugins add features.  That is one of the attractions of choosing a framework like WordPress.  The selected theme came with plugins already attached and set to inactive.

Default Plugins

After reviewing each, the initial choice was:

Akismet and Limit Login Attempts

Anti-spam was a given. Hello Dolly needs more review, and limiting login attempts is a step toward hardening the site.

Top or Bottom

Because there are only a few plugins so far, the two “Apply” buttons, one at the top and one at the bottom of the list, may seem redundant. Both do the same thing, which is convenient when the list gets long.

Set up Anti-Spam account

Once the plugins have been applied, they need to be set up.  These steps will also become part of the Care Plan.  Each plugin may have additional requirements or provide important information that needs to be reviewed.

Attaching Akismet to the site

This installs plugin code into your site.  This code will need to be updated on a regular basis.  How often?  We’ll find out.

API key assigned

Once installed, you are issued an API key, a secret key for your site. Anytime you are presented with credentials, even for a service you may only log into once, I have found it best practice to store a copy in a secure credential store that is backed up. There are many products that provide this. I use KeePass 2.

KeePass 2 UI

I set up a folder to hold the entries for the site.  Now that I have the API key, I set up a new entry to hold the key.

New Entry
  1. The Title – clearly identify the service.
  2. User Name – use it to further identify the service.
  3. Password – store the API key from the Akismet site.
    Show Password
  4. Click on Show on the Password field, paste the new API key and save the entry.
  5. Using the API key, log into Akismet and update your settings.
    Akismet Final

More plugins and settings to come in Part II.


Data Collection Part Deux

More of the back story – technology

SQL Server, Windows Server, AS/400s, DB2, desktops, Oracle and Teradata, plus others I would have to dig out of archived resumes. The security of the data was always critical in all of these various systems. Data portability is problematic when the data being ported is unencrypted. WebMD employs a high level of analysis to make sure the data output is fully de-identified. HIPAA regulations require this analysis as well as ongoing HIPAA training.

2015 Data breaches from the HIPAA Journal

From the operations perspective, there are technical layers employed to secure information internally. As with any structure, every opening presents a weak point in the wall. Often overlooked is the additional door created by the technology used to make those openings, a fancy way of saying security backdoors. Hackers attack both: brute force focused on the front door, and direct attacks on the security layer.

Prior to the additional security the cloud now provides, one project required an MS SQL Server to be placed outside in the DMZ. This was a new requirement placed on the company due to a change in regulations. At the time, I was the programming manager, SharePoint Administrator and DBA. Small companies, you wear a few hats.

Well, the server was placed in the DMZ with a secure encrypted connection over a VPN, custom ports, and a private identity. Even using the best practices of the time, the server still took abuse. As part of the disaster and recovery plan, monitoring of the exposed areas was included, with both weekly and threshold alarms.

The structure of the VPN required using an approved interface that ran on an Apache server running Java. SQL 2012 was the state of the environment, with 2014 on the upgrade path. The connection between the Java interface and SQL was a dedicated user account through the VPN. The process also involved a combination of listeners scheduled through server jobs and SSIS packages that fired when content arrived.

The structure of the VPN exposed a vulnerability: the SQL instance was required to expose a public login, and that port was a main point of attack. It was interesting to review the security log, especially the login attempts. The variety of user and password combinations was impressive and, at times, amusing. Every attack started by hitting the manufacturer or provider default administrator passwords.
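For flavor, here is the kind of check that fed those reviews. It is a sketch, not the original monitoring job; xp_readerrorlog scans the SQL Server error log, which is where failed logins land when login auditing is enabled.

```sql
-- Sketch: scan the current SQL Server error log for failed logins.
EXEC master.dbo.xp_readerrorlog
     0,               -- 0 = current log file
     1,               -- 1 = SQL Server log (2 = SQL Agent log)
     N'Login failed'; -- search string
```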

Every device exposed to the net gets hit with default attacks.  I shudder to think about all of the personal wireless routers out there still with the default passwords.  Open to both data and bandwidth theft.

Securing Data before it leaves the barn…

We employ a variety of layers to the data before we send it through the pipe. The data is encrypted, and the connection is encrypted and routed through a VPN. This sounds like a secure arrangement, and it is. Once the data is in the pipe, it’s out of reach. The remaining weak point in this scenario is the endpoint: what happens to the data after it arrives often puts it in a less secure environment.

The endpoint is where the information gets used, input and processed. It may start out within a secure application and then get downloaded as a CSV to be used in an Excel report. As the data moves further from the original secure source, it becomes more vulnerable and more likely to be mishandled or lost. One copy? That’s easy to track. Multiple copies of multiple time frames? Much harder.

So if the data gets out, can the impact be minimized before the information is even exposed? Leaving behind a USB drive happens. They are small and unlabeled. Who needs a label? You just plug it in, right? Leaving a report on a bench or a USB drive at a convention: data gets exposed. Protect it before it gets out.

De-Identification of Data

This is the first area to inspect when releasing data. What is the minimum information needed to satisfy the request? If it is not needed, do not send it. Too often the “client record” is exposed with all its fields. What happens when the output requires specific census details? This is where de-identification plays an important role. What about when the endpoint needs all the information? Using de-identification and minimizing core data transmission also helps with endpoint responsiveness.

The minimum information needed to satisfy the request

The business or need drives what data needs to be presented.  Here is a scenario that presents some of the points to consider:

You need to take home customer records to prepare a report. You have access to the following data structure:

Tables Person connected to ContactDetails on PersonId, AddressToContacts on AddressId
The customer record

These tables were chosen because they contained all the information needed.  The query for the report ended up as:

Sample Query

Pretty straightforward set of data.  The “PersonDetails” table was left out because that information was not needed.

Top segment, Person data

The first question to ask about this segment: what do we need for the report? Is granular information like the name needed, or would the PersonId be sufficient? What’s the big deal? Name and address are in the public domain, so why not include them anyway? In this data set, both the address and the person are indeed public. What isn’t in the public domain is how much rent the person is paying, or that they are associated directly with the address.

Using the PersonId instead of the census details preserves the anonymity of the relationship. Basically, de-identification boils down to removing data points that can lead to discovery through reverse engineering. A medical data report can end up violating HIPAA using generic types like medical condition, employee type, and location. That seems OK; no names are used. It is, until there is only one employee of that type at a particular, small location. By cross-referencing the public company directory with location and employee type, you discover that person’s condition. The granularity of the data is inversely related to population density.
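To make that concrete, here is a hedged sketch of the two versions of the pull. Only the table names and the rent detail come from the scenario above; the column names are invented for illustration.

```sql
-- Identified version: exposes the person-to-address-to-rent linkage.
SELECT p.FirstName, p.LastName, a.Street, a.City, c.RentAmount
FROM   dbo.Person            AS p
JOIN   dbo.ContactDetails    AS c ON c.PersonId  = p.PersonId
JOIN   dbo.AddressToContacts AS a ON a.AddressId = c.AddressId;

-- De-identified version: the surrogate key stands in for the census
-- details, so a leaked report does not tie the rent to a named person.
SELECT p.PersonId, c.RentAmount
FROM   dbo.Person         AS p
JOIN   dbo.ContactDetails AS c ON c.PersonId = p.PersonId;
```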

De-identifying data obscures specific identifying markers, and limiting what is sent down to the minimum needed also benefits performance.

Minimizing re-transmission of core data

Even in a secure private intranet, minimizing the data that gets sent, cached and used in subsequent transactions is important from a performance perspective, especially in a wireless environment. Wireless is a requirement today, which raises the priority.

It is a given that a certain set of data is needed to complete the transaction. This data set may be relatively dense, such as an application profile. More often, though, the flow is an initial large input followed by incremental updates that affect the state of the object.

Transactions impact the state of some objects continually and others little or not at all. These slowly changing facts and dimensions should be cached when accessed the first time. For example, an address detail record is unlikely to change during the session, but it is referenced several times throughout the workflow. Cache it at the beginning of the workflow and release it at the end.

What is needed between workflow steps? This slowly changing information is needed for the workflow but is not impacted by it. Unless a workflow step needs the details for its transaction, the only data point required is the clustered index or surrogate key that points back to the detail. Using data pointers instead of the full record reduces the surface area that can be exposed.

The minimized record results in a very small packet that improves overall responsiveness. Large, frequently accessed BLOB objects like images should all be cached at the beginning, even if the page that needs them is several steps into the process. This keeps workflow performance responsive. It is also key to effective database interaction.
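Here is a sketch of what that looks like at the database end, with hypothetical names: the workflow step carries only the surrogate key, while the detail record itself stays in the cache.

```sql
-- Hypothetical sketch: a workflow step passes only the surrogate key
-- pointing at the cached address detail, not the full address record.
CREATE PROCEDURE dbo.RecordDeliveryStep
    @OrderId   int,
    @AddressId int   -- data pointer back to the cached detail record
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.DeliveryTransaction (OrderId, AddressId, RecordedAt)
    VALUES (@OrderId, @AddressId, SYSUTCDATETIME());
END;
```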

Data…data, but who gets the tables?

I’ve worked my way through development and operations throughout my career. Developers want to touch everything, and DBAs want nothing touched unless they do the touching. Where there are conflicts, solutions emerge. Today, it is an open market for choice. The focus here is on a relational database model and object-oriented programming using SOLID.

The ability to extend feature support dynamically and add additional core objects were requirements. What is the minimum data necessary to represent the core object? Of those data points, which items are static, which may change, and how often?

Take associating a phone number with an individual.  Then add an email address and perhaps a physical address.

We know a person may have one or more phone numbers and email addresses. These items change through updates, deletes and adds. When you modify a database table to extend support for a feature, the effort is costly in both expense and time. Mainframe systems used wide, flat tables with fields for each attribute. Relational systems remove the duplication.

Back to minimizing the structure. It needs to be extensible, and the core objects also need to be flexible enough to support new classifications. Where else are the phone, email and address attributes used? The address attribute could easily be associated with other core objects, so the address object gets centralized and mapping tables are created to record the relationships. A mapping table keyed on the clustered indexes of both tables provides an optimized filter to focus on the union between the record sets.
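A minimal sketch of that arrangement, with hypothetical names and columns:

```sql
-- Centralized address object plus a mapping table that records the
-- person-to-address relationship on the clustered keys of both sides.
CREATE TABLE dbo.[Address] (
    AddressId int IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,
    Line1     nvarchar(100) NOT NULL,
    City      nvarchar(50)  NOT NULL
);

CREATE TABLE dbo.PersonAddressMap (
    PersonId  int NOT NULL,
    AddressId int NOT NULL REFERENCES dbo.[Address] (AddressId),
    CONSTRAINT PK_PersonAddressMap
        PRIMARY KEY CLUSTERED (PersonId, AddressId)  -- optimized filter on the union
);
```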

The final step is to define a dynamic object model to support the extensibility requirement.  We have to store specific data type values and metadata to identify what that value represents and its state.

Stepping through the underlying dynamic model: entities are composed of attributes storing data of a specific type. These attributes may change or be added to. The stored information can be represented as virtual tables through views.

Object names (entities and fields) are stored with additional metadata in the label table. Each label stored is qualified by a label type. In addition to name and description, a label state replaces delete: once a record is instantiated, it can only be deactivated, and that state applies forward from the time it was changed. The state change is recorded in the transaction warehouse. Attribute values that change throughout the life of the object also need to be reflected in the transaction warehouse. This transactional storage model applies to all structures in the design.

The label type table plays a more significant role and begins to build up the model. The type defines what kind of data the label represents. The type table also supports hierarchical grouping; this second piece is part of the organizational structure defining where the label fits in the data model.

These two elements are mapped using a mapping table as well; using the mapping model preserves extensibility. Next, the definition of the value the label represents is needed. This is separate from how the value is stored. The definition tells the requester how the stored value should be parsed. Additional filters such as data ranges, value types, and access restrictions can be applied to the definition, which helps enforce data integrity.

Finally, the information is stored in a transaction log that reflects current state. The transaction log stores a reference to the label, type and definition records. It also contains fields that act as buckets for specific value types, covering all types used by the application except the max fields; those were stored in a separate structure to keep the transaction log performant.
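Pulling the pieces together, here is a hedged sketch of what such structures might look like. The names and exact columns are invented; only the shape follows the description above.

```sql
-- Labels qualified by a hierarchical type, plus a transaction-style
-- value store with typed "bucket" columns.
CREATE TABLE dbo.LabelType (
    LabelTypeId  int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    Name         nvarchar(50) NOT NULL,
    ParentTypeId int NULL REFERENCES dbo.LabelType (LabelTypeId) -- hierarchical grouping
);

CREATE TABLE dbo.Label (
    LabelId     int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    LabelTypeId int NOT NULL REFERENCES dbo.LabelType (LabelTypeId),
    Name        nvarchar(100) NOT NULL,
    Description nvarchar(400) NULL,
    IsActive    bit NOT NULL DEFAULT (1)  -- state replaces delete
);

CREATE TABLE dbo.AttributeTransaction (
    AttributeTransactionId bigint IDENTITY(1,1) NOT NULL PRIMARY KEY,
    EntityId     int NOT NULL,            -- the owning entity instance
    LabelId      int NOT NULL REFERENCES dbo.Label (LabelId),
    IntValue     int NULL,                -- typed buckets: one per row is used
    DecimalValue decimal(18,4) NULL,
    StringValue  nvarchar(400) NULL,      -- max types live in a separate table
    RecordedAt   datetime2 NOT NULL DEFAULT SYSUTCDATETIME()
);
```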

Using this structure, object views are created to represent the entities defined in the structure tables. When new features or updates require changing the structure, it’s accomplished by adding or updating records in the definition tables and adjusting or adding views. All labels are exposed for easy translation to support regionalization.
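And a sketch of one such object view, again with hypothetical label names. Extending the model means inserting label rows and adjusting the view, not altering base tables.

```sql
-- Pivot attribute rows into a virtual "person contact" table.
CREATE VIEW dbo.vPersonContact
AS
SELECT v.EntityId AS PersonId,
       MAX(CASE WHEN l.Name = N'PhoneNumber'  THEN v.StringValue END) AS PhoneNumber,
       MAX(CASE WHEN l.Name = N'EmailAddress' THEN v.StringValue END) AS EmailAddress
FROM   dbo.AttributeTransaction AS v
JOIN   dbo.Label                AS l ON l.LabelId = v.LabelId
WHERE  l.IsActive = 1
GROUP BY v.EntityId;
```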

Separating the application from the tables

Databases are built to store and maintain data; their native user interface is a query window with SQL script. The application presents the data transformed for the workflow needs and user requirements. This transformation occurs at the database API layer, composed of stored procedures that support atomic transactions. Optimally, the process is designed to support asynchronous transactions.

At the interface layer the procedures can be single purpose or designed to support overloading. Typically it is a mix, with single-purpose procedures reserved for specific critical tasks. Cached data is used to maintain any static endpoint detail. The message packet is comprised of the key map that identifies the values representing the update, and the procedure processes the data based on the key signature submitted.

The workflow determines whether the application waits for a return message from the procedure indicating the transaction has been recorded. This return message is typically the updated state of the information, supporting the workflow’s confirmation process or supplying data required for the next step. The parameters for the procedures, along with their names and descriptions, support discovery. The object definition model also carries down to the supporting procedures, allowing them to be generalized.
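A sketch of one such interface procedure, hypothetical names again: the caller submits the key map plus the changed value, and the procedure returns the updated state for the workflow’s confirmation step.

```sql
CREATE PROCEDURE api.UpdateContactPhone   -- hypothetical API-layer proc
    @PersonId    int,                     -- key map identifying the record
    @PhoneNumber nvarchar(25)             -- the changed value
AS
BEGIN
    SET NOCOUNT ON;

    UPDATE dbo.ContactDetails
    SET    PhoneNumber = @PhoneNumber
    WHERE  PersonId = @PersonId;

    -- Return the updated state for the confirmation step.
    SELECT PersonId, PhoneNumber
    FROM   dbo.ContactDetails
    WHERE  PersonId = @PersonId;
END;
```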

Now, the rest of the story…

We have devices that store gigabytes and terabytes of information. We try to send large objects around without worry, only to find that our email program rejected them. Security doesn’t mean sacrificing performance. Reducing your data footprint is a good thing: the less you send, the faster it goes.

Wireless is the way we mostly connect, and wireless networks are more limited. Sending less in each message and caching large objects keeps traffic off the network. The data can be stored and retrieved in a de-identified format to support security and comply with regulations around PII and PMI.


EULAs, My Phone (iPhone, Android, and others) and Google

Layers upon layers, the data we share

We hear a lot about security this and security that.  Big companies letting loose big sets of our personal data.  By now, your data is likely on the dark web.  You might want to look into that.  New dark web protection companies are popping up.

We install special programs and use secure sites to protect our information. Yet when it comes to our phones, we give away the farm on personal data.

It’s more than that, however. Where you share your story is 90% of the battle to keep your private life private.

EULAs

I came up with the title for this post about a week ago.  Privacy, Personal Information and Personal Habits are all up for grabs.

What authorizes these applications to collect this information? EULAs or End User License Agreements are where it starts.

Service providers may have to give you the option to “allow” this collection.  It seems that often, opting out means…

<with a funny movie style Italian bad guy accent:  “you don’t get it, capiche?” >

Apps are asking for more data. Fast food, insurance (Snapshot, anyone?) and more are asking us to grant access to private data. We already have OnStar, Google and others collecting our location data. Automotive gadgets, anyone?

Everybody is in on it. The cellphone provider, internet provider, applications and search engines. All hungry for data.

The Data Collection story intro:

Data collection is running rampant. Data by itself may have no value; it may even have a cost associated with it. Value is determined by the information the data provides. Write down a list of numbers from 1 to 10 and you have, wait for it, a list of numbers. Pick any number and you get a number. Add a name next to each number, and now that number is still a number, but also the record id of the name next to it. As the complexity of the dataset grows, the value of the information increases.

Minimizing Your Footprint

First Pay in cash…

…and leave your phone at home or take the battery out.  Maybe not your thing but it reduces your footprint on the grid.

We have all been using “applications” for a while now.  We are conditioned to “click yes” because if we object to the license, we do not get the use of the product.

Pay by phone?

With every transaction, your financial institution and others gather all those tidbits you clicked yes on…

  • Your GPS location
  • What establishment
  • Items purchased
  • How much was spent

…are all gathered up.  Don’t forget that if you used an application, you are providing additional details about yourself.

An example would be walking into a fast food restaurant and paying in cash.  The data collected?  Not much.

For the restaurant:

  • Walk-in customer
  • Meal #3 sold
  • Amount paid
  • Payment Type: Cash, Debit or Credit Card

Using a Card Adds:

  • Your name
  • Card number
  • Amount spent
  • Financial Institution
  • Approval Status

Finally the Financial Institution receives:

  • Amount Spent
  • Retail or Service Establishment.

Here, what data can be collected is controlled by your agreement with the financial institution. Now add in your phone. Most of the installed applications are gathering data all the time. For example, say a competing fast food restaurant’s application is also installed on your phone. Looking at the data collected by their application, they can see that you are visiting their competitor.

How? Allowing the application access to your GPS data may give you better deals and location sensing. This setting usually gives the user a better “in store” experience. A grocery store chain’s application maintains local store maps that are geo-integrated: look up a product and you know the aisle number and aisle location (odd, even, row and span). In a hurry for dinner items at a store you don’t know? The location assist is a useful feature.

From the business side, this is strategic data, and it refines market targeting to a finer grain. From the healthcare side, that “health application” you installed likely requested location access. Often, allowing access adds a desirable feature for client access.

Companies are already using personal employee health records to improve employee health as part of an overall health and wellness strategy. Usually, companies doing this provide company phones to their employees, with the “Health Application” already installed and configured with company settings.

Data collection is a large topic and will be the subject of future blogs. Some simple strategies to reduce your footprint?

  • Don’t install the application

Be choosy.  Do you really need the latest iPay, QuickPay or Deal of the Day feature?

  • Only click on the “yes” for Allow Location when you are using the application and then click it off before closing it.

This helps, and for many of the applications it dramatically slows down the collection. It does add extra steps, though. Extra steps or one-click convenience? You decide.

Google and Facebook are the great collectors of data, and they depend on it for billions in revenue. Amazon and Microsoft fill out the field. The current internet is analogous to the network television dominance of the old guard, and it has significantly changed how we obtain services, products, discounts, and so forth. Finally, add in all of the devices and their various providers. Data, data everywhere.

What’s interesting about the new “information class” is that it is finally a two-way data stream between company and customer. This is the ultimate in marketing, and we, the day-to-day users, are the product that gets sold. To our benefit, we get greater convenience: groceries delivered to our door, now complete with pre-measured ingredients and recipes. We got the text that the refrigerator had placed the refill order automatically.

In the first place, I really don’t want someone else picking out my food. Yes, yes, a hollow argument, as the food has already been “handled” many times. Yet touching, looking at and smelling the food I am going to eat is important to me. I digress. We give up data all the time in order to use the latest or get the best deal. Sometimes that trade is warranted. On the whole, probably not.

EULAs impact you every time you say yes. Your phone is an angel and a demon on your shoulder. The phone is the data collection device that those “yes” answers enabled. Even a phone without a SIM card collects data. That GPS that keeps you from being lost? It tracks every step you take. Hmmm…sounds like lyrics…

My Phone

I remember the days when my phone was in my home. It had an answering machine, so I wasn’t completely in the dark ages, but calls would come in, be recorded, and I would hear about them when I got home. My contacts knew to flag their message as a priority if it was important. Many of the messages were handled later in face-to-face conversations. You would get messages or calls at work.

Then came the beepers. To some they were a blessing, and to others a curse. A beeper going off meant a “now” event. The definition of the event depended on the person and the role the beeper played.

Mobile phones took the pager’s place. Now, cellphones and tablets often take the place of a personal computer.

Your phone is a data collector. Where you have been, what you bought, and even your credit rating are constantly being collected. This is on top of all of the personal information you’ve added to your phone.

Every application added increases your digital footprint

Pay by phone?

Talk to a device in your home?

Data, data and more data. I get it: convenient, secure, and fast. Those parts are great. It’s the other data that’s a problem. The solution? Pay cash. What’s paid in cash stays in cash.

Your phone is an advanced computer. Any IoT device you add is also a computer and a data collector. What you add is your choice. Read the EULA at least once. There may be data ownership questions you’ll want legal advice on.

Google, Facebook et al

Are the Network 23s of today (ref: Max Headroom). Algorithms and market share are their goals; predictive analytics and AI are the tools. Big data solved the collection problem. End-user tools were created to handle the large sets of data to be analyzed. The analysis led to better filters and predictive analytics that turn the search bar into one-stop shopping for anything. Convenience.

Google and its like are cultivators of data. We, as consumers, are the products, providing a wealth of data, largely without our knowledge, because we already gave our consent. Hey, we wanted that hamburger, fries and drink for three bucks. Cut the line with curbside delivery. Convenience.

You think Google is just a search engine. It is so nice to just enter “sourdough recipes” and get the results we want. In that case, you likely will; that is what we want Google to do. With all the technology underneath, the data is filtered down to what we want to see.

It’s a wrap

A brief dive into areas I feel are important to explore and understand. As consumers of these services we constantly share our personal information, often information we are not aware of. Something as simple as your fast food choices and patterns may one day impact your health insurance. I can see it now, HealthCo:<insurance company>ProDiet… “save on health insurance today. Connects to most grocery and restaurant applications. Don’t forget to take advantage of our ‘All Organic‘ and ‘Health Diet‘ discounts. Restrictions apply. Check your policy for details.”

The barrage of requests has desensitized us to the importance of the agreements we sign. When a computer program is installed, you sign or accept the license agreement. Did you read it? Websites function under a similar model. The applications on your phone? Same deal. Usually these agreements are long and legal. They have to be: you are releasing information that is your right to keep private. Controlling what information you do and don’t release allows you to manage your internet profile.

Azure Touch and Windows 10

A new paradigm, yes? No? Well, that depends on your point of view…this is mine.

My older PC is giving up the ghost. Sure, it still provides value, but reliability is key in what I do. What do I do? I design software from an object approach. So a reliable computer is key…and the inspiration for this blog.

So, what could a new computer do for me?

Well, I hear that Windows 10 is tuned for the cloud.

My company’s product is tuned for the cloud.

Guess the Azure interface is the next step…so I’ve been ploughing through with older versions of Windows. Granted, it’s 8.1, but 10 is a new paradigm. Touch, keys…ummm

For those of us keyed into Ctrl-key combos…MS, please keep ’em around.

Onward….

OK, using the Azure portal, Data Factories and all, I have a rather large table structure to propagate into my Azure DB with a daily incremental change using Data Factory.
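For context, the usual shape behind that kind of daily incremental load is a watermark query. A sketch with hypothetical table names, not my actual pipeline:

```sql
-- Hypothetical watermark pattern: each daily run pulls only the rows
-- changed since the last recorded watermark.
DECLARE @LastWatermark datetime2 =
    (SELECT WatermarkValue
     FROM   dbo.LoadWatermark
     WHERE  TableName = N'dbo.Orders');

-- Source query for the daily copy.
SELECT *
FROM   dbo.Orders
WHERE  ModifiedDate > @LastWatermark;

-- After a successful copy, advance the watermark to the newest row copied.
UPDATE dbo.LoadWatermark
SET    WatermarkValue = (SELECT MAX(ModifiedDate) FROM dbo.Orders)
WHERE  TableName = N'dbo.Orders';
```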

Designed to do it…

Not a problem…yes???

Herein lies the dilemma…

touch screen, mouse, all painful

If you are accurate with your finger, you have a chance of toggling a check box to checked, if you hit it just right.

Screens, accuracy, all play a role.

Perhaps a less restrictive response on a check box, where the default is: when selected, then checked = true. Having that in place would save me hours of effort.

So there you go…

OK, update…

If you are accurate with mouse placement, the double-click on the pad works as expected.


Databases everywhere…

In a galaxy far, far away there existed a unique application with a unique database. I do feel that the original incarnation of the application should be, at a minimum, in a reference book and perhaps the Smithsonian. The design came from a time when architects were not happy with what Microsoft had to offer and tried to do their own thing. On a side track was the push toward model-driven development and modeling business processes into applications.

Heh, I got to play in all of those environments and gathered a bit o’ knowledge from the experience. So, I had an antique application where enhancement wasn’t in the picture. It was based on a “bound” technology, and the only resolution was a complete re-write. I can hear the collective groans over the time this was going to take. Four years…that was the resource cost. Why?

Knowledge about the industry: medication management is a very important and critical process. It must be exact. That is the first criterion. Now let us add in the rest:

  • Regulatory
  • Compensation
  • Changing fact and measurement dimensions
  • Volatile environment

Regulatory:

This is a known known in the design wheel. Our product must comply with any regulatory requirement from CMS, the DEA, and even state and local governments and agencies. So we have to have a dynamic security profile that can be applied to any connection consideration as well as data I/O.


Compensation:

Here we have a changing environment where data analytics will play a key role in current and future compensation. Our data structure is designed around the object model that represents the business environment. There have been historical requests I was unable to respond to because of the legacy database. One of my main goals was to ensure all data entered was structured so that it was suitable for a data warehouse and analytical treatment. Ah, the joys of writing from the ground up.

Changing Fact and Measurement Dimensions

One of the challenges we had to consider was the constantly changing dynamics of medical reporting needs. CMS and others are pushing for quality care, and that has always been a major consideration in the design. The biggest question to answer? What design will allow easy extension in the focused area of change while maintaining a consistent database model? That is always a key focus. Database changes are extremely costly. The underlying physical database has to be:

  1. Isolated from the application
  2. Only accessed from the logical layer

Because of this, structures to manage the dynamic changes in the business requirements were engineered into the design. This allows us to add new values and fields without having to modify the database.

Volatile Environment

The last consideration I had to take into account was the constantly changing government requirements around healthcare. As we all know, this is still in a state of flux with varying requirements, so any system designed around it has to be flexible in adapting to new changes. This is where “from the ground up” gives us a real advantage. We were able to engineer around all of the current requirements, provide for future changes, and base it all on what is needed.

Summary:

We have built an application that will manage all the care administration a care facility will need, but also an application for anyone who needs to report on health care management. The scope of what we have to track has gone beyond “give 1 pill a day” to:

  • How is Mary Feeling Today?
  • Has Mary gone for a walk?
  • How much has Mary consumed in food weight?
  • How much has Mary’s excrement weighed?

As you can see from the sample questions, we can produce a lot of useful and interesting information.  That is just the tip of the iceberg.  There is a whole wealth of medication information that can be derived from the results of both the medication and vital measurements.

I am really excited that we are reaching the final evolution of the new application and that I can convert my existing customers over. We have implemented so many improvements and made sure the application has the flexibility for future enhancements with a short turnaround.

Lastly:

I have learned more about health care management, resident care and ongoing needs than I ever knew existed. We will all experience some form of elder care. If anyone I know goes under care, I want them to be under our system, because I know we will be monitoring all the key points, and the resident and their family will have an open pipeline for secure communication.

More to come… 🙂

Heh, the true life of a programmer…

True life? Can there be such a thing? Come on now, we’re talking about stuff here… OK, new font…


There are so many fun fonts but I digress…

WOW, creating a video takes some serious processor…

Again, I digress…

<queue the “Final Countdown”/>

Where was I…

The dreaded Estimate…

This is one area where there is no right answer. As professionals, we know that at the beginning of a project, there is no way in hell an accurate estimate is possible.

Unknowns… at this point, the unknowns you don’t know about are going to come out and get you quick…

Just saying, you’re working on a masterpiece for a sector that has a lot of the Man involved. I mean, “Government” would have worked, but really, each entity is a force unto itself… so “The Man” it is, and I liked it better.

It really should be “The Entity described as the Entity”.

Ah, object oriented programming, I capiche! Totally used wrong, I know. In fact, I think I’ll retract that.


Word just doesn’t hang with Italian slang… what gives… gimme back my font!


OK, back to the font of choice…you know, I remember a time when you could simply change the body font of the document and be done. Now it’s hidden backstage in document properties. Say that ten times fast… come on, you know you wanna…

DIGRESSION CHECKER!!!!

Wow, did I ever jump right off that estimate question.  Hey, it’s a really tough question.  I mean, there is always the FANTASY that everything will just fall into place. 

No equipment delays

Top notch programmers

Happy customers

A solid and reliable version 1 where upgrading will be a piece of cake…

<queue the Eagles, “The Long Run”>

ugh…

Switched to the Judas Priest station…

Needed high energy…

Back to the life of a prog…

Or this is my take…

Look, when I get into the code… it’s all there is.  The problem…the solution…

It’s the solution that gives me the juice. 

(boring Verdana… but consistent…once upon a time, I went to a career … hmmm consult? anyway, they had some good advice…juice…what drives you to show up, succeed?)

<Queue Ozzy>

I do wish my video processor would finish. It’s playing heck with my Bluetooth speaker…but wait, there’s a recall…buy something trendy, it’s what you get. It’s not made anywhere but shipped in pieces for final assembly <The Final Countdown> sorry, but that is another story.

All right, I give up…better get back to the topic. You see, that’s where this has been. It may seem scattered, but when you focus solely on an object to the exclusion of all else, the else gets…dinghy.

Yes, a small boat used to do tasks…for the BIG boat. Unless you’re the Tom Sawyer type and do your own thing.

Font dizzy yet?  Well, I’m not sure about the Judas Priest station… hmmmm.  KONGOS?

Ah, much better….


So finally, where does this all end up in the world of estimates? That’s right, I haven’t really told you a thing. We know that for a project to have life, it must have a release point. This is not a completion point; let me stress that. As programmers, we know a lot of different ways to accomplish things, even more when it comes to appearance and form factor. What we don’t know is what the customer will actually want, unless some whiz kid has come up with a customer-requirements mind reader with a future mod to adjust for customer feature drift. Customers want to change things. We love to change things. But next, in a funny thing happened on the way to the database…


Long time no write, time for some insights…

Been reading a lot of inspirational books lately, and you can add to that magazine articles that my best friend has been handing off. Hey, it’s a lot of good information. Sometimes it’s a Psychology Today article. Other times it’s a book about startups and running one.


I think the hardest job is to take a company from ground zero (or maybe ground negative) and turn the sucker around in a short time to catch the wave of opportunity that abounds in the industry you’re targeting.

Have I lost you yet? I often start in the middle of a subject then fill in the details. 😉


I think having watched “Pulp Fiction” too many times has impacted my ability to converse…

It’s all good.  Plus having a digression checker never hurt.  So please, just go with the flow, I’ll get it all covered by the end.

I love playing poker. Sure, it’s gambling, but it is also a game of skill and psychology. You can play your cards or you can play the person. If you play your cards, and do it well, you won’t be the big winner, but you should have a reasonable play. If you want to win, and win big, you have to play the player. You may have heard the saying that “any hand can win”. It’s all about strategy. Poker, Hold’em included, is all about playing the table. Controlling your responses comes first. Hey, if you can’t minimize your “tells”, a better player will take your bank. If you can use your “tells” to first establish a pattern and then change it up, you stand a much better chance of cashing in. The same thing applies to business. A great book on startup mentality is “All In Startup” by Diana Kander. This book portrays a really good concept, and it really does come down to being all in.


Are you committed? We all have outside commitments. Are you someone who can jump into a startup situation and deal with the requirements? There is a reason that most relationships don’t survive a startup: one or the other has to give. Can your significant other really deal with the fact that you’ll be working 18-hour days for a stretch? What, 18-hour days? You must be mad!!!! It’s a Mad, Mad, Mad world, buddies… 😉

Ok, here’s the why…

You’ve got a hot idea

It’s a hot market

The window is open only for so long

How do you meet those requirements?

Well first, back to “All In Startup” and why you should read it. It comes back to addressing that “hot idea”. The bottom line is that you may feel you have the killer app, but does it really address a potential customer’s Migraine? “Migraine”? Isn’t that just a headache? No, it’s a headache of massive impact. That is what you have to solve. Sure, your product may be the cat’s meow, but if it doesn’t address anyone’s migraine, it will not go very far. OK, you argue, hey, my app is hot and it will solve x. Everyone hates dealing with x, so my app is totally cool. I’m not saying that there isn’t a market for x, but that’s kinda like being the killer rock guitar player who is finally discovered in a back alley bar. It happens, but the odds are stacked against ya. It’s much better to be a mediocre player with connections… but that is another story. I’ve got them, but this is not the place. 🙂


Anyway, read the book. Its advice is beyond the scope here, but it is well worth it.

Add in “Hold Me Tight” by Dr. Sue Johnson. Now you are going to ask me why a relationship book is important in a business context. Everything is connected. Any relationship has all sorts of aspects, but the concept of establishing connections is key. You have to connect. This book will give you insight into the whole aspect of connections, plus where you might be missing the boat both personally and professionally.

Have I wandered too far off topic?  I don’t think so.  These things I’ve talked about have played a key role in the whole startup experience.

You have to be prepared for the stress, have the ability to set your own tone and goals, and get your team on board.


You are going to have players of every stripe and you need a mix.


What if you had a pro team of all the best stars? Well, without the members who do the mundane and ordinary, your “stars” will find it challenging to achieve their high level of performance. You need the routine, mundane, day-to-day operations covered before you let your “stars” loose on the world.


Holding your phone in speaker mode is not “Hands Free” and other fun stuff…

You’ve seen them, right?

Drivers jetting down the road…

Holding a phone but not to their ear…

Talking away…

In English, as I understand it, hands free means…

Without Hands

Hmmmmm.

Is there any question?

Can there be any doubt?

I even overheard a conversation where…

Someone got a ticket…

Was all miffed for being cited…

A scary call if you got money at all…

We are all so wired in…

Even my bank…

Sometimes I might forget to send them a note…

I’m out of the country, yes, those charges are ok…

But sometimes…you get the call..

Maybe by shopping at some national retailer…

You know the one…

Next thing you know, you have a replacement card…

It’s in the mail…

Oh… Do they still accept cash?

They?  Why anyone… gas stations…

Grocery Stores…

The local establishment…

Fortunately… that’s still a yes…

Until Skynet strikes!

Your cash has been terminated…

Active Listening


It’s something we all should do, but it seems to rarely happen. What is it? While I’m not endorsing the site, MindTools does have a pretty good description. Wikipedia also has a page on it (see Active Listening). I did like the warning at the top of the Wikipedia page about weasel words; perhaps that falls in line with doublespeak. Whatever the definition, the point is that active listening is rarely practiced.


When Speaking


OK, so now you know a little bit about active listening; heck, maybe you already practice it in principle. When I’m talking with someone, I go into the conversation expecting that they will not be an active listener. Since I can’t make someone listen, I need to adjust my message so that the important parts come across. This is especially challenging when the message is not going to be accepted easily by the listener. For example, when giving constructive criticism to an employee, I expect them to be practicing Active Thinking. Funny, right? The act of hearing only a few words of your topic and then immediately forming a rebuttal. It’s during the rebuttal phase that practicing active listening becomes a challenge, and it’s during this process that I put things in black and white. I know they are not going to listen fully, so backing it up in writing reduces the chance of ambiguity. Is this foolproof? No, but I’ve found it significantly more successful than just “saying it”.

Perfection vs Perception


OK, perfection is a fictional concept that can never be attained. So why am I even bringing it up? Key to my practice of active listening is accepting that I don’t have all the answers. Also, my point of view is a matter of my perception. I accept that the other person already has a different perception, and listening to their contribution to the conversation helps me form a better idea of where they are coming from. This is a process that will never be perfect. My goal with every conversation is to gain a little more insight into their perception. This helps improve future messages and their chance of getting through.


Sound bites just a political thing?


Or maybe just the news? Sound bites are used because they are effective. Short, simple statements with intentional pauses have been shown to increase comprehension. I’ve used this technique when dealing with particularly difficult people. Me? I’m difficult? To some people, I’m sure I am. My goal is to get my message across at the office; it’s not a social hour. When in a social situation, I’m not worried about clear and complete understanding. As long as a good time was had by all, I call that a winner. By the way, I’ve found that using sound bites for humor can be pretty fun too.

So while this short blog may not be overwhelmingly educational, I enjoyed writing it and sharing some of my thoughts on active listening. Writing this is my way of helping myself remember to use active listening, especially during stressful situations.