Are you ready for the GDPR?

By Adam | August 2, 2017 at 11:59 AM

 

The European Union's General Data Protection Regulation (GDPR) goes into effect on May 25, 2018. Are you ready for it?

The GDPR introduces new accountability obligations, stronger individual rights, and restrictions on international data flow. Any organization that handles data about European citizens, regardless of where that organization is located, is subject to the GDPR, which makes it effectively the first international standard on data protection.

But what does that mean for you and your company?

At the simplest level, if you are found to be in violation of the GDPR, you can be fined up to €20 million (almost $30 million CDN) or 4% of "group annual global turnover", whichever is greater. So a gross violation (e.g. not reporting a data breach within 72 hours) could result in a fine on the order of $30,000,000 or more.

What does the GDPR entail?

These are the points that you need to know:

  1. It applies to all -- If you work with the personal data of European citizens, it applies to you. The GDPR has also broadened the definition of "personal information" to include "anything that can be used to identify an individual". That includes genetic, mental, cultural, economic, and social information, as well as the usual data (address, name, birthdate, etc.).
  2. You must be able to prove consent -- It is not enough to simply state that you are collecting information; you need to specifically ask for permission and track when and how that permission was given (see the sketch after this list for what such a consent record might look like). You must also use plain and simple language to explain how and where the information will be used. Opt-outs are not allowed (everything must be opt-in). People must be able to decline the collection of personal information, and you may only refuse service if the data collection was integral to that service.
  3. You might need a Data Protection Officer (DPO) -- Public authorities that process personal information, and companies whose core activities require "regular and systematic monitoring of data subjects on a large scale" or consist of "processing on a large scale of special categories of data", need a DPO. In Europe alone, this means that almost 30,000 new DPOs will need to be hired or appointed in the next two years. If you already have a Chief Privacy Officer (CPO), you simply need to make sure they are following the GDPR as well.
  4. You may need to perform Privacy Impact Assessments (PIAs) -- If there are areas of your business that could expose personal information, you need to perform a PIA. In practice, this could mean a PIA is required before starting any new project that involves people's personal information.
  5. You must report data breaches within 72 hours -- If you are breached, you must notify your customers within 72 hours. But to accomplish this, you first need to know you have been breached, and many companies wouldn't, because they don't have the right technology and processes in place to detect it.
  6. People have the right to be forgotten -- The GDPR goes a step further here: you may only hold data for as long as you need it, and you cannot use it for purposes other than the ones it was collected for. If an individual asks to be removed from your system, you must do so, and it must be as easy to remove the data as it was to consent to its collection (for example, the account info page should have a delete account button). If you wish to use collected data for a purpose other than the one you collected it for, you must ask for fresh consent.
  7. If you are a data processor, you are also liable -- If your company processes data on behalf of another company (for example, a call centre doing market research for another firm), you also fall under the GDPR and need to implement the same procedures.
  8. Privacy by design is required -- Software must ensure proper and complete erasure of data. If you back up your database, then erase someone's personal data, and later have to restore from a backup that still contains it, you must make sure that personal data is removed again. Cases like this are going to be tricky and will require careful planning of both procedures and technology.
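
To make the consent requirement in point 2 a little more concrete, here is a minimal sketch in TypeScript of the kind of record you would keep to prove when and how permission was given, and to honour a later withdrawal. The names and fields are purely illustrative -- the GDPR does not prescribe any particular schema:

interface ConsentRecord {
    userId: string;
    purpose: string;        // plain-language description of what the data will be used for
    collectedVia: string;   // how consent was given, e.g. "signup form, unchecked opt-in box"
    givenAt: Date;          // when consent was given
    withdrawnAt?: Date;     // set when the individual later withdraws consent
}

// Record an explicit opt-in at the moment it happens.
function recordConsent(store: ConsentRecord[], userId: string, purpose: string, collectedVia: string): void {
    store.push({ userId, purpose, collectedVia, givenAt: new Date() });
}

// Withdrawing consent must be as easy as giving it; actually erasing the
// individual's data (the "right to be forgotten") would follow from this.
function withdrawConsent(store: ConsentRecord[], userId: string): void {
    for (const record of store) {
        if (record.userId === userId && !record.withdrawnAt) {
            record.withdrawnAt = new Date();
        }
    }
}

The important part is the audit trail: every opt-in is stored with its timestamp and the mechanism it was collected through, and withdrawal is a single call.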
There is a lot that must be done to be ready for the GDPR. If you have more questions about your rights and obligations under the GDPR, you should consult your lawyer. If you need help implementing the technological changes needed for compliance, we would love the opportunity to help your company protect both yourself and your customers' data.




The importance of being Small (AKA bundling and minification)

By Adam | July 14, 2017 at 2:50 PM

When developing a web application, you have to think about optimising the user experience, and one of the things to consider is the number and size of the files being downloaded.

If you have ever visited a web page where someone forgot to optimise the graphics and you end up downloading a 2MB background picture, you know how painfully slow that can be. But one thing many people forget is that most browsers can only download a small number of files at once, so the more files you have, the longer the site will take to load.

Here at Cognitive X Solutions, we use the Angular framework to develop web sites that approach the speed and ease of use of a desktop application, and we do it with Microsoft's TypeScript, which lets us manage our JavaScript code bases with a greater level of ease while helping us catch errors before they make it into production. Combined, though, we sometimes end up with hundreds of JavaScript files, which is simply not acceptable when the goal is a fast user experience.

So what we do is bundle and minify our JavaScript. Simply put, we combine all of those files into one or two larger files (that's bundling) and then strip out all comments, extra lines, and spaces, and apply optimisations to the JavaScript itself so that the resulting file(s) are much smaller (that's minification). It has the (un)fortunate side effect of making the JavaScript extremely hard to read.
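
As a purely illustrative example (the function below is made up), this is roughly what minification does to a small piece of TypeScript:

// Before: readable source, with comments and whitespace.
function calculateTotal(prices: number[]): number {
    // Add up every price in the cart
    return prices.reduce((total, price) => total + price, 0);
}

// After minification, a minifier emits something close to:
// function c(n){return n.reduce((t,r)=>t+r,0)}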

Originally we did this with a technology called Browserify, which analyses the interdependencies of our JavaScript files and produces a single output file with all dependencies satisfied in an optimised way. We combined it with another tool called Gulp, which let us automate the process: Gulp watched for file changes and reran Browserify whenever they happened. Gulp also handled compiling our CSS and inlining our HTML templates into a single file. Angular works by using templates, so without this inlining step Angular would still make calls to the server to load all those HTML snippets, even if we combined everything else into a single JavaScript file. With inlining, we load one larger file that contains all the snippets by name, so a single call retrieves every template, greatly improving the speed and responsiveness of our web applications.
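
To give a rough idea of what that inlining step produces (the module and template names below are made up), the generated file is essentially a series of $templateCache entries that Angular looks up by name instead of fetching from the server:

import * as angular from 'angular';

// Roughly what a template-inlining step generates: each HTML snippet is
// registered in Angular's $templateCache under the URL it would otherwise
// have been fetched from.
angular.module('app').run(['$templateCache', ($templateCache: angular.ITemplateCacheService) => {
    $templateCache.put('widgets/user-card.html', '<div class="user-card">{{ user.name }}</div>');
    $templateCache.put('widgets/user-list.html', '<ul><li ng-repeat="user in users">{{ user.name }}</li></ul>');
}]);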

More recently, a new technology called Webpack has come on the scene. Webpack is the official bundler of Angular 2 (we currently still use Angular 1.6+ as it has more libraries available for it; Angular 2 was a complete rewrite of the technology and its component libraries are still catching up), as well as of the React framework (a competitor to Angular). On one of our current projects, I decided to give Webpack a try, and I was quite impressed. It allowed me to accomplish what I had done with Gulp + Browserify in a much more succinct (and in some cases more powerful) way. The end result was that I combined all of our JavaScript, HTML, and CSS into a single JavaScript file. So with one file download, a whole web application springs to life. That is powerful. We will be switching our development practices over to this new technology so that our clients' websites get the same performance and responsiveness.
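
For the curious, here is a stripped-down sketch of the kind of Webpack configuration involved, written as a webpack.config.ts. The entry point, output names, and loader choices are illustrative rather than our exact setup, and it assumes the ts-loader, html-loader, style-loader, and css-loader packages are installed:

import * as path from 'path';
import * as webpack from 'webpack';

const config: webpack.Configuration = {
    // Start from the application's root module and follow every import from there
    entry: './src/app.ts',
    output: {
        // Everything ends up in a single bundled file
        path: path.resolve(__dirname, 'dist'),
        filename: 'app.bundle.js',
    },
    resolve: {
        extensions: ['.ts', '.js'],
    },
    module: {
        rules: [
            // Compile TypeScript as it is pulled into the bundle
            { test: /\.ts$/, loader: 'ts-loader' },
            // Inline Angular HTML templates as strings so no extra requests are made for them
            { test: /\.html$/, loader: 'html-loader' },
            // Pull the CSS into the same bundle
            { test: /\.css$/, use: ['style-loader', 'css-loader'] },
        ],
    },
};

export default config;

With a config along these lines, a single webpack run follows every import from the entry point and emits one bundle containing the compiled TypeScript, the inlined templates, and the CSS.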

If you are interested in more information about exactly how we are using Webpack, you can check out my personal blog here.

Posted in: Programming | Open Source




ASP.NET MVC 4 and Mocking ModelState

By Adam | August 23, 2012 at 8:37 AM

Actually, mocking isn’t quite the word (but we’ll talk about that in a moment).

At Cognitive X, we specialize in web development using ASP.NET MVC (currently V4), and we very much love Test Driven Development. So as part of our development process with MVC, we write tests for our controller actions. One of the problems we run into when testing controllers outside of the MVC framework is that they are not fully set up, and many of the pieces that allow a controller to function properly are missing. The biggest one is ModelState.

For example, in the following piece of code, ModelState.IsValid always evaluates to true when run outside the normal MVC pipeline, because the ModelState is actually empty:

if (ModelState.IsValid)
{
    // Perform action
}

So in order to properly test our controllers, we wrote a method that takes the model and sets up ModelState correctly (including running all validations on the model). This was quite easy to do with V3 of ASP.NET MVC, but with MVC 4 and the introduction of so many new providers (ModelMetadataProviders, ValueProviders, etc.), the old way no longer worked. I spent a morning chasing ModelStateDictionaries through the MVC code (thank goodness for ILSpy, it made my life so much easier), trying to figure out where it actually got filled out. What I eventually hit upon was the code hidden within Controller.TryValidateModel, which did exactly what I wanted. The only problem is that it's protected internal, so it is no good as a general support extension. So I pulled the code out, rewrote it a bit, and here it is:

// Requires System.Web.Mvc, System.Web.Routing, and Moq.
public static void SetupModel<T>(this Controller ctrl, T model)
{
    // Controllers created outside the MVC pipeline have no ControllerContext,
    // so stub one in with Moq if it is missing.
    if (ctrl.ControllerContext == null)
    {
        Mock<ControllerContext> ctx = new Mock<ControllerContext>();
        ctrl.ControllerContext = ctx.Object;
    }

    // Build the model metadata the same way MVC 4 does internally.
    DataAnnotationsModelMetadataProvider provider = new DataAnnotationsModelMetadataProvider();
    var metadataForType = provider.GetMetadataForType(() => model, typeof(T));

    // Copy the model's properties into a ViewDataDictionary, keyed by name.
    var temp = new ViewDataDictionary();
    foreach (var kv in new RouteValueDictionary(model))
    {
        temp.Add(kv.Key, kv.Value);
    }

    ctrl.ViewData = new ViewDataDictionary(temp) { ModelMetadata = metadataForType, Model = model };

    // Run every validator against the model (as Controller.TryValidateModel does)
    // and record each failure in ModelState. With an empty prefix, MVC's internal
    // CreateSubPropertyName helper simply returns the member name itself.
    foreach (ModelValidationResult current in ModelValidator.GetModelValidator(metadataForType, ctrl.ControllerContext).Validate(null))
    {
        ctrl.ViewData.ModelState.AddModelError(current.MemberName, current.Message);
    }
}

So like I said, mocking is not really the right word for it. What the method actually does is set up ModelState to contain the right validation information and model errors, so that controller actions behave as expected under test.

Posted in: Programming
