
Monday, 19 December 2016

Season's Greetings

 

Wishing you and your loved ones a joyous holiday season and a peaceful and prosperous New Year.

 

Best wishes

Antonio

 

Friday, 16 December 2016

Missed the conference? Learn how to pay down your #cybersecurity debt @PrivacyProf, @AntonioIerano & @MagdaChelly http://video.dataprivacyasia.com


Thursday, 15 December 2016

Solution selling: are you sure you know what it really means?

Torii Kiyohiro, Three Street Vendors Selling Goods for Autumn, c. 1750 (Photo credit: Wikipedia)

I hear a lot of people talking about “solution selling”; everyone keeps telling us they are moving in that direction, but do we really understand what “solution selling” means?

Solution selling does not mean we have to sell “a solution” in the sense of a complicated architecture or a set of interconnected boxes.

Solution selling is a sales methodology. Rather than just promoting an existing product, the salesperson focuses on the customer’s pain(s) and addresses the issue with his or her offerings (products and services). The resolution of the pain is the “solution”.

What does it mean to move from selling boxes to selling solutions?

In a “solution selling” approach, the key is to understand customer pain points and to be able to relate those pain points to your offering.

This should have been the standard selling approach in the IT market for the past 15 years (maybe more), and it should be common to two categories of vendors:

1) The big ones that want to scale and need recurring deals from their customer base

2) The small ones with unique, high-quality offerings, typically innovative startups.

A vendor that does not fall into either of these two categories basically does not need a solution selling approach.

You do not need solution selling if you are a pure box seller.

The real difference between solution selling and box selling is the proactive approach that is required for the first selling methodology.

While a box seller can go door to door offering products, putting minimal effort into convincing the customer and taking a short, immediate-term view, the solution seller first of all needs to build a relationship in order to know the customer; the outlook therefore cannot be immediate but medium-term, since building a relationship takes time.

So the advantages of box selling compared to solution selling are:

• Immediate revenue

• Minimum effort

But what is the solution selling advantage, if any? In short, the main advantages of a solution selling approach can be:

• Lower price pressure

• Recurring deals with the same customer

Lower price pressure:

The lower price pressure is mainly due to the fact that, in a solution selling approach, targeting the pain point raises the value of the products, solutions and services proposed, even in our consolidated technology market.
Of course, lower price pressure means higher margins, so it is understandable why so many IT/ICT vendors historically moved to solution selling.

Recurring Deals:

But better margins, per se, do not completely justify a solution selling approach. The most interesting aspect is the possibility of recurring deals, thanks to a better understanding of customer needs and relationships.

In the end, solution selling allows healthier growth, better margins, and better use of the customer base.

But solution selling comes at a price: the biggest skill required is the ability to understand the customer.

“Solution selling” in the Enterprise Market.

If “solution selling” requires identifying the customer’s pain(s), it means being able to understand the customer.
Understanding customers’ needs requires a different approach; it sounds silly, but the first step is to “listen to the customer and understand him or her”.

This requires being able to:

1) Understanding the business issue

2) Being able to relate it to the technical aspects of our offering.

The first point requires a business understanding that goes beyond the simple product. In order to solve a problem, you have to understand the problem. And to understand the problem you have to put yourself in your customer’s “shoes”.

The second point basically means having a technical approach that is not limited to the product specification, but covers how the product “lives” inside the customer environment.

Both points 1 and 2 require, usually, the involvement of 2 common sales roles:

The Customer Account Manager and the Pre-sales Engineer

Both roles are key in the solution selling approach because they are the engines that understand the issue, translate it into a technical offering and communicate the value to the customer. While the first is usually the owner of the relationship and the commercial interface, the second is the “translator” from business need to product, solution and service offering.

Keeping the two roles separated is usually a good thing, since a pre-sales engineer should not be seen as a “salesperson”, in order to preserve her or his technical credibility.

Things become more complicated, in an Enterprise environment, when we add to the equation the role of the channel.

How can a vendor add this value to its channel? Basically, this is done through two specific approaches:

Channel segmentation and channel education.

Since the approach to the customer through the channel is not direct, what is usually done is to provide the channel with shared resources that can fill whatever gaps it has in implementing a solution selling approach.

This is done basically through channel specialization (vertical, product, certification) and channel support through sales specialists and channel pre-sales engineers.

POC, or why should I trust you?

We have already seen that the solution selling approach requires a different attitude when approaching the customer; but a solution selling approach also means that customers will act differently with us. The most evident aspect of this change is the need for a Proof of Concept, or POC.

Basically, from the customer side, the point is:

“OK, I am buying value from you, which will solve my pain point. But I need to be sure, because I need this pain removed, so please demonstrate that:

1) You can actually solve my pain point

2) You will not generate more problems with the introduction of your offering”

This means, basically, that we have to prove that what we say is true, and usually this is done by example. This means:

1) References when available

2) Proof of Concept

Sometimes a proof of concept is just a demo, sometimes it is a test in a virtual environment, and sometimes (it has happened to me in the past) it is a test in a live, running production environment.

So we should be brave enough to accept the challenge and prove to our customers that we are trustworthy and can actually help them out. If we don’t, we risk losing our credibility and losing the customer, at least in the value-selling space.

From product marketing to solution marketing

One of the other consequences of the solution selling approach is the need for a different marketing approach.
While selling isolated boxes can keep the focus on the box itself even from a marketing perspective, a solution approach requires, more than anything else, building company credibility. In other words, if you want to offer a solution for a pain point, the customer needs to trust you on two fronts:

You are able to understand the pain,

and

You are able to solve the pain.

Those two aspects are not strictly product-related; it is therefore necessary to change the communication approach, moving toward a more “institutional” one.

This communication needs to target 2 different audiences:

1) Potential customers

2) Partners and resellers

This is why it is common to have two different but integrated communication plans.

Where are you on this journey?

If you are a box seller, no doubt about it: you have to start laying the groundwork to move from boxes to solutions.

It is interesting to note that the solution selling approach is not mutually exclusive with box selling; they are just two aspects of the selling activity of an IT/ICT vendor.

A focus on verticals will require, sooner or later, changing the generalist approach of the box seller to a more targeted one, where you start relying on qualified salespeople (with a deep understanding of specific verticals) and introduce a skilled pre-sales figure, one that will remain missing in action as long as you look only for inexperienced, young and cheap rookies.

At the same time, you will still have a lot to do in terms of your marketing approach and, I am afraid, in terms of people management.

But the good news is we have a lot of space for improvement.

Happy selling!

Wednesday, 14 December 2016

Firewall: Traditional, UTM and NGFW. Understanding the difference


One of the problems nowadays, when we talk about firewalls, is understanding what a firewall actually is and what the acronyms used for the different types of firewall mean.
The common definition today recognizes 3 main types of firewalls:

• Traditional firewalls
• UTM
• NGFW

But what are the differences (if any) between them?
Let’s start with the very basics: what a firewall is.

Diagram of a firewall between a LAN and a WAN (Photo credit: Wikipedia)

Firewall:

A firewall is a system used to maintain the security of a private network. Firewalls block unauthorized access to or from private networks and are often employed to prevent unauthorized Web users or illicit software from gaining access to private networks connected to the Internet. A firewall may be implemented in hardware, software, or a combination of both.
A firewall is recognized as the first line of defense in securing sensitive information. For better safety, the data can be encrypted.
Firewalls generally use two or more of the following methods:

• Packet Filtering: Firewalls filter packets that attempt to enter or leave a network and either accept or reject them depending on the predefined set of filter rules.

• Application Gateway: The application gateway technique employs security methods applied to certain applications, such as Telnet and File Transfer Protocol servers.

• Circuit-Level Gateway: A circuit-level gateway applies security mechanisms once a connection, such as a Transmission Control Protocol connection, is established and packets start to move.

• Proxy Servers: Proxy servers can mask real network addresses and intercept every message that enters or leaves a network.

• Stateful Inspection or Dynamic Packet Filtering: This method compares not just the header information, but also a packet’s most important inbound and outbound data parts. These are then compared to a trusted information database for characteristic matches, which determines whether the information is authorized to cross the firewall into the network.
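The first-match, default-deny logic behind packet filtering can be sketched in a few lines. This is a toy illustration only: the Packet structure and the rule set are invented for the example, while a real firewall operates on raw IP/TCP/UDP headers.

```python
# Toy packet filter: first matching rule wins, default deny.
# The Packet shape and the rules are invented for illustration.
from dataclasses import dataclass

@dataclass
class Packet:
    src_ip: str
    dst_ip: str
    dst_port: int
    protocol: str  # "tcp" or "udp"

# Each rule is (predicate, action); order matters.
RULES = [
    (lambda p: p.protocol == "tcp" and p.dst_port in (80, 443), "accept"),
    (lambda p: p.src_ip.startswith("10."), "accept"),  # trust the internal net
]

def filter_packet(packet: Packet) -> str:
    """Return 'accept' or 'drop' based on the first matching rule."""
    for predicate, action in RULES:
        if predicate(packet):
            return action
    return "drop"  # default deny: anything not explicitly allowed

print(filter_packet(Packet("203.0.113.5", "10.0.0.1", 443, "tcp")))  # accept
print(filter_packet(Packet("203.0.113.5", "10.0.0.1", 23, "tcp")))   # drop
```

A stateful firewall would additionally keep a connection table and allow return traffic for connections it has already seen.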

The limit of the firewall itself is that it works only at the protocol level (IP/TCP/UDP), with no knowledge of the higher-level risks that can cross the network.

From viruses to content filtering, there are hundreds of different technologies that can complement the firewall’s work in protecting our resources.

To address this more complex security environment, the firewall evolved into something new that covers different aspects beyond simple protocol inspection. These devices use different technologies to address different aspects of security in one single box: the so-called UTM (Unified Threat Management).

Unified Threat Management (UTM)

Unified threat management (UTM) refers to a specific kind of IT product that combines several key elements of network security to offer a comprehensive security package to buyers.

A unified threat management solution involves combining the utility of a firewall with other guards against unauthorized network traffic along with various filters and network maintenance tools, such as anti-virus programs.

The emergence of unified threat management is a relatively new phenomenon, because the various aspects that make up these products used to be sold separately. However, by selecting a UTM solution, businesses and organization can deal with just one vendor, which may be more efficient. Unified threat management solutions may also promote easier installation and updates for security systems, although others contend that a single point of access and security can be a liability in some cases.

UTMs are gaining momentum but still lack an understanding of context and users, and are therefore not the best fit for the new environments. To bridge those gaps, security researchers moved up the stack, from protocols to applications, where user behavior and context are key.

This evolution led from the UTM to the so-called Next-Generation Firewall, or NGFW.

Next-Generation Firewall (NGFW)

A next-generation firewall (NGFW) is a hardware- or software-based network security system that is able to detect and block sophisticated attacks by enforcing security policies at the application level, as well as at the port and protocol level.
Next-generation firewalls integrate three key assets: enterprise firewall capabilities, an intrusion prevention system (IPS) and application control. Like the introduction of stateful inspection in first-generation firewalls, NGFWs bring additional context to the firewall’s decision-making process by giving it the ability to understand the details of the Web application traffic passing through it and to take action to block traffic that might exploit vulnerabilities.

Next-generation firewalls combine the capabilities of traditional firewalls — including packet filtering, network address translation (NAT), URL blocking and virtual private networks (VPNs) — with Quality of Service (QoS) functionality and features not traditionally found in firewall products.

These include intrusion prevention, SSL and SSH inspection, deep-packet inspection and reputation-based malware detection as well as application awareness. The application-specific capabilities are meant to thwart the growing number of application attacks taking place on layers 4-7 of the OSI network stack.

The simple definition of application control is the ability to detect an application based on the application’s content vs. the traditional layer 4 protocol. Since many application providers are moving to a Web-based delivery model, the ability to detect an application based on the content is important while working only at protocol level is almost worthless.

Yet in the market it is still not easy to understand what a UTM is and what an NGFW is.

UTM vs NGFW

Next-Generation Firewalls were defined by Gartner as firewalls with application control, user awareness and integrated intrusion prevention. So basically an NGFW is a firewall that moves from creating rules based on IP/port to creating rules based on user, application and other parameters.
The difference is, basically, the shift from the old TCP/IP protocol model to a new User/Application/Context one.
UTMs, on the other hand, are a mix of technologies addressing different security aspects, from antivirus to content filtering, from web security to email security, all on top of a firewall. Some of those technologies may need to be configured to recognize users, but they seldom deal with applications.
The problem in the market is that nowadays the traditional firewall no longer exists, even in the personal/home/SOHO space: most of them are UTM-based.
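The shift from IP/port rules to user/application rules can be made concrete with a small sketch. Everything here (group names, application labels, the two rule tables) is invented for illustration and does not follow any vendor’s rule syntax.

```python
# Hypothetical contrast between a legacy port-based rule table and an
# NGFW-style table keyed on user group and identified application.

# Legacy firewall: decisions depend only on protocol and destination port.
legacy_rules = {("tcp", 443): "allow"}

# NGFW: decisions depend on who the user is and what the traffic was
# classified as, regardless of the port it travels on.
ngfw_rules = {
    ("engineering", "github"): "allow",
    ("engineering", "bittorrent"): "deny",
    ("guest", "github"): "deny",
}

def legacy_decision(protocol: str, port: int) -> str:
    return legacy_rules.get((protocol, port), "deny")

def ngfw_decision(user_group: str, application: str) -> str:
    return ngfw_rules.get((user_group, application), "deny")

# BitTorrent tunnelled over port 443 looks like HTTPS to the legacy rule...
print(legacy_decision("tcp", 443))                 # allow
# ...while the NGFW distinguishes it by application identity.
print(ngfw_decision("engineering", "bittorrent"))  # deny
```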

NG-UTM

Most firewall vendors have moved from old firewalls to either UTM or NGFW offerings; in most cases NGFWs also offer UTM functions, while most UTMs have added NGFW application-control functions, creating de facto a new generation of products and changing the landscape with the introduction of the Next-Generation UTM.

UTM vendors and NGFW vendors keep fighting over which is the best solution in modern environments, but this is a marketing fight more than a technically sound discussion.

The real thing is that UTM and NGFW are becoming more and more the same thing.

NOTE it’s all about rules.

Why security devices become so comprehensive and try to unify such a lot of services? Management is the last piece of the puzzle. In two separate studies, one by Gartner and one by Verizon Data’s Risk Analysis team, it was shown that an overwhelmingly large percentage of security breaches were caused by simple configuration errors. Gartner says “More than 95% of firewall breaches are caused by firewall misconfigurations, not firewall flaws.” Verizon’s estimate is even higher, at 96%. Both agree that the vast majority of our customers’ security problems are caused by implementing security products that are too difficult to use. The answer? Put it all in one place and make it easy to manage. The best security in the world is USELESS unless you can manage it effectively.

Dear HR Manager: the story of a CV and its privacy (GDPR? Tell it to your sister)

I am a little worried, because my impression is that in Italy, despite one of the strictest privacy laws in Europe and the new constraints introduced, or about to be introduced, by the GDPR, the concept of privacy is highly underestimated.

The problem, of course, lies in the historic Italian underestimation of the impact of IT systems on production, decision-making and management processes.

In short: no interest, no understanding, no assessment. As a consequence, wrong behaviors are not corrected and, at the same time, new opportunities are not exploited, leaving us stuck at the starting line of new technologies, with all due respect to those (from Olivetti to Faggin, and we could also mention Marconi and Meucci) who once made Italy a platform for innovation.

Oh well.

Polemics aside, let’s try to understand, with an example so simple that even an HR office could follow it, what managing privacy and data protection actually means.

A new CV arrives: what have you understood about privacy and the GDPR?

Imagine that your HR office receives a CV from a potential candidate. Nothing strange, in times when job hunting is essential (I have sent out hundreds myself recently).

Imagine also that the CV arrives by email (quite common) and that, precisely because there are open positions, it gets circulated among a few potentially interested people. The hiring manager, for example.

Right now I am not interested in how the story of the human being behind that piece of paper will end, with his needs, aspirations and potential. I am interested precisely in that piece of paper, the virtual one.

As you can imagine, that piece of paper contains personal data, since it refers to an identifiable natural person.

OOPS: does that mean I have to process it in line with the law? Does the GDPR (whatever that is) get involved?

I am afraid so.

A CV, a CV, what to do with it?

In theory, then, assuming (but not conceding) that you are somehow interested in complying with the law, you should process this data accordingly.

I do not want to write a detailed dissertation on the GDPR here; let me just make a few trivial observations, just to help you avoid a hefty fine.

The CV in question will probably end up:

  • In several mailboxes
  • As a file in some personal and/or shared folder
  • Perhaps in a database, if you are big enough to store candidates’ CVs
  • Printed on some desk
  • ….

Now, since that piece of paper (virtual or not) contains personal data, and perhaps sensitive data (say, your last salary, your IBAN, your lover’s address…), you, the recipient, should have a management process in place that takes into account that this data must:

  • be stored securely
  • the data owner (which is not you: it is the person who wrote the CV) must be able to request its modification
  • the data owner (still not you) must be able to request its deletion
  • You should also be able to determine the lifetime of this data within your systems and the use you make of it.

Believe it or not, this requires defined processes covering the “life” of that thing which, I know, you are now starting to hate.

In short, you should know things like (all closely related):

How long do I keep this thing in my systems?

How do I back this data up?

How do I delete it?

It sounds easy, but do you really know what happens to the CVs you receive?

Have you defined a retention policy for this data?

Let me translate: do you have a standard rule defining how long you may keep this data? Months? Years? Decades? Forever? Why the @#?§ should I care?

OK, I know the last one is your current policy, but I am afraid it is not the answer that best fits our legislation.

How long I keep that object and the related data in house matters because:

  • As long as I keep the data, it must be managed, stored and protected according to the law
  • I am legally responsible for it
  • When I delete it, I must really delete it
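A retention policy ultimately has to become a rule a machine can enforce. Here is a minimal sketch of that idea; the 24-month window and the in-memory inventory are assumptions made up for the example.

```python
# Toy retention-policy check: flag CVs older than the retention window.
# The window length and the "inventory" are illustrative assumptions.
from datetime import date, timedelta

RETENTION = timedelta(days=730)  # roughly 24 months, set by your policy

cv_inventory = [
    {"candidate": "A", "received": date(2016, 11, 30)},
    {"candidate": "B", "received": date(2013, 5, 2)},
]

def expired(record: dict, today: date) -> bool:
    return today - record["received"] > RETENTION

today = date(2016, 12, 14)
to_delete = [r["candidate"] for r in cv_inventory if expired(r, today)]
print(to_delete)  # ['B']
```

Remember that the deletion this flags must then reach every copy: mailboxes, folders, databases and backups.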

Now, point one is already a sore point. It means you should know where this stuff lives in your systems. And it does not matter whether it is on paper or in electronic form….

The point is sore also because it forces you to use consistent techniques for protecting, backing up, recovering and accessing the data.

In short, let me see if I can explain it: if you store it in a database or put it in a folder, you must somehow guarantee that access is not granted to just anyone, even inside the company.

And if it has started circulating by email, I know it can be hard to keep it from going everywhere; so, perhaps, everyone in the company should know these things, not just the poor soul on duty who has to shoulder that privacy chore.

In short, tell whoever received it that it must be handled properly, perhaps deleting it when it is no longer needed; otherwise, how do you guarantee adequate protection and lifecycle management?

Then, of course, IT should also guarantee protection against external intrusions:

some call it cyber security,

others information security,

you call it “that technical stuff I don’t understand at all, but I have an antivirus I have not updated in six months because it slows down my computer”.

In theory you should also have adequate backup and recovery systems. Something called backup and recovery; maybe you heard about it the last time you lost all your data…

All this because if you do not, and you are unlucky enough to catch ransomware, or someone breaks into your systems and you end up in the papers because they published the photo of the lover that your candidate had put in the CV, the person who sent you that CV might start asking questions and hold you accountable for your actions. And you know the ugly part? According to the law, it is not all IT’s fault (notoriously the source of all evil)…

You hate this CV more and more, don’t you?

And it gets even more complicated, because yes: someone has to take care of the backups and test the restores from time to time.

Something the good Monguzzi never tires of reminding us, and that we regularly ignore :)

But let me add one more little piece. If you delete the data, it must really be deleted. That means deleting every copy present in the company:

  • emails
  • disks
  • databases
  • backups

Did you know that? No?

Well, now you do!

 

Privacy and data management

I know I am going against the grain by writing these things, and that you have far more important matters to think about. But if I wanted to, I could go on and talk to your marketing, your sales office, your purchasing office, whoever runs your website, and probably your IT manager too, who, if you mention the GDPR, tells you to get lost!

The point is that these things look complicated, but they really are not. It would be enough to understand what integrating data into business processes means, and to design those processes taking into account legal requirements, business needs and current technology.

Of course, it also means you cannot treat privacy as a nuisance that does not concern you, exactly as you should not do with IT.

Think about it; and if you avoid a fine, maybe you will even thank me, once you are done with the stream of insults I deserved for telling you these things. Bye

Tuesday, 13 December 2016

Pretty Good Privacy (PGP)


Pretty Good Privacy or PGP is a popular program used to encrypt and decrypt email over the Internet, as well as authenticate messages with digital signatures and encrypted stored files.
Previously available as freeware and now only available as a low-cost commercial version, PGP was once the most widely used privacy-ensuring program by individuals and is also used by many corporations. It was developed by Philip R. Zimmermann in 1991 and has become a de facto standard for email security.

How PGP works

Pretty Good Privacy uses a variation of the public key system. In this system, each user has an encryption key that is publicly known and a private key that is known only to that user. You encrypt a message you send to someone else using their public key. When they receive it, they decrypt it using their private key. Since encrypting an entire message can be time-consuming, PGP uses a faster encryption algorithm to encrypt the message and then uses the public key to encrypt the shorter key that was used to encrypt the entire message. Both the encrypted message and the short key are sent to the receiver who first uses the receiver’s private key to decrypt the short key and then uses that key to decrypt the message.
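The hybrid flow just described can be sketched end to end. This is a toy, not real PGP: the hash-based XOR keystream stands in for the fast symmetric cipher (it is not a real cipher), and the RSA numbers are textbook-sized and insecure on purpose.

```python
# Toy sketch of PGP-style hybrid encryption: a random session key
# encrypts the message; the recipient's public key encrypts only
# the short session key. Insecure toy parameters throughout.
import hashlib
import secrets

# Recipient's toy RSA key pair: n = 61 * 53, e public, d private.
n, e = 3233, 17
d = pow(e, -1, 3120)  # 3120 = (61 - 1) * (53 - 1)

def keystream(key: int, length: int) -> bytes:
    """Derive a deterministic byte stream from the session key."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(f"{key}:{counter}".encode()).digest()
        counter += 1
    return out[:length]

def xor(data: bytes, stream: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, stream))

# Sender: encrypt the message with the session key, wrap the key with RSA.
session_key = secrets.randbelow(n)
message = b"meet at noon"
ciphertext = xor(message, keystream(session_key, len(message)))
wrapped_key = pow(session_key, e, n)  # only the short key uses RSA

# Receiver: unwrap the session key with the private key, then decrypt.
recovered_key = pow(wrapped_key, d, n)
plaintext = xor(ciphertext, keystream(recovered_key, len(ciphertext)))
print(plaintext)  # b'meet at noon'
```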

PGP comes in two public key versions — Rivest-Shamir-Adleman (RSA) and Diffie-Hellman. The RSA version, for which PGP must pay a license fee to RSA, uses the IDEA algorithm to generate a short key for the entire message and RSA to encrypt the short key. The Diffie-Hellman version uses the CAST algorithm for the short key to encrypt the message and the Diffie-Hellman algorithm to encrypt the short key.
When sending digital signatures, PGP uses an efficient algorithm that generates a hash (a mathematical summary) from the user’s name and other signature information. This hash code is then encrypted with the sender’s private key. The receiver uses the sender’s public key to decrypt the hash code. If it matches the hash code sent as the digital signature for the message, the receiver is sure that the message has arrived securely from the stated sender. PGP’s RSA version uses the MD5 algorithm to generate the hash code. PGP’s Diffie-Hellman version uses the SHA-1 algorithm to generate the hash code.

Getting PGP

To use Pretty Good Privacy, download or purchase it and install it on your computer system. It typically contains a user interface that works with your customary email program. You may also need to register the public key that your PGP program gives you with a PGP public-key server so that people you exchange messages with will be able to find your public key.

PGP freeware is available for older versions of Windows, Mac, DOS, Unix and other operating systems. In 2010, Symantec Corp. acquired PGP Corp., which held the rights to the PGP code, and soon stopped offering a freeware version of the technology. The vendor currently offers PGP technology in a variety of its encryption products, such as Symantec Encryption Desktop, Symantec Desktop Email Encryption and Symantec Encryption Desktop Storage. Symantec also makes the Symantec Encryption Desktop source code available for peer review.
Though Symantec ended PGP freeware, there are other non-proprietary versions of the technology available. OpenPGP is an open standard based on PGP that is supported by the Internet Engineering Task Force (IETF). OpenPGP is used by several software vendors, including Coviant Software, which offers a free tool for OpenPGP encryption, and HushMail, which offers a Web-based encrypted email service powered by OpenPGP. In addition, the Free Software Foundation developed GNU Privacy Guard (GPG), an OpenPGP-compliant encryption software.

Where can you use PGP?

Pretty Good Privacy can be used to authenticate digital certificates and encrypt/decrypt texts, emails, files, directories and whole disk partitions. Symantec, for example, offers PGP-based products such as Symantec File Share Encryption for encrypting files shared across a network and Symantec Endpoint Encryption for full disk encryption on desktops, mobile devices and removable storage. When PGP technology is used for files and drives instead of messages, the Symantec products allow users to decrypt and re-encrypt data via a single sign-on.
Originally, the U.S. government restricted the exportation of PGP technology and even launched a criminal investigation against Zimmermann for putting the technology in the public domain (the investigation was later dropped). Network Associates Inc. (NAI) acquired Zimmermann’s company, PGP Inc., in 1997 and was able to legally publish the source code (NAI later sold the PGP assets and IP to ex-PGP developers that joined together to form PGP Corp. in 2002, which was acquired by Symantec in 2010).
Today, PGP-encrypted email can be exchanged with users outside the U.S. if you have the correct versions of PGP at both ends.
There are several versions of PGP in use. Add-ons can be purchased that allow backwards compatibility for newer RSA versions with older versions. However, the Diffie-Hellman and RSA versions of PGP do not work with each other since they use different algorithms. There are also a number of technology companies that have released tools or services supporting PGP. Google this year introduced an OpenPGP email encryption plug-in for Chrome, while Yahoo also began offering PGP encryption for its email service.

What is an asymmetric algorithm?

Asymmetric algorithms (public key algorithms) use different keys for encryption and decryption, and the decryption key cannot (practically) be derived from the encryption key. Asymmetric algorithms are important because they can be used for transmitting encryption keys or other data securely even when the parties have no opportunity to agree on a secret key in private.
Types of Asymmetric algorithms
Types of Asymmetric algorithms (public key algorithms):
• RSA
• Diffie-Hellman
• Digital Signature Algorithm (DSA)
• ElGamal
• ECDSA
• XTR

Asymmetric algorithms examples:

RSA Asymmetric algorithm
Rivest-Shamir-Adleman is the most commonly used asymmetric algorithm (public key algorithm). It can be used both for encryption and for digital signatures. The security of RSA is generally considered equivalent to factoring, although this has not been proved.
RSA computation occurs with integers modulo n = p * q, for two large secret primes p and q. To encrypt a message m, it is exponentiated with a small public exponent e, giving the ciphertext c = m^e (mod n). For decryption, the recipient computes the multiplicative inverse d = e^(-1) (mod (p-1)*(q-1)) (e must be selected suitably for this inverse to exist) and obtains c^d = m^(e*d) = m (mod n). The private key consists of n, p, q, e, d (where p and q can be omitted); the public key contains only n and e. The problem for the attacker is that computing d from e is assumed to be no easier than factoring n.
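The arithmetic above can be checked end to end with a toy example. The tiny primes below are for illustration only (real RSA keys use primes of 1024+ bits and padding schemes such as OAEP); the variable names mirror the notation in the text.

```python
# Toy RSA with tiny primes, mirroring the text's notation (illustrative only;
# real keys use 1024+ bit primes and padding such as OAEP).
p, q = 61, 53            # two "large" secret primes (tiny here)
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # (p-1)*(q-1) = 3120
e = 17                   # small public exponent, coprime with phi
d = pow(e, -1, phi)      # private exponent: inverse of e mod phi (Python 3.8+)

m = 65                   # message encoded as an integer < n
c = pow(m, e, n)         # encryption: c = m^e mod n
assert pow(c, d, n) == m # decryption: c^d = m^(e*d) = m (mod n)
```

The attacker sees only (n, e) and c; recovering d is believed to require factoring n into p and q.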
The key size should be greater than 1024 bits for a reasonable level of security; keys of, say, 2048 bits should provide security for decades.

Diffie-Hellman
Diffie-Hellman, invented in 1976, is the first asymmetric encryption algorithm; it uses discrete logarithms in a finite field and allows two users to establish a secret key over an insecure medium without any prior shared secrets.

Diffie-Hellman (DH) is a widely used key exchange algorithm. In many cryptographic protocols, two parties wish to begin communicating without initially possessing any common secret, and thus cannot use secret-key cryptosystems. The Diffie-Hellman key exchange protocol remedies this situation by allowing the construction of a common secret key over an insecure communication channel. It is based on a problem related to discrete logarithms, namely the Diffie-Hellman problem. This problem is considered hard, and in some instances it is as hard as the discrete logarithm problem.
The Diffie-Hellman protocol is generally considered to be secure when an appropriate mathematical group is used. In particular, the generator element used in the exponentiations should have a large period (i.e. order). Diffie-Hellman is usually implemented in software rather than in hardware.
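The exchange described above can be sketched in a few lines of Python. The group parameters (p = 23, g = 5) are the usual textbook toy values, far too small for real use, where standardized 2048-bit+ groups (e.g. the RFC 3526 MODP groups) are the norm.

```python
import secrets

# Toy Diffie-Hellman key exchange (illustrative only; real deployments use
# standardized 2048-bit+ groups such as the RFC 3526 MODP groups).
p, g = 23, 5                      # public modulus and generator (textbook-sized)

a = secrets.randbelow(p - 2) + 1  # Alice's secret exponent, never transmitted
b = secrets.randbelow(p - 2) + 1  # Bob's secret exponent, never transmitted

A = pow(g, a, p)                  # Alice sends g^a mod p over the insecure channel
B = pow(g, b, p)                  # Bob sends g^b mod p

# Each side combines its own secret with the other's public value and
# arrives at the same shared secret g^(a*b) mod p:
assert pow(B, a, p) == pow(A, b, p)
```

An eavesdropper sees only p, g, A and B; recovering the shared secret from these is the Diffie-Hellman problem mentioned above.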

Digital Signature Algorithm
Digital Signature Algorithm (DSA) is a United States federal government standard (FIPS) for digital signatures. It was proposed by the National Institute of Standards and Technology (NIST) in August 1991 for use in their Digital Signature Standard (DSS), specified in FIPS 186 [1] and adopted in 1993. A minor revision was issued in 1996 as FIPS 186-1 [2], and the standard was expanded further in 2000 as FIPS 186-2 [3]. DSA is similar to the ElGamal signature algorithm. It is fairly efficient, though not as efficient as RSA for signature verification. The standard defines DSS to use the SHA-1 hash function exclusively to compute message digests.
The main problem with DSA is the fixed subgroup size (the order of the generator element), which limits security to only around 80 bits. Hardware (side-channel) attacks can threaten some implementations of DSS. However, it is widely used and accepted as a good algorithm.

ElGamal
ElGamal is a public-key cipher: an asymmetric encryption algorithm for public-key cryptography based on the Diffie-Hellman key agreement. The ElGamal signature scheme is the predecessor of DSA.
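A toy sketch makes the Diffie-Hellman lineage concrete: ElGamal encrypts by blinding the message with a DH-style shared secret. The 23/5 group below is textbook-sized, for illustration only; a real system uses 2048-bit+ groups and proper message encoding.

```python
import secrets

# Toy ElGamal encryption over a tiny group (illustrative only).
p, g = 23, 5                       # public group parameters, as in Diffie-Hellman
x = secrets.randbelow(p - 2) + 1   # recipient's private key
y = pow(g, x, p)                   # recipient's public key: y = g^x mod p

m = 10                             # message as a group element (1 <= m < p)
k = secrets.randbelow(p - 2) + 1   # fresh ephemeral secret, new for every message
c1 = pow(g, k, p)                  # first ciphertext component: g^k mod p
c2 = (m * pow(y, k, p)) % p        # message blinded by the shared secret y^k = g^(x*k)

# Decryption: recompute the shared secret from c1 and divide it out.
s = pow(c1, x, p)                  # s = g^(k*x) mod p
assert (c2 * pow(s, -1, p)) % p == m
```

The (c1, c2) pair is exactly one half of a Diffie-Hellman exchange plus a masked message, which is why ElGamal is described as "based on" DH key agreement.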

ECDSA
Elliptic Curve DSA (ECDSA) is a variant of the Digital Signature Algorithm (DSA) which operates on elliptic curve groups. As with Elliptic Curve Cryptography in general, the bit size of the public key believed to be needed for ECDSA is about twice the size of the security level, in bits.

XTR
XTR is an algorithm for asymmetric encryption (public-key encryption). XTR is a novel method that makes use of traces to represent and calculate powers of elements of a subgroup of a finite field. It is based on the primitive underlying the very first public key cryptosystem, the Diffie-Hellman key agreement protocol.
From a security point of view, XTR relies on the difficulty of solving discrete-logarithm-related problems in the multiplicative group of a finite field. Some advantages of XTR are its fast key generation (much faster than RSA), small key sizes (much smaller than RSA, comparable with ECC for current security settings), and speed (overall comparable with ECC for current security settings).
Symmetric and asymmetric algorithms
Symmetric algorithms encrypt and decrypt with the same key; their main advantages are security and high speed. Asymmetric algorithms encrypt and decrypt with different keys: data is encrypted with a public key and decrypted with a private key. Asymmetric (public-key) algorithms need at least a 3,000-bit key to achieve the same level of security as a 128-bit symmetric algorithm, and they are much slower, so it is impractical to use them to encrypt large amounts of data. Generally, symmetric algorithms are much faster to execute on a computer than asymmetric ones. In practice they are often used together: a public-key algorithm encrypts a randomly generated encryption key, and that random key encrypts the actual message using a symmetric algorithm. This is sometimes called hybrid encryption.
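The hybrid pattern can be sketched in a few lines. Everything here is a toy stand-in: the RSA numbers (n = 3233, e = 17, d = 2753) are textbook-sized, and the SHA-256 XOR keystream merely plays the role of a fast symmetric cipher (a real system would use RSA-OAEP or DH to wrap an AES key). The point is the structure, not the security.

```python
import hashlib
import secrets

# Toy RSA key pair: public (n, e), private d (textbook values, not secure).
n, e, d = 3233, 17, 2753

def keystream_xor(key: int, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR data with a SHA-256-derived keystream."""
    out, counter = bytearray(), 0
    while len(out) < len(data):
        block = key.to_bytes(4, "big") + counter.to_bytes(4, "big")
        out += hashlib.sha256(block).digest()
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

message = b"hybrid encryption combines both worlds"

# Sender: the slow asymmetric step wraps a random session key;
# the fast symmetric step encrypts the bulk data.
session_key = secrets.randbelow(n)
wrapped = pow(session_key, e, n)                   # RSA-encrypt the session key
ciphertext = keystream_xor(session_key, message)   # symmetric bulk encryption

# Recipient: unwrap the session key with the private exponent, then decrypt.
recovered_key = pow(wrapped, d, n)
assert keystream_xor(recovered_key, ciphertext) == message
```

Only (wrapped, ciphertext) travels over the wire; the expensive public-key operation runs once per message, regardless of how large the bulk data is.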

Dataprivacyasia: Antonio Ieranò at Asia's premier data protection, privacy and cybersecurity conference. Watch videos

Missed @AntonioIerano at Asia's premier #dataprotection, #privacy and #cybersecurity conference? Watch videos

— Data Privacy Asia (@dataprivacyasia) December 10, 2016
from http://twitter.com/dataprivacyasia
http://twitter.com/dataprivacyasia/status/807542383512588288

 

martedì 6 dicembre 2016

Maslow hierarchy of needs applied to employee engagement

maslow1
source https://www.simplypsychology.org/maslow.html

Maslow's hierarchy of needs can be applied to employee engagement. It is an interesting exercise, since it helps us understand why we should correct our management style in order to retain talent and top performers inside the company.

The five levels of Maslow's hierarchy can be roughly translated into employee engagement levels, as shown in the image.

So let's try to understand the five levels.

This five-stage model can be divided into deficiency needs and growth needs. The first four levels are often referred to as deficiency needs, and the top level is known as growth or being needs.

Deficiency needs are said to motivate people when they are unmet, and the drive to fulfil them grows stronger the longer they are denied. For example, the longer a person goes without food, the hungrier they become.

One must satisfy lower-level deficit needs before progressing to higher-level growth needs. When a deficit need has been satisfied it goes away, and our activities become habitually directed towards meeting the next set of needs we have yet to satisfy; these then become our salient needs. Growth needs, however, continue to be felt and may even become stronger once they have been engaged. Once these growth needs have been reasonably satisfied, one may be able to reach the highest level, called self-actualization.

Any work environment follows more or less the same dynamics, since it is human-related; we can therefore transpose the basic human needs into terms of employee behavior. In company terms, this means trying to understand the employee's engagement level according to his/her satisfaction level.

maslow2

5 Survival

Starting bottom-up, the first level is "survival": the covering of immediate needs.

In this category we find the most disengaged employees: the ones that cannot fit into the company culture and management and, basically, stay there because of the money and because they have no other choice.

This kind of employee does not have any "attraction" or "affection" for the company; working there is just a matter of survival. Of course he/she will leave at the first possible chance, unless the barrier to mobility (cultural or economic) is too high.

It is interesting to notice that this attitude can be driven by two factors:

  • Cultural
  • Management driven

The cultural factor is present when there is no perception of the value of work, just a mere fulfillment of economic needs. On the other hand, a very bad management attitude can drive people to this level in a very short time, when expectations of value, respect, trust and ethics are not met by company management.

From a company point of view this is a very dangerous zone, since this kind of employee finds no reward or satisfaction in the job, nor can he/she see any possibility to raise his/her status, at least not in that environment.

While this may be tolerable for low-level operative roles, it is absolutely negative for higher company functions, or whenever "commitment" is required, as is obviously the case with knowledge workers.

4 Security

This is the common situation of "not engaged" people: people who have their own work ethic but cannot find the needed fulfillment in the company, and therefore value only the compensation.

This is usually linked to a poor management environment; as a matter of fact, people in this zone do not feel they are using their skills appropriately, and do not feel that the job, the management or the team they are part of is the right place to be.

Typically these employees offer better service to the company than the previous group, but their cultural need to find satisfaction in their job tends to make them look for new possibilities. This need is not necessarily matched by a higher level of compensation; often it is strictly related to the job conditions themselves.

Micromanagement and/or autocratic management styles are usually the cause of this disengagement. While there is no sense of belonging or affection for the company, the compensation level is enough to keep the employee as long as he/she does not find a more appealing working condition.

These first two levels are intrinsically demotivating and can affect working performance. While at the first level expectations cannot exceed the minimum required to do the job, at the second level results can be better, thanks to the possibility of gaining more "money" as compensation for the low job esteem.

3 Belonging

According to Maslow's model, above the two basic levels that cover basic needs we find the psychological needs.

This basically means that once the minimum level of security has been reached, people tend to satisfy needs that would otherwise be somehow out of reach, such as love and friendship.

In business terms this can be expressed as a sense of belonging to the company, and therefore active engagement.

The next three levels describe a situation where employees find their satisfaction inside the company and try to fulfill further needs within the company itself.

This is the main difference from the first two stages: in the first two, any upgrade or fulfillment of higher desires is seen as achievable only outside the company, with low or neutral engagement as a consequence. In the next three, the perception is that the satisfaction of higher needs can be found inside the company itself; this, coupled with the natural human desire to improve, can bring high value in terms of quality and willingness to succeed.

In other terms, people are motivated.

In the first of the three psychological stages, employees feel a sense of belonging, but their needs are not completely satisfied. Together with the sense of pride there is also a disenchanted look at the market, because there could be a better position elsewhere.

The good part comes from the management and the quality of the work, while the main obstacle in this case is mainly the possible career path.

In the absence of a clear career path, the natural need to raise the satisfaction level can lead the employee to look elsewhere.

While this is usually a passive openness to move, if career paths are closed the employee can feel a sense of betrayal and shift his/her perception down to level 4. The most appealing job offer would, in this case, have to include a sizeable paycheck raise and a better, more prestigious position.

Work performance is, in any case, usually very good, since the employee feels rewarded by the job and the surrounding environment.

2 Importance

Right above the belonging level we find what Maslow defines as esteem needs.

In business this is usually achieved when employees feel they are an important part of the organization and are rewarded and recognized. The feeling of "being able to make the difference" and the perception that a growth path is possible make this kind of employee highly engaged, motivated and a motivator for colleagues.

The point is that the perception of being part of that team, that environment, that company is rewarding by itself and makes the employee proud. Usually people in this state would move only for a "lifetime offer" and are not tempted by a small pay rise. The difference has to be "important" both in terms of money and role.

1 Self-actualization

This is when employees find their own meaning inside the company. The satisfaction level is the highest, since the perception is that the company fulfills all needs, economic and, most importantly, psychological.

This means the employee feels part of the group, is proud, feels he/she can make the difference, and even his/her creative side is stimulated.

At this stage it is very unlikely that an employee wants to leave, and his/her commitment to the company and the job is at its greatest.

Clearly this is quite a hard status to achieve.

Why Maslow's hierarchy of needs matters

We can ask ourselves: what does all this have to do with work? Well, knowing what can motivate or demotivate someone can heavily influence his/her work performance as well as his/her retention.

HR people looking to hire, and managers wanting to increase team performance, can find in this model a "simple" way to understand what to do or what to offer.

One of the interesting parts of this approach is that, according to Maslow's theory, the natural drive to fulfill higher levels of needs can be used inside an organization to promote the sense of belonging and the overall performance in terms of results and quality. Rewarding and fulfilling needs pays off.

But a rewarding approach is completely different from the punitive approach common in many bad management practices.

It is easy to understand how bad people-management practices push people to the lowest levels of the hierarchy (5 and 4) and heavily affect the quality of the job done.

Reaching level 3, however, requires a big commitment in terms of human resources and management effort. The payoff is usually enough to justify the effort, or at least this seems to be the approach of modern high-tech companies, where commitment and dedication need to go hand in hand with risk taking, creativity and high skills.

The lower a company's need for quality, flexibility and creativity, the lower its need to satisfy the levels above level 4.

But to be clear: staying below level 3 in a tech company exposes the company itself to lower productivity and lower retention right where the most talented resources, in terms of skills and motivation, are needed.

About Maslow’s hierarchy of needs

Maslow’s hierarchy of needs is a theory in psychology proposed by Abraham Maslow in his 1943 paper “A Theory of Human Motivation” in Psychological Review. Maslow subsequently extended the idea to include his observations of humans’ innate curiosity. His theories parallel many other theories of human developmental psychology, some of which focus on describing the stages of growth in humans. Maslow used the terms “physiological”, “safety”, “belongingness” and “love”, “esteem”, “self-actualization”, and “self-transcendence” to describe the pattern that human motivations generally move through.

Maslow studied what he called exemplary people, such as Albert Einstein, Jane Addams, Eleanor Roosevelt, and Frederick Douglass, rather than mentally ill or neurotic people, writing that "the study of crippled, stunted, immature, and unhealthy specimens can yield only a cripple psychology and a cripple philosophy." Maslow studied the healthiest 1% of the college student population.

Maslow’s theory was fully expressed in his 1954 book Motivation and Personality. The hierarchy remains a very popular framework in sociology research, management training and secondary and higher psychology instruction.

lunedì 5 dicembre 2016

Micromanagement vs. experts: the original sin in organizations

I was reading an interesting article about micromanagement (Why Is Micromanagement So Infectious?) that prompted me to write again about management issues.

My interest is in the implications of a micromanagement attitude on a team, with a focus on managing experts.

In business management, micromanagement is a management style whereby a manager closely observes or controls the work of subordinates or employees. Micromanagement generally has a negative connotation.

Classic symptoms of micromanagement are the lack of delegation; the imposition of company rules regardless of their effectiveness or fairness; the failure to contextualize tasks and goals; the focus on minor details or procedural trivia; the so-called "reportmania"; the continuous requests for references to prove even the most obvious statement; and so on.

Micromanagement was born in environments where static, well-defined sequences of activities were all that was asked of employees to perform their duties. While micromanagement can find its rationale in old production environments, in the modern world business management has developed different techniques to address a different kind of employee: the knowledge worker.

The reason behind this shift of focus is that employees no longer face a static production environment; they need to adapt quickly and take ownership of decisions in the very short term. This requires a different set of expertise and skills, which forced business management theory to introduce the concept of the "knowledge worker".

Knowledge workers

A knowledge worker is, using the Wikipedia definition:

Knowledge workers are workers whose main capital is knowledge. Examples include software engineers, physicians, pharmacists, architects, engineers, scientists, public accountants, lawyers, and academics, whose job is to “think for a living”
These kinds of workers are required to bring the deeper and wider knowledge needed to address today's complex and evolving environment, where static rules and approaches would be less effective. The value of a knowledge worker is his/her knowledge, which should be used to address new and unknown problems, optimize existing processes, open new markets and so on.
In an environment where knowledge work is fundamental for the organization's survival, micromanagement is the classic portrait of a bad manager. Why is this?
Because a manager in a knowledge-working environment should manage resources by giving them the autonomy, trust and resources to accomplish their assigned goals; otherwise he/she would not be managing knowledge workers effectively.
This is a serious issue in every company that makes innovation and technology its reason for being, since micromanagement does not go well with creativity, which is a mandatory requirement for innovation.

Why this is bad in high-tech environment?

The idea behind micromanagement rests on two nefarious assumptions:
1) The employee cannot be trusted.
2) The manager knows better how to do the job.

Let's see in detail what those assumptions mean:

Trust issue:

Trust is a bidirectional relationship, like respect or, outside the working realm, friendship.

Not giving trust means not receiving trust. This affects, basically, the whole environment.

In a team environment, which is the basic requirement to justify the need for a manager, the consequence of a lack of trust is that any real collaboration between team members that is not strictly imposed or previously codified collapses on start.
The resulting dynamics affect flexibility and creativity, which is deadly in a complex, ever-changing environment like ICT.

Another problem with the trust issue associated with micromanagement is that without delegation there is no assumption of responsibility, and therefore a tendency to avoid any risk.

While risk minimization can seem to be a good thing, the problem is that not taking any risk means not doing anything different or new. This is the quickest way to block growth and evolution, which are essential for an organization to survive.

In a micromanaged environment, subordinates avoid taking responsibility and risks because the management attitude does not prize these as values. This attitude runs through the entire control chain or hierarchy, typically shifting blame towards the lower levels which, on the other hand, have no way to change things due to micromanagement's constraints.

Moreover, from an ethical point of view, it is worth asking what makes a manager more trustworthy than one of his/her reports. Considering knowledge workers in particular, in most cases we are talking about seasoned professionals who have provided their services in several environments; strong ethics and commitment are necessary for that kind of activity and, by the way, in the last 30 years more and more studies have shown that the assumption that managers work for the sake of the company's greater good does not match reality.

Knowledge issue:

In older production environments, most knowledge was related to experience gained doing a specific manual task; the classic example has historically been the introduction of assembly or production lines. In those environments the need for team management was less strict: since every member had a predetermined set of actions and defined skills, while the whole decision process was delegated to the upper layer, micromanagement was acceptable behavior and was, to a certain extent, the way to transmit knowledge to new employees.

In this scenario it was natural to assign middle-management functions to employees based on experience gained inside the company, since the company and its product or assembly line were the only given reference.

While the assumption that the manager is more knowledgeable than his/her reports can be true in a non-knowledge-worker environment, a knowledge-worker environment by its nature requires a breadth of skills that cannot be collected in a single source.

The reason is basically connected to the two dimensions of knowledge: breadth and depth.

Micromanagement is not possible if the depth or breadth of the knowledge required exceeds the manager's knowledge, which is a common situation. As a result, micromanagement shifts its focus to trivial aspects not strictly related to the goal.
The whole point of expertise is to fill a gap for the organization; if the manager were able to fill this gap, knowledge workers would not be necessary.

Probably the best quote against the micromanagement attitude comes from a famous statement by former Apple CEO Steve Jobs:

“…it doesn’t make sense to hire smart people and tell them what to do; we hire smart people so they can tell us what to do.”

This exemplifies in an excellent way why micromanagement is not a good idea when dealing with knowledge workers.
Management is a complicated issue

Micromanagement is not the only portrait of a bad manager, just as lack of delegation is not the only portrait of a micromanager. But a micromanager is certainly a bad manager, while not being a micromanager does not automatically mean you are a good manager.
Alas, in the absence of micromanagement as the management style, the manager has to find a way to manage, motivate, reward, help, support, and give goals to his/her team members.

The most complicated part is that in a modern knowledge-working environment some or even all of the team members can have higher seniority, in terms of knowledge, age and experience, than the direct manager, which makes micromanagement, as well as other common bad-management practices, not only impractical but even counterproductive.

When a company hires expertise, it is hiring a knowledge worker; this means it must adopt the corresponding management style.

A correct management style means starting to work on goals and targets (forget the damn KPIs for once and start thinking as a professional), jointly defining the requirements (which means the level of autonomy, the delegated authority needed, the sponsorship, the credentials with other groups and so on), and setting up the operative environment correctly.

If this is done, micromanagement is absolute nonsense; if this is not done, using experts is absolute nonsense.

Industry 4.0. a cultural revolution before a technology one


by Antonio Ieranò


We are used to dealing with expressions consisting of a name and a dotted version number whose second digit is a zero: 2.0, 3.0, 4.0 and so on.
Read in ascending order, the figures suggest an evolution, a transition to a more advanced (or up-to-date) version of a given situation or object. Among the first to establish itself, and the best known even outside expert circles, is definitely "web 2.0". It is a fascinating phenomenon from an ideal point of view: it created a cultural movement that changed our world heavily and triggered a lot of discussion about the future of our society, but from a technological point of view it is basically empty, devoid of content.

What Web 2.0 brought as an extraordinary novelty was the change of approach to the internet: the shift from a system in which only a limited number of content providers produced and delivered content, to a mode that, on the contrary, provided for and favored the emergence of a growing community of users, each of them able not only to produce content but also to share it, to put it on the network.

In a sense, Industry 4.0 is no different from the aforementioned Web 2.0: more than a technological revolution (the digital age is certainly not a novelty of the last few years), we must talk about a new attitude, a fresh approach to how industry is done, how to produce.
It is an attitude strongly linked to issues of roles and processes rather than technology, which involves technical staff much less, and key company figures, such as the Chief Financial Officer or the Chief Executive Officer, much more: the roles that in the enterprise ecosystem outline strategies and take decisions, choosing one direction and not another.
Working in a company that provides the backbone for Industry 4.0, i.e. ICT devices and the related tools, I am firmly convinced that it is important for a company to have a project. Any ICT or software implementation without serious and structured ideas behind it is absolutely useless, if not harmful.

That's why Industry 4.0 is primarily the need, or the ability, to define within the company, whatever its vertical or size, whatever the economic impact, a new path for resource management. And this means the management and integration of every resource, from energy to production to ICT, and so on.

Industry 4.0 is a great idea thanks to which all the objects and subjects that are part of a company stop being isolated entities and become connected, and not only as a matter of physical connection or communication, but as a real matter of processes and management.

In this sense, interconnection means that all objects and subjects, all "united-connected", must be able to work together to deliver a result.
Obviously, to work jointly and guarantee a result, properly working devices and tools (hardware and software) are needed: connectors for wiring, sensors for monitoring data, big data and data quality analysis systems, up to IT security systems.
But those elements, although important, are not decisive for a full result. What comes before the good functioning of those tools is the ability to integrate technology into processes and, at the same time, into the corporate culture. In other words, the company must be ready to leverage what the new technology can generate at its best, producing outputs that are functional and strategic to the enterprise's own needs.

This is not happening at the moment. One example above all: the amount of data that interconnected objects produce remains unused or underutilized due to poor analytical skills.

Industry 4.0 is revolutionary in being an element of rupture with respect to consolidated industrial models. And this is true both for large groups, where each intervention has greater repercussions (just think of interventions to improve energy efficiency), and for SMEs.

In Italy, in particular, it is important that small and medium enterprises equip themselves with the cultural tools to understand where to intervene to become, or stay, competitive in a drastically changed global landscape. This means knowing how to choose the solution that best suits your needs and the system that best fits your strategic growth plans.
The market offers no shortage of choices: legacy platforms, cloud services, consultants, outsourcing… Each choice has advantages and disadvantages: the important thing is that even in a small business there is someone with a broader, medium- to long-term view.

What will this transition to Industry 4.0 look like, then? Probably slow: a path made of small steps, for the cultural reasons cited above and for purely economic reasons, taking into account the considerable costs of adapting production environments to the new standards.

No doubt this is an inevitable path, and the sooner we start to think in a new way, the sooner we will recover global competitiveness as a country-system.

Industria 4.0. Rivoluzione culturale prima che tecnologica

Industria 4.0. Rivoluzione culturale prima che tecnologica

c5d0ea88-fca7-496b-90d3-f8fae042e105-large
Siamo ormai abituati ad avere a che fare con espressioni linguistiche costituite da un nome e due numeri puntati il cui secondo è uno zero: tipo 2.0, 3.0, 4.0 eccetera. Messe in ordine ascendente, le cifre dovrebbero suggerire un’evoluzione, un passaggio verso una versione più avanzata (o aggiornata) di una data situazione o di un certo oggetto.
Fra le prime ad imporsi e più note non solo fra gli addetti ai lavori c’è sicuramente “web 2.0”. Si tratta di un fenomeno affascinante dal punto di vista ideale, che ha fatto cultura, che ha dato l’avvio a molte discussioni sul futuro delle nostre società ma che da un punto di vista tecnologico è sostanzialmente vuoto, privo di contenuti. Ciò che il web 2.0 portava come straordinaria novità era il cambio di approccio all’uso della rete, con il passaggio da un sistema in cui solo un numero limitato di content provider produceva e forniva contenuti, ad un’altra modalità che, invece, prevedeva e favoriva la nascita di una comunità sempre più allargata di utenti, ognuno dei quali in grado non solo di produrre ma anche di condividere – o mettere in rete – questi contenuti.
In un certo senso, l’Industria 4.0 non è differente dal sopra citato web 2.0: più che di rivoluzione tecnologica – il digitale non è certamente una novità di questi ultimissimi anni – si deve parlare di nuovo atteggiamento o rinnovato approccio alle modalità di fare industria, di produrre. Un atteggiamento con forti legami a questioni di ruolo e di procedura che coinvolge molto meno il personale tecnico e molto più figure chiave in azienda come il direttore finanziario o l’amministratore delegato. Personaggi che nell’ecosistema aziendale delineano le strategie e prendono le decisioni, scegliendo una direzione piuttosto che un’altra.

Working in a company that provides the backbone of Industry 4.0, that is, IT and the tools needed to connect, I am firmly convinced of how important it is for a company to have a plan. Any software implementation without a serious, structured idea behind it is utterly useless, if not harmful.
That is why Industry 4.0 is first and foremost the need, or the ability, to define within the company, whatever it is and whatever its economic weight, a path toward a new way of managing resources. And here we mean the management and integration of all resources: energy, production, IT, and so on.
Industry 4.0 is a beautiful idea through which all the objects and all the people that are part of a business stop being isolated and become interconnected. And not merely as a physical or communication link, but as a genuine matter of process. In this sense, interconnection means that all the objects "joined" to one another must be able to work together to deliver a result.
Obviously, to operate jointly and to guarantee a result, you need devices and tools (hardware and software) that work properly: connectors for links, sensors for data monitoring, big data analytics and data quality systems, all the way to cybersecurity systems. These elements, however important, are not decisive in achieving a full result. What comes before the proper functioning of the tools is the ability to integrate technology into processes and, in turn, processes into the corporate culture. In other words, it means the company knows how to make the best use (that is, in a way that is functional and strategic to its own business) of what the new technologies can generate.
One example above all: the mass of data that interconnected objects produce remains unused or underused because of poor analytical capabilities.
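To make that point concrete, here is a minimal, purely illustrative sketch (the readings, the sensor, and the 80 °C limit are all hypothetical) of the kind of very basic analysis that already turns raw readings from a connected machine into actionable information, rather than leaving them unused:

```python
from statistics import mean

# Hypothetical raw temperature readings (in Celsius) collected
# from a connected machine's sensor over one monitoring interval.
readings = [71.2, 70.8, 71.5, 70.9, 88.4, 71.1, 71.3, 70.7, 89.0, 71.0]

# Assumed safe operating limit, chosen only for this illustration.
THRESHOLD = 80.0

avg = mean(readings)
overheating = [r for r in readings if r > THRESHOLD]

print(f"average: {avg:.2f} C")
print(f"readings above {THRESHOLD} C: {overheating}")
```

Even a trivial check like this, routinely applied, extracts value (early warning of overheating) that the raw, unanalyzed data stream does not provide on its own.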
Industry 4.0 is revolutionary in that it breaks with the established industrial model. And this holds both for large groups, where every intervention has greater repercussions (just think of energy-efficiency projects), and for SMEs.
In Italy, in particular, it is important that small and medium-sized enterprises equip themselves with the cultural tools to understand where to intervene in order to become, or remain, competitive in a rapidly changing global landscape. That means knowing how to choose both the solution best suited to their needs and the system that best fits their strategic growth plans. And there is no shortage of options: proprietary platforms, cloud services, consultant support, outsourcing. Each choice has advantages and disadvantages; what matters is that even in a small business there is someone with a broader, medium-to-long-term vision.
So what will this transition to Industry 4.0 look like? Probably slow, proceeding in small steps, both for the cultural reasons mentioned above and for more strictly economic ones, given the considerable cost of adapting production to the new standards.
No doubt it will be inevitable, and the sooner we start thinking in a new way, the sooner we will recover global competitiveness as a country.