Privacy has finally begun to engage people, at least somewhat. We need more dedication and a better understanding of what good privacy entails and what the lack of it can lead to. On one side we have those who struggle to implement the GDPR and often regard parts of it as unnecessary nonsense; on another side we have those who see it as an obstacle to getting hold of enough data to train their models; and there is even a side that considers the GDPR insufficient, even if it points in roughly the right direction. What exactly is the problem?
To solve a problem, the first thing to do is figure out what the problem actually is. Privacy is a complex and far-reaching subject that requires one to pause and lift one's gaze from whatever one is trying to solve on a day-to-day basis. The most obvious goal is perhaps to gain control over your own data, so that no information goes astray and so that you get a share of the value creation your own data leads to. That is important, but perhaps more important is that we as a civilization are about to maneuver ourselves into a state where we have no overview of how we are being influenced, and where we completely lack a digital refuge to which we can withdraw and be certain that no one is constantly looking over our shoulder. The opportunity to retreat to a refuge, to be "behind the scenes" where one can be oneself without having to maintain a social façade, is important both for a good life and for creating new ideas. That ability is diminished every time we invite surveillance mechanisms closer to us.
Civilization is, if not outright fragile, then at least a vulnerable construction. Much of what makes our civilization function so well are the rights we have granted to the people who make it up. These are rights that have been fought for throughout history. They have led to better lives for individuals, which in turn has led to growth and higher overall value creation. This is now threatened by market mechanisms that live off collecting all the behavioral data they can obtain and using it to sell predicted behavior and, by extension, steered actions. This is a serious problem both for democracy and for individual autonomy. Part of the challenge is that this happens in closed ecosystems in which we, who form the raw material through our actions, are not direct participants. What goes on is not merely hard to access; it is actively hidden from us. Who knows, who decides, and who decides who decides?
The companies that handle our behavioral data and its derivatives are becoming the new clergy. They preach about how beneficial their services are and how inevitable the direction of development is, and they keep the knowledge of what happens under the hood to themselves. As in earlier conquests, we willingly sell ourselves for glass beads. It is worth noting that those who argue that development is moving in an inevitable direction are also the ones with the most to gain from it moving in exactly that direction. Like tobacco producers, they exploit known human weaknesses to make money. Looking back, one can see how the boundaries of what is perceived as acceptable handling of personal data have been moved, slowly but deliberately. Everything these companies do is, rationally enough, aimed at reinforcing their own business model. All the services offered for free are created with an eye to collecting behavioral data about us, and service development moves in the direction of conquering new land by entering every part of our lives. The more they know about our preferences and about when we are receptive to a message, the better the product they can sell to their customers. Behavior they do not want, or cannot monitor, is something they lose money on, and the result is that we are manipulated into following their business model, which in effect leaves us with less free will.
The combination of having no refuge and being continuously manipulated is not just bad for the individual; it is also bad for business. Knowing that everything you do digitally is captured by others makes most people hold themselves back a little; you get a chilling effect. The effect of being manipulated into the self-reinforcing loop of their business model is even more negative: each individual realizes less of their potential value. The potential of the individual is, to an ever greater degree, determined by the capacity for innovation and creativity. In a world where physical work and routine tasks are handled better by machines, it is through innovation and our humanity that we have something to contribute. Businesses that buy into this business model help make it stronger and undermine themselves in the longer run. Companies that try to compete on the same business model are doomed to fall short and end up as failures or, if the owners are lucky, be assimilated by the big players. Unless the plan is to let a techno-elite control the world's population, with an ever greater share of people unemployed and kept like wild sheep, killing off innovation and the drive to create is irrational when viewed in a larger context. If you have ideas for how to create a good future for people, you have to make sure it does not end with an enlightened group taking control over everyone else. There is no historical evidence that such an arrangement works, and it is hard to imagine how it could succeed. It is a major problem that totalitarian regimes and surveillance capitalists are the ones driving technological development forward. There is little reason to believe that technological solutions built on artificial intelligence will treat people well when the people who develop the technology care about us mainly as batteries for their ambitions of "everything under the sky."
What would it mean if we had the right to decide over our own data? Not as in the GDPR, where you have certain rights regarding how companies handle your personal data, but a right grounded in us as human beings. Starting from the challenges to be solved, and getting technology to solve them rather than the other way around: what has to be put in place for this seemingly simple right to exist? What good would such a right do for humanity, and are there any negative consequences?
We can try to break the right down into statements that follow directly from it: one must be able to determine what one's own data can be used for, which data are used, where one's own data are used, and how one's own data are used. A consequence of this is that today's model, where personal data is scattered across a myriad of services, does not hold up: one needs a mechanism for managing one's data in a simple and unified way, otherwise the right drowns in fragmentation and incompatibility. If one is to be able to choose where one's data is used, the functionality for accessing the data must be available to everyone, everywhere. It is hard to see how this can be achieved if that functionality is owned by a single company or nation. Both companies and nations are reluctant to give competitors access to data (even though, under the GDPR, it is the individual's property regardless), and their priorities do not necessarily put protecting each user's data first. A solution run by an organization or consortium without a direct commercial interest in the data, built around a personal data warehouse that lets you carry your data wherever you want it, stands out as a good way to solve this.
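To make the idea of a personal data warehouse a little more concrete, here is a minimal sketch in Python of what its core could look like. None of this is an existing specification; the names (PersonalDataStore, Consent, grant, revoke and so on) and the structure are assumptions chosen purely to illustrate the four statements above: what the data may be used for, which data, where and how.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Dict, List, Optional

@dataclass
class Consent:
    """One grant: which service may use which data, for what purpose, until when."""
    service: str                         # the third-party service receiving access
    categories: List[str]                # which data (e.g. "contact", "location")
    purpose: str                         # what the data may be used for
    expires: Optional[datetime] = None   # how long the grant remains valid

class PersonalDataStore:
    """Minimal personal data warehouse: the user holds both the data and the consents."""

    def __init__(self) -> None:
        self._data: Dict[str, dict] = {}     # category -> the user's own data
        self._consents: List[Consent] = []   # everything the user has granted

    def put(self, category: str, payload: dict) -> None:
        self._data[category] = payload

    def grant(self, service: str, categories: List[str], purpose: str,
              valid_for_days: Optional[int] = None) -> Consent:
        expires = (datetime.utcnow() + timedelta(days=valid_for_days)
                   if valid_for_days else None)
        consent = Consent(service, categories, purpose, expires)
        self._consents.append(consent)
        return consent

    def revoke(self, service: str) -> None:
        """Cut a service off entirely; it keeps no standing access."""
        self._consents = [c for c in self._consents if c.service != service]

    def fetch_for(self, service: str, category: str) -> Optional[dict]:
        """A service only gets data covered by a live, unexpired consent."""
        now = datetime.utcnow()
        for c in self._consents:
            if (c.service == service and category in c.categories
                    and (c.expires is None or c.expires > now)):
                return self._data.get(category)
        return None

    def overview(self) -> List[Consent]:
        """The unified view: what has been given to whom, and for what purpose."""
        return list(self._consents)
```

The point is not these particular classes, but that consent becomes an explicit, inspectable object held on the user's side, so that withdrawing it is as simple as deleting it.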
Having a single place to manage makes it possible to keep an overview of what data one has given to whom. If you combine this with letting third-party services communicate with the user via temporary links that the user can sever unilaterally, you can both keep track of what data has been handed to whom and ensure that only services the user has granted access are allowed to send messages. If the user is no longer interested, the connection can be cut. Where two-sided agreements are desirable or necessary, these can also be handled.
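The temporary, severable links can be sketched in the same spirit. Again this is only an illustration under assumed names: each service is handed an opaque token instead of a durable address for the user, messages are accepted only while the token is live, and cutting the link is a one-sided operation on the user's side.

```python
import secrets
from typing import Dict, List, Tuple

class MessageGateway:
    """Relays messages from services to the user over revocable, temporary links."""

    def __init__(self) -> None:
        self._links: Dict[str, str] = {}          # token -> service that holds it
        self._inbox: List[Tuple[str, str]] = []   # (service, message) pairs

    def open_link(self, service: str) -> str:
        """Issue a fresh opaque token the service must use to reach the user."""
        token = secrets.token_urlsafe(16)
        self._links[token] = service
        return token

    def cut_link(self, service: str) -> None:
        """The user severs the connection; the service's tokens stop working immediately."""
        self._links = {t: s for t, s in self._links.items() if s != service}

    def deliver(self, token: str, message: str) -> bool:
        """Only messages arriving over a live link reach the inbox,
        and each one is labelled with the service behind the token."""
        service = self._links.get(token)
        if service is None:
            return False          # link was cut, expired, or never granted
        self._inbox.append((service, message))
        return True
```

Because every delivered message carries the identity of the service behind the token, the user can always see where a message came from, which is the traceability point made below.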
Giving users more control means that services wishing to communicate with a user must do so with the user's goodwill. Services will have to find other ways to get attention than carpet-bombing messages in the hope that some of them land, and they must keep treating users in a way that makes them want further contact. The use of temporary links also makes it far easier to see where a message comes from. The current imbalance in knowledge about what is going on can then be corrected.
Any negative effect of this will fall on companies whose business model relies on collecting data from, and influencing, users without their knowledge. The potential for value creation should be greater than under the current model, since usage is not chilled and more services can gain access to data if they treat the user with respect and ask properly. For producers of goods and services, this should make it possible to reach their customers more precisely and to cut the share of the price that goes to capturing the user's attention.
The protection this gives the user, and the ability to remain anonymous, make a digital refuge possible for every individual. This is the solution, or goal, we call HAVEN (Humanely Attuned Virtual Ecosystem Nexus), which for now is more a set of criteria than a functional platform. But it can be used as a tool to push the world in that direction, by building an evaluation framework, as objective as possible, that can be applied to services that process personal data.
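The criteria themselves are left open here, so the following is no more than a hypothetical sketch of what such an evaluation framework could look like as a weighted checklist; every criterion name and weight is an assumption made for the sake of the example, not part of HAVEN.

```python
# Hypothetical sketch: score a service against criteria derived from the rights
# discussed above. Criteria and weights are illustrative assumptions only.
CRITERIA = {
    "user_controls_purpose": 3,   # can the user decide what the data is used for?
    "user_controls_scope":   3,   # can the user decide which data is used, and where?
    "data_portable":         2,   # can the user take the data elsewhere?
    "links_revocable":       2,   # can the user sever contact unilaterally?
    "no_hidden_collection":  3,   # is all data collection visible to the user?
}

def haven_score(answers: dict) -> float:
    """Return a 0-1 score: the weighted share of criteria the service meets."""
    total = sum(CRITERIA.values())
    met = sum(w for name, w in CRITERIA.items() if answers.get(name, False))
    return met / total

# Example: a transparent, revocable service that does not yet offer portability.
print(haven_score({
    "user_controls_purpose": True,
    "user_controls_scope": True,
    "links_revocable": True,
    "no_hidden_collection": True,
}))  # -> 11/13, roughly 0.85
```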
If we do this right, we can make the digisphere a place that takes care of people and helps humanity reach new heights. But it requires us to stand together and use our combined strength to get there. To put it grandly: it is time we set aside petty quarrels and realize that we have a planet to run!