The algorithm that discriminates against riders: the Privacy Guarantor's fine reaches Glovo
Who among us can say they have never used a home delivery service for food, drinks or any other kind of product? The home delivery platform phenomenon, widespread for years now, owes its success to those who make it all possible in practice: the riders.
Riders have been fighting for years for the recognition of dignified working conditions in line with the sector's regulatory provisions, and they now face yet another setback to their rights, this time in terms of privacy. On this front comes the first significant blow to one of the major players in the delivery sector, the GlovoApp23 corporate group, and specifically its company Foodinho S.r.l., delivered by the National Data Protection Authority.
Given the peculiar mechanisms through which riders' work is managed, the implications for the protection of personal data are innumerable. The large companies offering these services operate exclusively through digital platforms, which inevitably "move" an indefinite amount of information and therefore of personal data. Note that we are not talking about social networks or the cases already analysed concerning TikTok or Clubhouse: here the law of the algorithm comes into play, and consequently the possible harm caused by the algorithm, a topic already addressed on numerous occasions.
Where does the case come from?
It all started in 2019, when the Italian Data Protection Authority began a complex inspection into Foodinho Srl's actual compliance with personal data protection rules concerning its workers' data, in particular that of the riders. The long investigation, carried out in cooperation between the Italian and Spanish authorities (the AEPD), revealed that the infringements of riders' privacy rights and freedoms are numerous and in some cases very serious. They range from basic elements, such as the minimum content of a privacy notice, which should make the data subject aware of every detail of the processing of his or her personal data, to the total omission of the safeguards required for very delicate processing activities, such as profiling.
Geolocation and profiling: the riders' enemies
One of the most complex issues certainly concerns the geolocation of riders, connected to profiling and the consequent use of automated decision-making mechanisms. Geolocation, used to manage the distribution of orders and deliveries more efficiently, actually hides various pitfalls from a privacy standpoint. The tracking of riders' movements, the retention of the related data and their use are all points on which Foodinho proved seriously deficient in protecting workers' personal data. This data protection violation is also linked to the use of that information to determine the criteria on the basis of which a rider may gain access to certain deliveries or remain excluded from them.
The information obtained through geolocation is fed into a profiling process aimed at producing automated decisions through specific algorithms. It is precisely here that the discriminatory nature of the company's conduct towards the riders lies: they are forced to submit to an automatic and often unfair management system.
Although the rules in this area provide for a general prohibition on subjecting a data subject to a decision based solely on automated processing, this case falls under one of the exceptions: the performance of a contract concluded between the data subject and the data controller. But in using automated systems and exploiting the information on riders' geolocation, did Foodinho really respect their privacy? Apparently not, since the investigations found that "it does not appear that the company has taken steps to implement appropriate measures to" protect "the rights, freedoms and legitimate interests of the data subject, at least the right to obtain human intervention [...], to express their opinion and contest the decision".
Indeed, what most interested the Guarantor, and what represents the most evident and dangerous violation for the subjects involved, is the use by Foodinho Srl's digital platform of automated decision-making mechanisms, based on complex algorithms whose functioning is still partially unknown.
The Guarantor's halt on the discriminatory use of algorithms
Have you ever wondered how it is decided which rider will deliver your dinner comfortably to your home? Proximity to the order's pick-up point, perhaps? No! The mechanism underlying this choice is based on a "system of excellence" which, through specific predetermined parameters, assigns each rider a score granting priority access to the "system for selecting the time slots established by the company, within which the orders received are distributed daily."
The criteria used as "filters" for the decision, which is then taken in a fully automated way, are five, including the score assigned by the customer for the delivery experience and the number of orders actually delivered by the rider. One of the factors that can modify the score is therefore the degree of customer/user satisfaction the rider obtains, and this is certainly an important factor to take into consideration.
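To make the mechanism concrete, here is a minimal sketch of how a score built from several parameters could gate access to time slots. This is purely illustrative: the real algorithm is, as the Guarantor notes, only partially known, and every parameter name and weight below is an assumption, not the platform's actual formula.

```python
# Hypothetical "excellence score": combines illustrative parameters
# (customer rating, orders delivered, peak hours worked) into one number.
# None of these weights reflect Foodinho's real, partially unknown system.
def excellence_score(customer_rating: float, orders_delivered: int,
                     peak_hours_worked: int) -> float:
    """Combine assumed parameters into a single 0-100 ranking score."""
    return (0.5 * customer_rating * 20          # 1-5 rating rescaled to 0-100
            + 0.3 * min(orders_delivered, 100)  # capped volume contribution
            + 0.2 * min(peak_hours_worked, 100))

# Riders with higher scores would pick delivery time slots first.
riders = {"A": excellence_score(4.8, 120, 40),
          "B": excellence_score(3.9, 60, 80)}
priority = sorted(riders, key=riders.get, reverse=True)
print(priority)  # higher-scoring rider chooses slots first
```

The point of the sketch is that a single opaque number, fed by parameters the rider cannot inspect, ends up determining access to work, which is exactly why the Guarantor demanded transparency and human oversight.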
What happens, however, if the algorithm that calculates a rider's rating takes into consideration only the negative feedback received, disregarding the positive? A rider who has made a high number of deliveries, in most cases receiving very positive reviews, could find himself discriminated against for a single negative comment. All this, moreover, rests on a system that not only profiles your personal data, but places a completely automated mechanism, beyond any human intervention, at the basis of a decision fundamental to the employment relationship!
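The imbalance described above can be shown with a toy comparison. Nothing here reflects Foodinho's real code; both scoring functions, the 1-to-5 rating scale and the penalty size are illustrative assumptions.

```python
# Toy comparison of two hypothetical rating schemes (illustrative only).
def score_negative_only(reviews: list[int]) -> float:
    """Penalize each negative review (rating < 3); positives are ignored."""
    penalties = sum(1 for r in reviews if r < 3)
    return max(0.0, 100.0 - 10.0 * penalties)

def score_balanced(reviews: list[int]) -> float:
    """Use the full distribution: average 1-5 rating rescaled to 0-100."""
    if not reviews:
        return 100.0
    return sum(reviews) / len(reviews) / 5.0 * 100.0

# A rider with 99 five-star reviews and a single one-star review:
reviews = [5] * 99 + [1]
print(score_negative_only(reviews))  # 90.0 — docked as if the positives never happened
print(score_balanced(reviews))       # ~99.2 — the positives still count
```

Under the negative-only scheme, one bad review costs the rider as much as it would cost someone with no positive history at all, which is the discriminatory effect the article describes.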
How to develop a GDPR-compliant app: privacy by design and by default
Are you a rider, or do you want to start this new work experience? Are you a company whose core business is home delivery carried out through riders? The experience of the GlovoApp23 corporate group makes it clear that paying little attention to the protection of workers' personal data, in this case riders', can lead to serious consequences: for the data subjects, whose rights and freedoms are infringed, and for those who commit the violations, who are exposed to heavy sanctions or corrective measures (the Guarantor has in fact imposed an administrative fine of 2.6 million euros on Foodinho Srl!).
If you have doubts about this or need personalised advice for your case, contact the partner firm FCLEX, specialised in the sector, which can offer you the support needed to avoid unpleasant consequences and carry out your business in a profitable, efficient and, at the same time, safe manner.