Platform Economies: Cultural, Political and Work Futures

On 6th June the Social and Cultural Geographies Research Group at Northumbria University hosted our annual lecture. This year's theme was platform capitalism, and I presented alongside Yujie Chen from the University of Leicester and Jeremy Gilbert from the University of East London.

My presentation was based on a paper I've just done corrections on for Information, Communication and Society. The slides and notes from yesterday are below.

Thankyous

For the last few years I’ve been interested in new forms of patronage facilitated through online platforms. With British Academy and Leverhulme funding, I examined the emergence of Patreon and Subbable, and their role in changing geographies of patronage networks.

These platforms - and newer ones such as Drip and Memberful - provided cultural producers with new streams of income. They are used heavily by artists and creators who use the web for various parts of their production and distribution process. These artists and creators had faced falling income as a result of adblockers reducing advertising revenue, piracy drastically affecting the way people consume media online, and changes to monetisation algorithms on sites like YouTube.

Crowd-patronage is akin to crowdfunding used by cultural producers but differs in a series of important ways:

…some will use Indiegogo or Kickstarter to fund discrete projects like recording an album or producing a film.

Compared to historical forms of patronage which were dominated by rich and powerful individuals, families or institutions, artists and creators in crowd-patronage networks will have dozens, if not hundreds or thousands of patrons paying between $1 and $5 a month.

[see here for a paper I wrote focusing on this in more detail]

Patreon have been my main case study in this research.

Established in 2013. Based in San Francisco. They’ve been through four rounds of venture capital investment, raising $106m.

There are about 100,000 artists using the platform, with 2 million patrons. When I say artists, I’m using it as a catch-all term to describe a whole host of activity, such as these…

Patreon sells itself as being ‘creator first’, valuing art and artists, and their creativity above everything else.

In 2017 they facilitated payments from patrons to artist-creators worth around $150m - that’s more than the US National Endowment for the Arts.

There are now almost 70 users who receive over $10,000 a month, and the top earner - Chapo Trap House - makes $97,000 a month. There are also a lot who don’t earn a great deal.

These are significant numbers, but there are other important developments in the way Patreon operates…

[NB - missed a word on this slide. It should be Understanding platforms through the 'stack' and interpenetration]

Based on this research, today I want to explore how crowd-patronage platforms - Patreon in particular - act as intermediaries in the cultural and creative economy. In addition, I want to highlight how connections with other platforms have a massive influence on intermediary processes and can change the very nature of art.

I’m going to start by talking a little bit about intermediaries, then introduce two concepts which are helpful for understanding platforms, before delving into how Patreon and connected companies operate.

Ever since Bourdieu coined the phrase new cultural intermediaries, academics have sought to understand the role of different occupations involved in the presentation and representation of symbolic goods and services. Bourdieu didn’t give a lot of detail about cultural intermediaries, and academics have done that thing we’re good at: filling in the gaps in various different ways and creating a rather confusing field.

The economies of qualities literature moves away from confusing occupational perspectives to focus on the network of agents influencing a consumer’s decisions to highlight those ‘which are invisible when the transaction is made, but without whom the attachment between the buyer and the then objectified and individualized new good could not have been tied’ (Musselin & Paradeise, 2005: n.p.).

These actors include Bourdieusian cultural intermediaries, those outside the cultural economy and even artists themselves taking on the functions of cultural intermediaries as a result of disintermediation (Kribs, 2016). Actors also include sociotechnical devices such as trading systems and protocols (Muniesa, Millo & Callon, 2007), pricing systems (Caliskan, 2007), communication technologies (Preda, 2006) and algorithms (McFall, 2014), which in the process of communicating information make quantitative and qualitative qualifications about products that influence purchasing decisions and consumption behaviours.

Widening the notion of who and what mediates products helps move away from work that, to quote McFall, “get[s] carried away with … symbolism, signification and taste-making at the expense of the more mundane work involved in market-making”. This work also opens up the possibility of cultural intermediation combining simultaneously with other forms of mediation - in a paper on crowd-patronage I wrote last year, I make the case that Patreon and similar platforms act simultaneously as cultural intermediaries, regulatory intermediaries and financial intermediaries.

Lots of work, no time to examine it now, but two concepts are particularly useful for my research.

Examining platforms at a more technical level, Choudary (2015) identifies a common platform architecture found across platform types that he calls the ‘platform stack’. It is made up of three layers: a network-marketplace-community layer consisting of users, their connections and activities; an infrastructure layer made up of the ‘tools, services and rules’ that enable platforms to function (ibid: 61); and a data layer for the collection and collation of data about and from users, and the nature of the connections between them. 

Layer thickness varies between platform types depending on their purpose. 

Really useful organising tool to help frame my work.
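Choudary’s three layers can be caricatured as a simple data structure. This is just my own toy rendering of the stack - the class, fields and ‘thickness’ weights are invented for illustration, not anything from Choudary:

```python
# Toy model of Choudary's 'platform stack': three layers whose relative
# 'thickness' varies by platform type. Weights are invented for illustration.

from dataclasses import dataclass

@dataclass
class PlatformStack:
    network: float         # users, their connections and activities
    infrastructure: float  # the 'tools, services and rules' enabling the platform
    data: float            # collection/collation of data about and from users

    def dominant_layer(self) -> str:
        """Return the 'thickest' layer, i.e. where the platform's emphasis lies."""
        layers = {"network": self.network,
                  "infrastructure": self.infrastructure,
                  "data": self.data}
        return max(layers, key=layers.get)

# e.g. an ad-driven social platform might be imagined as data-heavy
social_platform = PlatformStack(network=0.3, infrastructure=0.2, data=0.5)
print(social_platform.dominant_layer())
```

The point the sketch makes is purely the organising one: different platform types foreground different layers, and the stack gives a common frame for comparing them.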

For the internet to function there needs to be interoperability - the same protocols and coding languages need to be used so websites can be accessed through browsers and can be connected together.

van Dijck (2013) draws on this idea, conceptualizing what she calls interpenetration. In her examination of social media platforms, she explains how interpenetration is made possible through technical linkages but also shared operational logics. The former is facilitated by tools, such as application program interfaces (APIs) that allow users to integrate functions of or data from one platform into other sites (e.g. embedding a YouTube video into Reddit). APIs are complemented by other tools, such as templates and plugins that allow users to embed code and links from one platform service into another. Together these tools extend a platform’s reach into other platforms and perpetuate the interpenetration of online ecosystems.
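To make the technical side of this concrete, here is a minimal sketch of the kind of embed snippet that carries one platform into another. The iframe URL follows YouTube’s public embed format, but the helper function itself is hypothetical:

```python
# Sketch of technical interpenetration via an embed code. The iframe pattern
# follows YouTube's public embed format; the helper function is my own.

def youtube_embed_html(video_id: str, width: int = 560, height: int = 315) -> str:
    """Build the HTML snippet a third-party site pastes in to show a YouTube video."""
    return (
        f'<iframe width="{width}" height="{height}" '
        f'src="https://www.youtube.com/embed/{video_id}" '
        f'frameborder="0" allowfullscreen></iframe>'
    )

# Pasting this into a Reddit post or blog pulls YouTube's player (and its
# data collection) into the host page - the platform's reach extends outward.
snippet = youtube_embed_html("dQw4w9WgXcQ")
print(snippet)
```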

Payment services are a good example of this. If you buy something online, the likelihood is that the company you’re buying from won’t handle the transaction between you, your credit card provider and the seller’s bank. It will be one of these providers. These are platforms in themselves, which provide services to other platforms so the latter don’t need to worry about the complicated world of financial regulations.

These are examples of technical interpenetration, but van Dijck also highlights interpenetration through shared operational logics.

If people are going to use your platform, it would be helpful if it operated in similar ways to other websites. This is the case for people just using the network layer of your platform stack, and for other platform companies using your services through technical interpenetration. Platforms don’t need to operate in exactly the same way, but they should at least be aligned, have similar values and operate within the same web paradigm.

This results in particular trends emerging, and is why you see slightly different versions of the same functionality on different platforms: like buttons on Facebook, YouTube and Twitter, upvote buttons on Reddit and Imgur, share buttons on every news site, etc. These are indicators of interpenetration.

van Dijck argues interpenetration is an important concept because it allows us to appreciate platforms as part of an ecosystem. As she puts it…

Dissecting the cultural logic of intermediation in crowd-patronage networks, then, is the aim of the paper this presentation comes from.

What I want to do is delve into the stack, examine the intermediary functions which take place within different layers, and illustrate how interpenetration with other platforms influences these functions.

The first example is very simple - I go into more detail in the paper, but that isn’t necessary here.

Patreon and Subbable’s primary intermediary function is the facilitation of payments between patrons and artist-creators, but they don’t handle the transactions themselves. As part of the infrastructure layer, payment processing is done by a third-party platform, which charges a transaction fee.
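As a rough sketch of the fee chain this sets up, the split of a single pledge might look like the following. The rates are illustrative stand-ins of the kind commonly cited at the time (a platform cut plus a percentage-and-fixed card fee), not Patreon’s or any processor’s actual figures:

```python
# Sketch of the intermediary fee chain on a single pledge. The rates are
# illustrative, not authoritative: a platform cut plus a card-processing
# percentage and fixed fee, broadly the structure such services use.

def pledge_split(pledge: float,
                 platform_rate: float = 0.05,      # platform's cut (illustrative)
                 processing_rate: float = 0.029,   # processor's percentage fee
                 processing_fixed: float = 0.30):  # processor's fixed fee
    platform_fee = pledge * platform_rate
    processing_fee = pledge * processing_rate + processing_fixed
    creator_net = pledge - platform_fee - processing_fee
    return {
        "platform_fee": round(platform_fee, 2),
        "processing_fee": round(processing_fee, 2),
        "creator_net": round(creator_net, 2),
    }

split = pledge_split(5.00)  # a typical small monthly pledge
print(split)
```

Note how the fixed per-transaction fee bites hardest on the $1 pledges that dominate crowd-patronage - one reason the infrastructure layer matters so much to these networks.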

Stripe provides payment processing for Patreon… and these are just some of the companies that Stripe provides services for. This is part of the wider ecosystem van Dijck talks about, to which Patreon belongs through technical interpenetration with its payment provider. All these companies are connected through Stripe, and therefore share some alignment in their infrastructure layer with each other through the payment processor.

As well as acting as a financial intermediary, Patreon also acts as a regulatory intermediary. Much like many other content-driven platforms, such as Instagram and YouTube, they use terms of service and community guidelines to regulate content and to ensure compliance with the legal requirements of the country in which they operate. For Gillespie, these rules perform discursive work as well as helping to police content and behaviours, and ‘therefore reveal in oblique ways, how platforms see themselves as public arbiters of cultural value’ (Forthcoming: 14). A useful illustration of this is how Patreon and other platforms regulate nudity.

Signalling their commitment to creators, Patreon’s community guidelines state: 

DeviantArt’s terms are much more open to interpretation and potentially more restrictive:

Twitter, in contrast, takes a slightly different approach, allowing pornography and nudity to be uploaded. They state…

They put the emphasis on consumers to do the filtering rather than the producers, and offer options to hide sensitive materials.

These vignettes are useful when one considers the interpenetration of platforms through operational logics - platforms need to be aligned, have similar values and operate within the same web paradigm. Creators on Patreon frequently use these platforms for distribution and marketing purposes, so although Patreon’s guidelines are relatively broad and open to artistic expression, content linked from other platforms falls under different rules. 

Furthermore, where platforms rely on third parties for key processes – payments, for example – another set of guidelines and limitations becomes enrolled through technical interpenetration as the services connect to the infrastructure layer.

PayPal’s terms and conditions, for example, prohibit use of its service for ‘items that are considered obscene...[and] certain sexually oriented materials or services’.

The vagueness of this clause makes it open to interpretation, and in 2014 it resulted in PayPal stopping patrons from using its service to support artist-creators producing adult content. To stop users’ money from being frozen, Patreon had to change the URLs of all NSFW artist-creators and make their pages private, and patrons using PayPal to support these artist-creators had to switch to pledging with credit cards. Because of a clash of operational logics, Patreon had to comply with PayPal’s rules and regulations even though there wasn’t any technical interpenetration.

In 2016, however, PayPal’s decision was reversed after Patreon negotiated with PayPal and assured them:

Patreon was able to influence PayPal because it had grown during the intervening years. It was turning over more revenue, growing quickly, was at the time in its third round of venture capital investment, and had gained a reputation as the leading crowd-patronage platform.

Patreon proved it was able to act as a regulator of its own users’ content and could therefore act as a regulator for PayPal as well. In effect, Patreon’s role as a regulatory intermediary allowed the realignment of the operational logics of the two companies through the former’s terms of use.

Following van Dijck’s call to focus on connections between platforms, and given the importance of regulating and moderating comments, photos and videos uploaded to platforms, it is crucial that we don’t just focus on a single platform. We should consider the multi-sided interconnections between them. As this example illustrates, interpenetration through users can lead to interpenetration of regulation across platform ecosystems.

But as platforms become increasingly aligned, there are potential impacts on the different activities platforms are produced for. This is what the final example illustrates.

The final example I want to examine is curatorial intermediation and the transformations necessary for platforms to undertake it.

Within patronage networks, cultural intermediaries have long played curatorial roles as they help shape tastes and trends, informing people of what art is good, valuable and worthwhile buying or seeing. We can see platforms doing this as they make recommendations of who to follow on Instagram, which programmes to watch on Netflix and, in the case of Patreon, which people to support.

To curate content, information from the data layer of the platform stack is mobilised through the infrastructure layer to alter the experience of users in the network layer. For this to happen, however, a series of important transformations have to occur that enrol artist-creators and their work into a calculus of web metrics. This system is part of wider operational logic for online platforms which dates back to Web 1.0, where hits were key indicators of a website’s quality (Rogers, 2002), and has evolved as major platforms attempt to imbue the web with increased sociality through ‘likes’ and similar mechanisms to monitor user behavior (Gerlitz and Helmond, 2013). Platforms use this information about what users like, what they viewed, how long they viewed it for and how they rated it to enhance other layers.

The precise ways platform companies undertake these processes differs, but they are aligned through operational logics which value metrics as the best way to handle the huge amounts of data they collate.
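That shared logic can be caricatured in a few lines: a composite engagement score used to rank creators. The field names and weights below are invented for illustration; real platforms use far richer and more opaque signals:

```python
# Toy illustration of metric-driven curation: rank creators by a composite
# engagement score. Fields and weights are invented; the point is the
# collapse of qualitative appreciation into a single sortable number.

creators = [
    {"name": "A", "views": 12000, "likes": 300, "avg_watch_secs": 95},
    {"name": "B", "views": 4000,  "likes": 900, "avg_watch_secs": 240},
    {"name": "C", "views": 25000, "likes": 150, "avg_watch_secs": 40},
]

def engagement_score(c, w_views=0.001, w_likes=0.5, w_watch=1.0):
    # The 'transformation' the talk describes: value judgements become a
    # weighted sum the infrastructure layer can act on.
    return c["views"] * w_views + c["likes"] * w_likes + c["avg_watch_secs"] * w_watch

featured = sorted(creators, key=engagement_score, reverse=True)
print([c["name"] for c in featured])
```

Notice that simply changing the weights reorders who gets featured - the value judgement lives in the calculative device, not in any appraisal of the art itself.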

The first transformation is the redefinition of art into content and artists into content creators. The term ‘content’ is partly a semantic shift which fits the lexicon of web development and the need for sites to be filled with ‘content’, but it can be problematic for some who see it as devaluing their professional skill, judgement and expression.

Journalists…

As one participant put it:

This allows for a second transformation, where artistic value is transformed from qualitative appreciation and emotional reactions into quantifiable metrics that can be used to curate a website’s content or individual cultural producers using calculative devices. We’ve seen this done in television rating systems for decades, where viewing figures are seen as an indication of quality.

But for Gerlitz and Helmond, the quantification of online activity through metrics hides “a variety of affective responses such as excitement, agreement, compassion, understanding, but also ironic and parodist liking” behind a simple click. The outcomes of these transformations can be profound, potentially changing the nature of art as it is enrolled into platform ecosystems. Let’s examine how this happens on YouTube… this is important to understand, as YouTube is the biggest media platform and many Patreon artists use it to host their work. It is therefore important that Patreon is aligned with it.

YouTube’s data layer is mature and thick, but that doesn’t necessarily lead to more sophisticated curation.

A key calculative device for YouTube in this process is ‘watch time’, defined as ‘[t]he amount of time that a viewer has watched a video’. Participants familiar with this metric explained that it is more complex than that:

Patreon take a slightly different approach - they value artistic freedom and therefore try to avoid such reductive measures, but they still rely on metrics to curate content. They do this in a ‘featured’ section on the website, shown to everyone, and in recommendations made to individuals. Both methods use algorithms to aid curation and make value judgements in the process, but in different ways. I assumed both would involve incredibly complex code delving deep into multiple databases. One is based on code working between the infrastructure and data layers, but the other is done by a human being called Dave, working in the Patreon offices in San Francisco.

[that's not really Dave, that's stock image Dave]

[NB - these were the curation processes during 2016; they have changed since]

Confidentiality disallows inclusion of the full algorithmic procedures, but recommendations are generated from two processes which look for similar patterns of likes between patrons and generate recommendations through various filters.

A user sees five recommendations at a time on their profile page, refreshed from the pre-stored bank of recommendations the algorithm has generated. Judgements are written into these procedures based on various assumptions, but none of them relate to aesthetics, taste or artistic value - as on YouTube, it is various measures of popularity. Humans have made these decisions to align these procedures, and the platform’s functionality, with other platforms: metrics win the day.

The ‘featured’ pages on patreon.com offer users a selection of creators to explore. The list is updated periodically, but there is no individualisation in what users see. Generating this section involves a simple procedural algorithm executed by a member of Patreon staff: Dave. It begins when creators nominate themselves to be featured by completing a simple web form; Dave then removes artist-creators producing adult material, checks for fake or abandoned accounts, and adds the remainder to a database which updates the featured page.
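Dave’s procedure is algorithmic enough that it can be written down as a trivial filter pipeline. This is a sketch of the steps as described to me, not Patreon’s actual tooling; the predicate functions are hypothetical stand-ins for the checks Dave performs by hand.

```python
def build_featured(nominations, is_adult, is_fake_or_abandoned):
    """Dave's procedure as a filter pipeline: from the self-nominated
    creators, drop adult-content producers and fake or abandoned accounts,
    and keep everyone else in nomination order. Note what is absent:
    no step assesses the quality or significance of anyone's work."""
    featured = []
    for creator in nominations:
        if is_adult(creator):
            continue  # content that may deter users
        if is_fake_or_abandoned(creator):
            continue  # account hygiene, not artistic judgement
        featured.append(creator)
    return featured
```

Written out like this, it is clear the human is executing the same kind of value-free procedure a script would.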

Even though this curation is being undertaken directly by a human, there are no artistic value judgements made about the quality of a creator’s work, their significance or their potential. The overriding principle is to ensure that this prominent part of the website does not include content which may deter users, while exposing creators to new audiences. Here we can begin to see human behaviours starting to become aligned with web operational logics.

To conclude… The shift to algorithmic curation signals a move away from traditional curatorial expertise and methodologies to a metricised approach, where the role of human curators is no longer about direct appraisal, selection and recommendation of art based on expertise. But this isn’t democratisation. Instead, data analysts and authors of algorithmic procedures, together with major platforms, become central to the process. They and their devices can have profound and wide-reaching impacts for art and individuals.

The effects of this can already be seen online, where creators are producing media not because of its inherent value, but because it is similar to other popular work and will therefore generate hits and likes and raise their profile.

So, as crowd-patronage and media platforms act as intermediaries, as their processes increasingly align, and as they use calculative devices to mediate content, we are witnessing a change in what art is online. The alignment of platforms through interpenetration constrains what is valued and how it is valued, and although crowd patronage offers artists new streams of income, it enrols them into calculi of web metrics that potentially undermine what they do.

There is a generation of culture critics who don’t see this as a problem because they don’t appreciate the quality of art being produced for online distribution. But these approaches are spreading offline too, most prominently to the Arts Council, who want to use metrics to judge the quality of the work they fund.

Thinking beyond art, there are implications for what it means to be human within platform ecosystems and as people are judged through metrics.

On the use of lecture capture software

During the recent bad weather a student who was unable to make it to a lecture asked if I could record it and make it available through Panopto, the lecture capture platform Northumbria subscribes to. I've been thinking about this a lot recently, and below is an extended version of the email I sent the student explaining why my colleagues and I are not using it very much.


Dear Student

Lecture capture software is controversial at the moment. While it can act as a useful tool, there are a series of reasons why staff aren’t using it very much, and until these issues are resolved my colleagues and I are not using it. The arguments are below.

Pedagogically, Panopto and similar systems are seen as useful learning tools because they allow students to revisit a lecture. For science subjects where there is a correct answer, or in medicine where you want to be sure students learn everything, this makes sense. In discursive subjects, such as human geography, where value is placed on arguments, a system where students focus too much on what is said in lectures can lead to too narrow an understanding of the material. A much wider understanding comes from students reading around the topic and developing their own argument based on a variety of material, ideally going beyond what is covered in a lecture and on a reading list. We don’t want to reinforce a narrow mode of learning where students come to understand lectures as the key site for learning. You’ll see in assignment outlines that repeating what was in a lecture isn’t going to get very good marks. We want to foster independent learning and critical thinking skills so that when you graduate, and won’t necessarily have someone instructing you, you have the ability to develop your own ideas.

Related to the previous point, there is a concern that recording a lecture will see attendances drop. Dedicated students will likely still attend, but those with a different attitude may decide to miss classes but then not catch up. Data on viewings of online courses (MOOCs) shows that a very small percentage of students watch all the material.

Recording a lecture also has the potential to change the nature of the session. Staff may become more self-conscious, and everyone suffers because the information is not communicated as effectively as it might have been. Giving a lecture to a class of students whose attention you can track, and whose understanding you can check, is different to providing distance learning materials. Having students in the room allows a lecturer to reiterate points, expand on them, or even skip elements depending on how well the class is following. This process also helps lecturers get better at knowing what works and what doesn’t.

Some staff may object to being recorded for personal reasons. They may have had bad experiences in the past, recordings may have been misused, or there might be mental health issues exacerbated by the recording process. There is an increasing audit culture within academia, and while peer observation can be a useful process, knowing that management may log in and watch your teaching without you knowing may negatively influence performance management, development opportunities and promotion chances.

Hopefully you have noticed that colleagues and I like to make lectures interactive, involving the class by asking questions, having discussions and doing exercises. The feedback we get from students through various mechanisms is that this makes lectures better and more enjoyable. But this can be difficult to do when students are nervous about contributing, and we are acutely aware that we don’t want to make people feel uncomfortable. This is why we ask for volunteers to answer questions rather than calling on individuals who may suffer from anxiety or are not comfortable speaking in front of such a big class. If students know a lecture, and therefore their contribution to it, is being recorded, it’s going to change the nature of interaction, potentially stopping it from happening at all. That is something we want to avoid for pedagogic and mental health reasons. Turning the recording system on and off isn’t really an option, as that would impact the flow of a lecture in other ways.

There is currently no agreement between the University and the University and College Union about how lecture recordings can and should be used. There is a fear that once a set of lectures has been recorded the lecturer is no longer needed and can be asked to do other things, or worse, sacked. You’re probably aware Newcastle University staff have been on strike this week; they have an agreement that lecture recordings can only be made available to students to whom the lecture was delivered. However, there are suggestions this agreement has been breached and lectures recorded last year have been made available to students this year while staff have been protesting. This fundamentally undermines the right to withdraw labour and threatens the position of staff. This is obviously a very serious issue, and until a formal agreement has been reached many staff are reluctant to use Panopto.

In relation to agreements, there needs to be more clarity about who owns the performance element of a lecture recording. At the moment universities own the materials we produce for lectures, but what ownership they have over our delivery is unclear. Without formal agreements, recording lectures could create an assumption that universities own our performance copyright. An institution may seek additional value from a module by making the materials available through Panopto to students on other degrees, or even in other countries through franchise agreements. We would like to think our modules are interesting and of use to other students, but they are not designed for distance learning. Developing those kinds of materials is difficult and should be done, and rewarded, properly.

edit:

We discussed this issue in the staff-student consultation committee (SSCC) last week and Kevin Glynn made a great additional point. The ability to take notes and to listen actively is an incredibly important skill to learn. Kevin explained that this skill is especially useful post-university, when in meetings, talking to colleagues or clients/customers, at events and so on, you're told new information. It is a skill practised in lectures, and if you know a lecture is recorded and you can re-listen, it is tempting to switch off and stop actively listening.

The debate in SSCC was very constructive and the student reps completely understood our hesitation in recording everything. We also discussed the idea that if there is something which could reasonably be video captured (taking into account all the above), there should be a minimum attendance level before a session is recorded.

Memoryscapes Press Release

Memories brought to life in city arts project

Northumbria University has received a £60,000 grant to help develop new ways for people to share and access heritage and memories of significant events in Newcastle city centre.

Working in partnership with Tyne and Wear Archives and Museums and FaulknerBrowns Architects, the University will investigate how museum artefacts or historical events can be brought to life with input from members of the public, using technology in settings in and around the city.

Northumberland Street, 1915 from Newcastle Libraries Collection

 

The nine-month research project has been jointly funded by the Arts & Humanities Research Council (AHRC) and the Engineering and Physical Sciences Research Council (EPSRC) as part of a national scheme worth £1.88 million to explore the future of immersive experiences.

It will see academics from Northumbria’s Geography, Architecture, Computing and Humanities departments share their expertise in heritage studies, urban design, virtual environments, human-computer interaction and participatory methodologies with architects, urban planners, artists and experts from the North East’s digital innovation cluster.

They will produce a development framework and prototype ideas that will help the creation of immersive experiences and memory-based connections with the past in public spaces, referred to as ‘memoryscapes’.

Lead researcher, Dr Jon Swords, a Senior Lecturer in Economic Geography at Northumbria University, explained: “One of the problems we are seeking to address is that so many of our museum collections are housed in archives and are rarely seen. 

“We will be working with Tyne and Wear Archives and Museums to understand what assets they have and assess how we can best use them to create memoryscapes in public spaces. By bringing them to life outside of their usual contexts, we can encourage members of the public to add their own memories, to create new narratives.

“At this stage, the actual memoryscapes that we’ll explore are to be decided, but an example of what we could do is to project footage of historical events, such as the 1952 FA Cup Final, or recreations of places such as Stephenson’s works, into public spaces. Using virtual and augmented reality technologies, we can enable the public to become immersed in these moments and encourage them to participate by adding or uploading their own memories and responses.”

The project will coincide with the Great Exhibition of the North, providing a unique opportunity to engage with the public on ways to bring these memoryscapes to life. 

It will also support the work FaulknerBrowns is undertaking with Newcastle City Council on a major scheme to regenerate the area around Northumberland Street. This research will inform part of their planning on what could be delivered in the newly modelled city centre.

Dr Swords added: “We have some outstanding digital businesses in the North East of England. They are at the forefront of some amazing new technologies but what they don’t always have is the content to utilise them to their maximum potential. 

“This project will bring together the vast collections of Tyne and Wear Archives and Museums with the excellent skills of our digital sector; the academic expertise of the University and the planning and development teams at FaulknerBrowns and Newcastle City Council who are designing the new spaces within the city centre, to create some truly innovative experiences for the public.” 

Lindy Gilliland, Manager of Collections and Research at Tyne & Wear Archives & Museums, said: “We care for some of the region’s most historically significant collections which comprise fine and decorative art, science and technology, natural sciences, costume, archaeology, world cultures, military and social history. We also hold the region’s archives, which are a tremendous resource for investigating local and regional history. We’re hoping that the Memoryscapes project will encourage more public access to both everyday and internationally significant objects and documents.”

Tania Love, Director at FaulknerBrowns Architects, said: “We have been working with Newcastle City Council and NE1, preparing a series of interventions to reinvigorate the Northumberland Street Area to help position Newcastle as a vital, regional European capital. The masterplan aims to make the area a more vibrant, attractive and inclusive destination. This exciting Memoryscape project could helpfully contribute to this ambition and lead to some highly innovative and interactive experiences for residents, workers and visitors alike.”

The project forms part of Northumbria University’s multi-disciplinary research into Digital Living and how digital technology can be used to make the experience of living and working in a city better. The University is bringing together world-leading experts in artificial intelligence, information processing and modelling, architecture, built environment and human-computer interaction to explore the future of the human-centred smart city.

For more information on the project, visit www.northumbria.ac.uk/memoryscapes or follow Memoryscapes on Twitter.

Memoryscapes

Re-imagining place through immersive and participatory experiences that re-contextualise memory assets

Colleagues and I have been awarded funding from the AHRC-EPSRC Immersive Experiences call. As the call document outlines, the focus of the funding round is on:

"...encouraging research proposals exploring immersive experiences in three areas where the UK has world leading creative assets and technology partners [memory, place and performance]. These three areas have arts and humanities research at the core of developing experiences and practices. They also represent areas in which the benefits of research offer significant cultural and economic value for the UK."

The immersive industry is built around the use of a range of technologies including virtual and augmented reality, 3D audio effects, haptic technologies, machine olfaction, gesture and speech recognition, and bespoke software.

Northumberland Street, Newcastle upon Tyne. (From Newcastle City Library Photographic Collection)

Project Introduction

Our project seeks to develop a new framework to support the creation and application of immersive memoryscapes: multi-sensory and participatory experiences within public spaces that re-contextualise heritage assets, and reimagine and reinvigorate public spaces as destinations. These will provide connections with the past along with the capacity for users to contribute feedback and their own memories.

Our framework will combine academic understandings of the construction of these memoryscapes with practical guidelines for their creation and application. It will be scalable and offer not only new pathways for memory based organisations to disseminate their collections, but provide new approaches to enhance urban (re)development projects through the inclusion of immersive and participatory experiences. Through a series of interviews, desk-based research, collaborative workshops and public engagement, we will explore and evaluate ideas, challenges and opportunities for immersive experiences that employ memory assets to reinvigorate place as a destination. By examining the intersection of immersive experiences, memory assets and place, the proposed research aims to establish the potential application of immersive experiences to:

  • re-contextualise and increase access to, and dialogue about memory assets by bringing them out of the museum/gallery/archive and presenting them in new ways and in new locations
  • reimagine and reinvigorate our public spaces contributing to their character and identity, and our relationship to those places by utilising memory assets

Our key outputs will be:

  • New interdisciplinary partnerships which can go forward to the next round of funding to develop immersive experiences
  • A framework for understanding the generation of immersive and participatory memoryscapes, including 'prototype(s)' to illustrate potential ways forward

To help us we're working with two project partners:

Partner 1 - Tyne and Wear Archives and Museums

Tyne and Wear Archives and Museums is a major regional museum, art gallery and archives service based in North East England. They operate nine museums, support a further 55 and manage the region’s archives. Their collections are of international importance in archives, art, science and technology, archaeology, military and social history, fashion and natural sciences. TWAM will provide valuable insights and access to their collections, as well as expertise on user experience and curation of pasts.

Partner 2 - FaulknerBrowns Architects

FaulknerBrowns are a multi-award winning architectural practice with over 50 years of experience working nationally and internationally. They are recognized for their design, masterplanning, placemaking and strategic expertise, and have worked on multi-million pound projects for public and private sector clients. FaulknerBrowns bring to the project extensive research from their collaboration with Newcastle City Council on masterplanning the development of Newcastle’s principal retail area.

We'll also be working with immersive experience providers (including VRTGO members), urban designers, heritage organisations, civic bodies, artists, designers and academics.

The project lasts for nine months and will include a series of workshops to develop our outputs. If you're interested in this project we'll have a dedicated twitter and website up and running soon. In the meantime feel free to contact me.

Project team

Jon Swords (Principal Investigator) - Senior Lecturer in Economic Geography, Dept of Geography and Environmental Sciences, Northumbria University

Richard Watson (Co-Investigator) - Senior Research Fellow, Dept of Architecture and Built Environment, Northumbria University

James Charlton (Co-Investigator) - Lecturer in Architecture, Dept of Architecture and Built Environment, Northumbria University

Claire Nally (Co-Investigator) - Senior Lecturer in Twentieth-Century English Literature, Dept of Humanities, Northumbria University

Kay Rogage (Co-Investigator) - Vice-Chancellor’s Research Fellow in Digital Living, Dept of Computer and Information Sciences, Northumbria University

David Kirk (Co-Investigator) - Professor of Digital Living, Dept of Computer and Information Sciences, Northumbria University