Category Archives: Mobile Services

How to Onboard in Open Source Projects?

Onboarding is a process that helps newcomers become integrated members of their organisation. Successful onboarding programs can result in increased performance in conventional organisations, but there is little guidance on how to onboard new developers in Open Source Software (OSS) projects.

[blue = mentored, red = non-mentored]

In several studies, we examined how mentoring and project characteristics influence the effectiveness and efficiency of the onboarding process.

Recommendations for project leaders and managers:

  • Identify core developers who can spend a limited time on intensive mentoring. Provide direct incentives for mentoring. For example, the opportunity to get help for pending tasks can be attractive for potential mentors. Clearly limiting the duration of mentoring reduces the negative effect on the mentor’s performance in other project tasks and can reduce some of the resistance to participate.
  • Organize or sponsor collocated events, such as Hackathons, and use them to kick off the mentoring period. Face-to-face events can help team members and mentors to focus on problems which are difficult to overcome in a distributed setting, and can further boost the success of onboarding new members into virtual teams. Many open source projects already arrange periodic collocated events and welcome participation by newcomers. Engaging with these provides direct access to the project community.
  • Expect considerable variation in performance increases over time. Assessing the cost and outcomes of mentoring requires understanding onboarding as a learning process which does not proceed linearly. Some onboarding activity will not be publicly visible. Engage directly with mentors and newcomers to gain insight into how onboarding is progressing.
  • Adapt the onboarding program to project characteristics and culture. Take the maturity of the target project and its existing onboarding practices into account. Low-maturity projects may require more support to instill a productive mentoring culture, while mature projects may already have an existing culture of integrating new developers and may be ready for tailoring towards more specific inclusion targets.

Keywords: Onboarding, Organisational Socialisation, Open Source Software, Case Study, Mentoring, Software Teams, Distributed Development

Articles:

Read more about the general effect of onboarding support on newcomer activity and the moderating effect of project characteristics, such as age, number of contributors, and appeal, on the speed of the onboarding process:

  • Fabian Fagerholm, Alejandro S. Guinea, Jürgen Münch, Jay Borenstein. The Role of Mentoring and Project Characteristics for Onboarding in Open Source Software Projects. In Proceedings of the 8th ACM-IEEE International Symposium on Software Engineering and Measurement (ESEM 2014), Torino, Italy, September 2014.
    @inproceedings{ESEM2014B,
    author = {Fabian Fagerholm and Alejandro S. Guinea and Jürgen Münch and Jay Borenstein},
    title = {The Role of Mentoring and Project Characteristics for Onboarding in Open Source Software Projects},
    booktitle = {Proceedings of the 8th ACM-IEEE International Symposium on Software Engineering and Measurement (ESEM 2014)},
    year = {2014},
    month = {September},
    doi = {10.1145/2652524.2652540},
    address = {Torino, Italy}
    }

Read more about the developer activity during onboarding, the potential cost of mentoring in terms of lost productivity, and guidelines for using mentoring as an onboarding support mechanism:

  • Fabian Fagerholm, Alejandro Sanchez Guinea, Jay Borenstein, Jürgen Münch. Onboarding in Open Source Projects. IEEE Software, 31(6):54-61, 2014.
    @ARTICLE{software14, 
    author={Fabian Fagerholm and Alejandro Sanchez Guinea and Jay Borenstein and Jürgen Münch}, 
    journal={IEEE Software}, 
    title={Onboarding in Open Source Projects}, 
    year={2014}, 
    volume = {31},
    number = {6},
    publisher = {IEEE},
    URL = {http://www.computer.org/csdl/mags/so/2014/06/mso2014060054-abs.html},
    pages = {54-61},
    doi = {10.1109/MS.2014.107},
    keywords={open source software projects; virtual teams; mentoring; global software development; distributed software development; case study}}

How to Find out Which Features to Implement in Popular Smartphone Apps?

In the world of smartphone applications, there is a rich, continuously updated stream of user feedback and feature suggestions. Especially for smaller development teams, prioritizing these requests is of utmost importance. Prioritization is usually based on predictions or on customer interviews with questions such as “Would you buy that feature?”. However, predictions can be wrong and customer interviews suffer from contextual biases.

We developed an approach that allows us to determine the real business value of a feature. The approach is based on mock purchases and allows product managers and developers to measure a feature’s business value without having to implement it. Hence, it enables feature prioritization based on facts rather than on predictions. The rationale behind the approach is to eliminate contextual biases. On top of that, the approach allows us to experiment with feature pricing.

Figure 1. Approach.

Figure 1 depicts the elements of the approach:

  • Product Backlog: Each user is assigned only one of the features from the backlog that are to be tested. Thereby, potential contextual biases can be reduced to a minimum.
  • Description Page: For each feature, a custom description page is created which describes the feature and contains a button to load the feature’s price.
  • Purchase Page: After loading the price, the amount is displayed on the page and a purchase button appears.
  • Acknowledgement Page: After purchasing, the user receives an acknowledgement that the feature has not been implemented yet. The user is then asked whether they like, dislike, or very much dislike the approach, and can provide a custom message, for example if they need the feature urgently or want to complain.
  • Questionnaire Page: If the user cancels the purchase (i.e. tries to leave the page after having loaded the price), they are asked why. Users can indicate that they are not interested in the feature, that it is too expensive, that they were disappointed by other in-app purchases, or that they do not spend money on apps. They can also provide a custom message.
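The one-feature-per-user assignment from the Product Backlog step can be sketched as follows. This is a hypothetical illustration, not the study's actual implementation: the variant names and the hashing scheme are assumptions, chosen so that a returning user always sees the same offer.

```python
import hashlib

# Hypothetical backlog of feature/price variants
# (two features, each at three price tags).
BACKLOG = ["SSF-1", "SSF-3", "SSF-5", "HF-1", "HF-3", "HF-5"]

def assign_variant(user_id: str) -> str:
    """Deterministically map a user to exactly one backlog variant."""
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return BACKLOG[int(digest, 16) % len(BACKLOG)]

# Each user is exposed to only one feature at one price,
# and repeat visits yield the same variant.
assert assign_variant("user-42") == assign_variant("user-42")
```

Hashing the user ID instead of drawing a fresh random variant keeps the assignment stable without storing any per-user state on the server.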

The study was implemented in an app called Track My Life, which is used by Nokia CEO Stephen Elop and the Finnish Minister for European Affairs, Alexander Stubb, amongst others. The app is a GPS tracker that automatically collects the users’ location information in the background and analyses the data when the app is opened. Thereby, it can answer questions such as “How much time do I spend at home, at work, and on my way to work?”. On top of that, it provides statistics such as how many kilometers the user travels per day, week, and month, and at which places they spend most of their time.

Moreover, the app leverages user feedback by providing several mechanisms, e.g. Zendesk and Jira clients, for providing feedback to the developer.

The study started on April 5, 2013 and ended on April 23, 2013. Prior to that, the approach was implemented in both the iOS and Windows Phone versions of the app. The implementation took about 1.5 days per platform due to specifics such as the possibility to enable and disable the study remotely and the conversion of feature prices to the users’ local currencies in intervals which correspond to their smartphone operating system’s pricing tiers (i.e. converting 1€ to a flat 70 rupees rather than to 71.54 rupees). Figure 2 shows the implementation of the approach on iOS (left) and Windows Phone (right).
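The price-tier conversion described above can be sketched as a small snapping step. The exchange rate and tier values below are invented for illustration; in practice they would come from a currency service and the respective app store’s price list.

```python
# Hypothetical sketch: convert a base EUR price into a local currency and
# snap it to the nearest platform pricing tier (rate and tiers are examples).
EUR_TO_INR = 71.54                                 # example rate, not live data
INR_TIERS = [10, 30, 50, 70, 120, 190, 250, 350]   # example store tiers

def localised_price(base_eur: float) -> int:
    """Return the store price tier closest to the converted amount."""
    raw = base_eur * EUR_TO_INR
    return min(INR_TIERS, key=lambda tier: abs(tier - raw))

print(localised_price(1.0))  # prints 70 -- a flat tier, not 71.54
```

Snapping to the store’s own tiers keeps the mock price indistinguishable from a real in-app purchase price, which matters for avoiding contextual bias.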

Figure 2. Implementation on iOS and Windows Phone.

Finding the Right Features to Implement

Figure 3 depicts the number of purchases that were made as well as the hypothetical revenue attached to them. Each of the six columns represents one feature at a given price tag. In total, two features were investigated, each at the price tags EUR 1, EUR 3, and EUR 5. As an example, SSF denotes the first feature, and the number 3 indicates that the base price (i.e. the price that was converted into the users’ currencies) was EUR 3.

Figure 3: Number of purchases and revenue.

As anticipated, Figure 3 shows a correlation between the number of purchases and the price of a feature. Moreover, it allows us to compare the two features and to make judgements about feature pricing. For instance, EUR 5 for the HF feature yields a lower total revenue than EUR 3. In contrast, the maximum revenue for SSF seems to lie at an even higher price tag than EUR 5. Moreover, the revenue created with SSF is higher than that created with HF.
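The revenue comparison behind Figure 3 can be expressed as a small computation. The purchase counts below are invented placeholders, not the study’s data; they are chosen only so that the example reproduces the qualitative pattern described above.

```python
# Hypothetical mock-purchase counts per (feature, EUR price) -- illustrative only.
purchases = {
    ("SSF", 1): 30, ("SSF", 3): 18, ("SSF", 5): 14,
    ("HF", 1): 22, ("HF", 3): 12, ("HF", 5): 5,
}

def best_price(feature: str) -> int:
    """Price tag that maximises hypothetical revenue (price x purchases)."""
    revenue = {p: p * n for (f, p), n in purchases.items() if f == feature}
    return max(revenue, key=revenue.get)

# With these invented counts, HF peaks at EUR 3 while SSF peaks at EUR 5.
print(best_price("HF"), best_price("SSF"))  # prints: 3 5
```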

Figure 4: Number of users who said that they did not buy the feature because it is too expensive.

Figure 4 shows the number of users who regarded the feature as too expensive when surveyed about why they cancelled the purchase (i.e. tried to navigate back from the feature’s description page after loading the price). It underlines the point that there is a correlation between a feature’s price and the number of purchases and also that fewer users are willing to pay for the feature HF at a price tag of EUR 5 than for SSF at the same price.

14% of the users were unsatisfied (i.e. selected I dislike the approach) or very unsatisfied (i.e. selected I am very annoyed by the approach). However, the vast majority of users understood the purpose of the experiment.

We showed that the approach allows us to determine the real business value of a feature as well as its ideal price.

The study is described in the following article and will be presented at the Lean Enterprise Software and Systems Conference in Galway, Ireland (December 1-4, 2013). The reference for the article is the following:

  • Alexander-Derek Rein, Jürgen Münch. Feature Prioritization Based on Mock Purchase: A Mobile Case Study. In Proceedings of the Lean Enterprise Software and Systems Conference (LESS 2013, Galway, Ireland, December 1-4), volume 167 of LNBIP, pages 165-179. Springer Berlin Heidelberg, 2013.
    @inproceedings{LESS2013b,
      author    = {Alexander-Derek Rein and Jürgen Münch},
      title     = {Feature Prioritization Based on Mock Purchase: A Mobile Case Study},
      booktitle = {Proceedings of the Lean Enterprise Software and Systems Conference (LESS 2013, Galway, Ireland, December 1-4)},
      publisher = {Springer Berlin Heidelberg},
    series = {LNBIP},
    volume = {167},
    pages = {165-179},
      year      = {2013},
    doi = {10.1007/978-3-642-44930-7_11},
    url = {http://link.springer.com/chapter/10.1007/978-3-642-44930-7_11}
    }