HotCloud '15

connect with us


  •  Twitter
  •  Facebook
  •  LinkedIn
  •  Google+
  •  YouTube

twitter

Tweets by @usenix

usenix conference policies

  • Event Code of Conduct
  • Conference Network Policy
  • Statement on Environmental Responsibility Policy

You are here

Home » Towards Hybrid Programming in Big Data
Tweet

connect with us

Towards Hybrid Programming in Big Data

Authors: 

Peng Wang, Chinese Academy of Sciences; Hong Jiang, University of Nebraska–Lincoln; Xu Liu, College of William and Mary; Jizhong Han, Chinese Academy of Sciences

Abstract: 

Within the past decade, a number of parallel programming models have been developed for data-intensive (i.e., big data) applications. Typically, each model has its own strengths in performance or programmability for some kinds of applications but limitations for others. As a result, multiple programming models are often combined in a complementary manner to exploit their merits and hide their weaknesses. However, existing models can only be loosely coupled due to their isolated runtime systems.

In this paper, we present Transformer, the first system that supports hybrid programming models for data-intensive applications. Transformer has two unique contributions. First, Transformer offers a programming abstraction in a unified runtime system for different programming model implementations, such as Dryad, Spark, Pregel, and PowerGraph. Second, Transformer supports an efficient and transparent data sharing mechanism, which tightly integrates different programming models in a single program. Experimental results on Amazon’s EC2 cloud show that Transformer can flexibly and efficiently support hybrid programming models for data-intensive computing.
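The hybrid-programming idea the abstract describes can be illustrated with a minimal sketch: a dataflow-style (MapReduce/Spark-like) stage builds a graph that a vertex-centric (Pregel-like) stage then processes, with both stages operating on the same in-memory data rather than exchanging it through files. The function names and API below are hypothetical illustrations, not Transformer's actual interface.

```python
# Hypothetical hybrid-programming sketch: two programming models in one
# program, sharing data in memory (no serialize/export/import between stages).
from collections import defaultdict

def dataflow_stage(edge_lines):
    """MapReduce-style stage: parse raw edge records into an adjacency list."""
    adj = defaultdict(list)
    for line in edge_lines:          # "map": parse each record
        src, dst = line.split()
        adj[src].append(dst)         # "reduce": group edges by source vertex
    return adj

def graph_stage(adj, source, supersteps=10):
    """Pregel-style stage: compute BFS levels via superstep message passing."""
    level = {source: 0}
    frontier = [source]
    for step in range(1, supersteps + 1):
        messages = []
        for v in frontier:                 # each active vertex sends messages
            for nbr in adj.get(v, []):     # to its out-neighbors
                if nbr not in level:
                    level[nbr] = step
                    messages.append(nbr)
        if not messages:                   # all vertices vote to halt
            break
        frontier = messages
    return level

edges = ["a b", "a c", "b d", "c d", "d e"]
adj = dataflow_stage(edges)       # stage 1: dataflow model builds the graph
levels = graph_stage(adj, "a")    # stage 2: graph model reuses it in place
```

In a loosely coupled setup, the two stages would run on separate runtimes and hand off data through a distributed file system; the tight integration the paper claims corresponds to the direct in-memory handoff of `adj` above.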


Open Access Media

USENIX is committed to Open Access to the research presented at our events. Papers and proceedings are freely available to everyone once the event begins. Any video, audio, and/or slides that are posted after the event are also free and open to everyone. Support USENIX and our commitment to Open Access.

  • Wang PDF
  • View the slides
