Originally posted to the Google Cloud Platform blog

When you write applications that run on Google Compute Engine instances, you might want to connect them to Google Cloud Storage, Google BigQuery, and other Google Cloud Platform services. Those services use OAuth2, the global standard for authorization, to help ensure that only the right callers can make the right calls. Unfortunately, OAuth2 has traditionally been hard to use. It often requires specialized knowledge and a lot of boilerplate auth setup code just to make an initial API call.

Today, with Application Default Credentials (ADC), we're making things easier. In many cases, all you need is a single line of auth code in your app:

Credential credential = GoogleCredential.getApplicationDefault();
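
For a slightly fuller picture, here's a minimal sketch (assuming the Google API Client Library for Java; the Cloud Storage scope is purely illustrative) of obtaining and scoping the credential:

import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import java.util.Collections;

// Looks for credentials in this order: the GOOGLE_APPLICATION_CREDENTIALS
// environment variable, credentials saved by the gcloud tool, and finally
// the built-in service account on App Engine or Compute Engine.
GoogleCredential credential = GoogleCredential.getApplicationDefault();

// Some environments hand back unscoped credentials; add the scopes your API needs.
if (credential.createScopedRequired()) {
  credential = credential.createScoped(
      Collections.singleton("https://www.googleapis.com/auth/devstorage.read_only"));
}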

If you're not already familiar with auth concepts, including 2LO, 3LO, and service accounts, you may find this introduction useful.

ADC takes all that complexity and packages it behind a single API call. Under the hood, it makes use of:

  • 2-legged vs. 3-legged OAuth (2LO vs. 3LO) -- OAuth2 includes support for user-owned data, where the user, the API provider, and the application developer all need to participate in the authorization dance. Most Cloud APIs don't deal with user-owned data, and therefore can use much simpler two-party flows between the API provider and the application developer.
  • gcloud CLI -- while you're developing and debugging your app, you probably already use the gcloud command-line tool to explore and manage Cloud Platform resources. ADC lets your application piggyback on the auth flows in gcloud, so you only have to set up your credentials once.
  • service accounts -- if your application runs on Google App Engine or Google Compute Engine, it automatically has access to the built-in "service account", which helps the API provider trust that the API calls are coming from a trusted source. ADC lets your application benefit from that trust.

You can find more about Google Application Default Credentials here. ADC is available for Java, Python, Node.js, Ruby, and Go. Libraries for PHP and .NET are in development.

Posted by Jeanie Santoso, Merchandise Marketing Manager

Chromecast, our first Google Cast device, has seen great success with 17 million devices already sold and over 1.5 billion touches of the Cast button. Consumers now get all the benefits of their easy-to-use personal mobile devices, with content displayed on the largest and most beautiful screen in the house. By adding Google Cast functionality to their apps, developers can gain more visits, deeper engagement, and higher monetization. Here are four real-world examples showing how very different companies are successfully using Google Cast technology. Read on to learn more about their successes and how to get started.

Comedy Central sees 50% more videos viewed by Chromecast users

The Comedy Central app lets fans watch their favorite shows in full and on demand from mobile devices. The company created a cast-enabled app so users could bring their small screen experience to their TVs. Now with Chromecast, users watch at least 50 percent more video, with 1.5 times more visits than the average Comedy Central app user. “The user adoption and volume we saw immediately at launch was pleasantly surprising,” says Ben Hurst, senior vice president, mobile and emerging platforms, Viacom Music and Entertainment Group. “We feel that Chromecast was a clear success for the Comedy Central app.”

Read the full Comedy Central case study here

Just Dance Now sees 2.5x monetization with Chromecast users

Interactive-game giant Ubisoft adopted Google Cast technology as a new way to experience their Just Dance Now game. As the game requires a controller and a main screen on which the game is displayed, Ubisoft saw Chromecast as the simplest and most accessible way to play. Chromecast represents 30 percent of all songs launched in the game in the US. Chromecast players monetize 2.5 times better than other players: they're more engaged, playing longer and more often. Ubisoft has also seen more long-term subscribers with Chromecast. “The best Just Dance Now experience is on a big screen, and Chromecast brings an amazingly quick launch and ease of use for players to get into the game,” says Björn Törnqvist, Ubisoft technical director.

Read the full Just Dance Now case study here

Fitnet sees 35% higher engagement in Chromecast users

Fitnet is a mobile app that delivers video workouts and converts your smartphone’s or tablet’s camera into a biometric sensor to intelligently analyze your synchronicity with the trainer. The app provides a real-time score based on the user’s individual performance. The company turned to Chromecast to provide a compelling, integrated big-screen user experience so users don’t need to stare at a tiny display to follow along. Chromecast users now perform 35 percent better on key engagement metrics Fitnet regards as critical to its success, such as logins, exploring new workouts, and actively engaging in workout content. “Integrating with Google Cast technology has been an excellent investment of our time and resources, and a key feature that has helped us to develop a unique, compelling experience for our users,” says Bob Summers, Fitnet founder and CEO.

Read the full Fitnet case study here

Haystack TV doubles average weekly viewing time

Haystack TV is a personal news channel that lets consumers watch news on any screen, at any time. The company integrated Google Cast technology so users can browse their headline news, choose other videos to play, and even remove videos from their play queue without disrupting the current video on their TV. With Chromecast, average weekly viewing time has doubled. One-third of Haystack TV users now view their news via Chromecast. “We’re in the midst of a revolution in the world of television. More and more people are ‘cutting the cord’ and favoring over-the-top (OTT) services such as Haystack TV,” says Ish Harshawat, Haystack TV co-founder. “Chromecast is the perfect device for watching Haystack TV on the big screen. We wouldn't be where we are today without Chromecast.”

Read the full Haystack TV case study here

Integrate with Google Cast technology today

More and more developers are discovering what Google Cast technology can do for their app. Check out the Google Cast SDK for API references and take a look at our great sample apps to help get you started.

To learn more, visit developers.google.com/cast

Posted by Chandu Thota, Engineering Director and Matthew Kulick, Product Manager

Just like lighthouses have helped sailors navigate the world for thousands of years, electronic beacons can be used to provide precise location and contextual cues within apps to help you navigate the world. For instance, a beacon can label a bus stop so your phone knows to have your ticket ready, or a museum app can provide background on the exhibit you’re standing in front of. Today, we’re beginning to roll out a new set of features to help developers build apps using this technology. This includes a new open format for Bluetooth low energy (BLE) beacons to communicate with people’s devices, a way for you to add this meaningful data to your apps and to Google services, as well as a way to manage your fleet of beacons efficiently.

Eddystone: an open BLE beacon format

Working closely with partners in the BLE beacon industry, we’ve learned a lot about the needs and the limitations of existing beacon technology. So we set out to build a new class of beacons that addresses real-life use cases, cross-platform support, and security.

At the core of what it means to be a BLE beacon is the frame format—i.e., a language—that a beacon sends out into the world. Today, we’re expanding the range of use cases for beacon technology by publishing a new and open format for BLE beacons that anyone can use: Eddystone. Eddystone is robust and extensible: It supports multiple frame types for different use cases, and it supports versioning to make introducing new functionality easier. It’s cross-platform, capable of supporting Android, iOS or any platform that supports BLE beacons. And it’s available on GitHub under the open-source Apache v2.0 license, for everyone to use and help improve.
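
As a concrete illustration, an Android app can tell Eddystone frames apart by the first byte of the beacon's service data. The 0xFEAA service UUID and the frame-type values below come from the published spec; the helper class itself is just a sketch:

import android.bluetooth.le.ScanResult;
import android.os.ParcelUuid;

public class EddystoneFrameType {
  // Eddystone frames are broadcast as service data under this 16-bit UUID.
  private static final ParcelUuid EDDYSTONE_SERVICE_UUID =
      ParcelUuid.fromString("0000FEAA-0000-1000-8000-00805F9B34FB");

  public static String of(ScanResult result) {
    byte[] data = result.getScanRecord().getServiceData(EDDYSTONE_SERVICE_UUID);
    if (data == null || data.length == 0) {
      return "not an Eddystone frame";
    }
    switch (data[0]) {
      case (byte) 0x00: return "UID";  // stable beacon identifier
      case (byte) 0x10: return "URL";  // compressed URL, used by the Physical Web
      case (byte) 0x20: return "TLM";  // telemetry: battery, temperature, counters
      default:          return "unknown frame type";
    }
  }
}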

By design, a beacon is meant to be discoverable by any nearby Bluetooth Smart device via its identifier, which is a public signal. At the same time, privacy and security are really important, so we built in a feature called Ephemeral Identifiers (EIDs), which change frequently and allow only authorized clients to decode them. EIDs will enable you to securely do things like find your luggage once you get off the plane or find your lost keys. We’ll publish the technical specs of this design soon.


Eddystone for developers: Better context for your apps

Eddystone offers two key developer benefits: better semantic context and precise location. To support these, we’re launching two new APIs. The Nearby API for Android and iOS makes it easier for apps to find and communicate with nearby devices and beacons, such as a specific bus stop or a particular art exhibit in a museum, providing better context. And the Proximity Beacon API lets developers associate semantic location (i.e., a place associated with a lat/long) and related data with beacons, stored in the cloud. This API will also be used in existing location APIs, such as the next version of the Places API.

Eddystone for beacon manufacturers: Single hardware for multiple platforms

Eddystone’s extensible frame formats allow hardware manufacturers to support multiple mobile platforms and application scenarios with a single piece of hardware. An existing BLE beacon can be made Eddystone compliant with a simple firmware update. Because we built Eddystone as an open, extensible and interoperable protocol, we’ll also introduce an Eddystone certification process in the near future, working closely with hardware manufacturing partners. We already have a number of partners that have built Eddystone-compliant beacons.

Eddystone for businesses: Secure and manage your beacon fleet with ease

As businesses move from validating their beacon-assisted apps to deploying beacons at scale in places like stadiums and transit stations, hardware installation and maintenance can be challenging: which beacons are working, broken, missing or displaced? So starting today, beacons that implement Eddystone’s telemetry frame (Eddystone-TLM) in combination with the Proximity Beacon API’s diagnostic endpoint can help deployers monitor their beacons’ battery health and displacement—common logistical challenges with low-cost beacon hardware.

Eddystone for Google products: New, improved user experiences

We’re also starting to improve Google’s own products and services with beacons. Google Maps launched beacon-based transit notifications in Portland earlier this year, to help people get faster access to real-time transit schedules for specific stations. And soon, Google Now will also be able to use this contextual information to help prioritize the most relevant cards, like showing you menu items when you’re inside a restaurant.

We want to make beacons useful even when a mobile app is not available; to that end, the Physical Web project will be using Eddystone beacons that broadcast URLs to help people interact with their surroundings.

Beacons are an important way to deliver better experiences for users of your apps, whether you choose to use Eddystone with your own products and services or as part of a broader Google solution like the Places API or Nearby API. The ecosystem of app developers and beacon manufacturers is important in pushing these technologies forward and the best ideas won’t come from just one company, so we encourage you to get some Eddystone-supported beacons today from our partners and begin building!

Update (July 16, 2015, 11:30am PST): To clarify, beacons registered with proper place identifiers (as defined in our Places API) will be used in the Place Picker. You have to use the Proximity Beacon API to map a beacon to a place identifier. See the post on Google's Geo Developer Blog for more details.

Posted by Akshay Kannan, Product Manager

Mobile phones have made it easy to communicate with anyone, whether they’re right next to you or on the other side of the world. The great irony, however, is that those interactions can often feel really awkward when you're sitting right next to someone.

Today, it takes several steps -- whether it’s exchanging contact information, scanning a QR code, or pairing via Bluetooth -- to get a simple piece of information to someone right next to you. Ideally, you should be able to just turn to them and do so, the same way you do in the real world.

This is why we built Nearby. Nearby provides a proximity API, Nearby Messages, for iOS and Android devices to discover and communicate with each other, as well as with beacons.

Nearby uses a combination of Bluetooth, Wi-Fi, and inaudible sound (using the device’s speaker and microphone) to establish proximity. We’ve incorporated Nearby technology into several products, including Chromecast Guest Mode, Nearby Players in Google Play Games, and Google Tone.

With the latest release of Google Play services 7.8, the Nearby Messages API becomes available to all developers across iOS and Android devices (Gingerbread and higher). Nearby doesn’t use or require a Google Account. The first time an app calls Nearby, users get a permission dialog to grant that app access.
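
Here's a minimal sketch of publishing and subscribing (assuming a connected GoogleApiClient; the payload and listener body are illustrative):

import android.util.Log;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.nearby.Nearby;
import com.google.android.gms.nearby.messages.Message;
import com.google.android.gms.nearby.messages.MessageListener;

GoogleApiClient client = new GoogleApiClient.Builder(this)
    .addApi(Nearby.MESSAGES_API)
    .build();

MessageListener listener = new MessageListener() {
  @Override
  public void onFound(Message message) {
    // Called when a matching message from a nearby device is detected.
    Log.i("Nearby", "Found: " + new String(message.getContent()));
  }
};

// Once the client is connected, broadcast a small payload and listen for others.
// The first call triggers the Nearby permission dialog mentioned above.
Nearby.Messages.publish(client, new Message("hello".getBytes()));
Nearby.Messages.subscribe(client, listener);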

A few of our partners have built creative experiences to show what's possible with Nearby.

Edjing Pro uses Nearby to let DJs publish their tracklist to people around them. The audience can vote on tracks that they like, and their votes are updated in real time.

Trello uses Nearby to simplify sharing. Share a Trello board to the people around you with a tap of a button.

Pocket Casts uses Nearby to let you find and compare podcasts with people around you. Open the Nearby tab in Pocket Casts to view a list of podcasts that people around you have, as well as podcasts that you have in common with others.

Trulia uses Nearby to simplify the house hunting process. Create a board and use Nearby to make it easy for the people around you to join it.

To learn more, visit developers.google.com/nearby.

Posted by Addy Osmani, Staff Developer Platform Engineer

Back in 2014, Google published the material design specification with a goal to provide guidelines for good design and beautiful UI across all device form factors. Today we are releasing our first effort to bring this to websites using vanilla CSS, HTML and JavaScript. We’re calling it Material Design Lite (MDL).

MDL makes it easy to add a material design look and feel to your websites. The “Lite” part of MDL comes from several key design goals: MDL has few dependencies, making it easy to install and use. It is framework-agnostic, meaning MDL can be used with any of the rapidly changing landscape of front-end tool chains. MDL has a low overhead in terms of code size (~27KB gzipped), and a narrow focus—enabling material design styling for websites.

Get started now and give it a spin or try one of our examples on CodePen.

MDL is a complementary implementation to the Paper elements built with Polymer. The Paper elements are fully encapsulated components that can be used individually or composed together to create a material design-style site, and support more advanced user interaction. That said, MDL can be used alongside the Polymer element counterparts.

Out-of-the-box Templates

MDL is optimized for websites heavy on content, such as marketing pages, text articles and blogs. We've built responsive templates to show the range of sites that can be created using MDL; they can be downloaded from our Templates page. We hope these inspire you to build great looking sites.

The templates cover blogs, text-heavy content sites, dashboards, standalone articles, and more.

Technical details and browser support

MDL includes a rich set of components, including material design buttons, text fields, tooltips, spinners and many more. It also includes a responsive grid and breakpoints that adhere to the new material design adaptive UI guidelines.

The MDL sources are written in Sass using BEM. While we hope you'll use our theme customizer or pre-built CSS, you can also download the MDL sources from GitHub and build your own version. The easiest way to use MDL is by referencing our CDN, but you can also download the CSS or import MDL via npm or Bower.

The complete MDL experience works in all modern evergreen browsers (Chrome, Firefox, Opera, Edge) and Safari, but gracefully degrades to CSS-only in browsers like IE9 that don’t pass our Cutting-the-mustard test. Our browser compatibility matrix has the most up to date information on the browsers MDL officially supports.

More questions?

We've been working with the designers evolving material design to build in additional thinking for the web. This includes working on solutions for responsive templates, high-performance typography and missing components like badges. MDL is spec-compliant today and provides guidance on aspects of the spec that are still evolving. As with the material design spec itself, your feedback and questions will help us evolve MDL, and in turn, how material design works on the web.

We’re sure you have plenty of questions and we have tried to cover some of them in our FAQ. Feel free to hit us up on GitHub or Stack Overflow if you have more. :)

Wrapping up

MDL is built on the core technologies of the web you already know and use every day—CSS, HTML and JS. By adopting MDL into your projects, you gain access to an authoritative and highly curated implementation of material design for the web. We can’t wait to see the beautiful, modern, responsive websites you’re going to build with Material Design Lite.

Posted by Leon Nicholls, Developer Programs Engineer

Remote Display on Google Cast allows your app to display on both your mobile device and your Cast device at the same time. Processing is a programming language that allows artists and hobbyists to create advanced graphics and interactive exhibitions. By putting these two things together we were able to quickly create stunning visual art and display it on the big screen just by bringing our phone to the party or gallery. This article describes how we added support for the Google Cast Remote Display APIs to Processing for Android and how you can too.

An example app from the popular Processing toxiclibs library on Cast. Download the code and run it on your own Chromecast!

A little background

Processing has its own IDE and has many contributed libraries that hide the technical details of various input, output and rendering technologies. Users of Processing with just basic programming skills can create complicated graphical scenes and visualizations.

To write a program in the Processing IDE you create a “sketch”, adding code to life-cycle callbacks that initialize and draw the scene. You can run the sketch as a Java program on your desktop. You can also enable support for Processing for Android and then run the same sketch as an app on your Android mobile device. Processing for Android also supports touch events and sensor data for interacting with the generated apps.

Instead of just viewing the graphics on the small screen of the Android device, we can do better by projecting the graphics on a TV screen. Google Cast Remote Display APIs makes it easy to bring graphically intensive apps to Google Cast receivers by using the GPUs, CPUs and sensors available on the mobile devices you already have.

How we did it

Adding support for Remote Display involved modifying the Processing for Android Mode source code. To compile the Android Mode you first need to compile the source code of the Processing IDE. We started with the source code of the current stable release version 2.2.1 of the Processing IDE and compiled it using its Ant build script (detailed instructions are included along with the code download). We then downloaded the Android SDK and source code for the Android Mode 0232. After some minor changes to its build config to support the latest Android SDK version, we used Ant to build the Android Mode zip file. The zip file was unzipped into the Processing IDE modes directory.

We then used the IDE to open one of the Processing example sketches and exported it as an Android project. In the generated project we replaced the processing-core.jar library with the source code for Android Mode. We also added a Gradle build config to the project and then imported the project into Android Studio.

The main Activity for a Processing app is a descendant of the Android Mode PApplet class. The PApplet class uses a GLSurfaceView for rendering 2D and 3D graphics. We needed to change the code to use that same GLSurfaceView for the Remote Display API.

It is a requirement in the Google Cast Design Checklist for the Cast button to be visible on all screens. We changed PApplet to be an ActionBarActivity so that we can show the Cast button in the action bar. The Cast button was added by using a MediaRouteActionProvider. To only list Google Cast devices that support Remote Display, we used a MediaRouteSelector with an App ID we obtained from the Google Cast SDK Developer Console for a Remote Display Receiver.
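
In code, the Cast button setup looks roughly like this (a hedged sketch using the v7 support library; the menu resources and the REMOTE_DISPLAY_APP_ID constant are our own placeholders):

import android.support.v4.view.MenuItemCompat;
import android.support.v7.app.MediaRouteActionProvider;
import android.support.v7.media.MediaRouteSelector;
import android.view.Menu;
import android.view.MenuItem;
import com.google.android.gms.cast.CastMediaControlIntent;

@Override
public boolean onCreateOptionsMenu(Menu menu) {
  super.onCreateOptionsMenu(menu);
  getMenuInflater().inflate(R.menu.main, menu);  // menu XML declares the Cast button item
  MenuItem castItem = menu.findItem(R.id.media_route_menu_item);
  MediaRouteActionProvider provider =
      (MediaRouteActionProvider) MenuItemCompat.getActionProvider(castItem);
  // Only list Cast devices that can run our Remote Display receiver.
  provider.setRouteSelector(new MediaRouteSelector.Builder()
      .addControlCategory(
          CastMediaControlIntent.categoryForRemoteDisplay(REMOTE_DISPLAY_APP_ID))
      .build());
  return true;
}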

Next, we created a class called PresentationService that extends CastRemoteDisplayLocalService. The service allows the app to keep the remote display running even when it goes into the background. The service requires a CastPresentation instance for displaying its content. The CastPresentation instance uses the GLSurfaceView from the PApplet class for its content view. However, setting the CastPresentation content view requires some changes to PApplet so that the GLSurfaceView isn’t initialized in its onCreate, but waits until the service onRemoteDisplaySessionStarted callback is invoked.
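
In outline, the service and its presentation look something like this (a sketch: the real code wires in the view PApplet already created, which we stand in for with a glSurfaceView field):

import android.content.Context;
import android.opengl.GLSurfaceView;
import android.os.Bundle;
import android.view.Display;
import com.google.android.gms.cast.CastPresentation;
import com.google.android.gms.cast.CastRemoteDisplayLocalService;

public class PresentationService extends CastRemoteDisplayLocalService {
  // Assumption: the GLSurfaceView created by the PApplet, handed to this service.
  private GLSurfaceView glSurfaceView;
  private CastPresentation presentation;

  @Override
  public void onCreatePresentation(Display display) {
    presentation = new SketchPresentation(this, display);
    presentation.show();  // the remote display session is now live
  }

  @Override
  public void onDismissPresentation() {
    if (presentation != null) {
      presentation.dismiss();
      presentation = null;
    }
  }

  private class SketchPresentation extends CastPresentation {
    SketchPresentation(Context context, Display display) {
      super(context, display);
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
      super.onCreate(savedInstanceState);
      // Reuse the GLSurfaceView that PApplet would otherwise show locally.
      setContentView(glSurfaceView);
    }
  }
}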

When the user selects a Cast device in the Cast button menu and the MediaRouter onRouteSelected event is called, the service is started with CastRemoteDisplayLocalService.startService. When the user disconnects from a Cast device using the Cast button, MediaRouter onRouteUnselected event is called and the service is stopped by using CastRemoteDisplayLocalService.stopService.

For the mobile display, we display an image bitmap and forward the PApplet touch events to the existing surfaceTouchEvent method. When you run the Android app, you can use touch gestures on the display of the mobile device to control the interaction on the TV. Take a look at this video of some of the Processing apps running on a Chromecast.
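
That forwarding amounts to a one-liner in the Activity (surfaceTouchEvent is the existing PApplet method mentioned above):

import android.view.MotionEvent;

@Override
public boolean onTouchEvent(MotionEvent event) {
  // Touches on the phone's screen drive the sketch rendered on the TV.
  return surfaceTouchEvent(event);
}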

Most of the new code is contained in the PresentationService and RemoteDisplayHelper classes. Your mobile device needs to have at least Android KitKat and Google Play services version 7.5.71.

You can too

Now you can try the Remote Display APIs in your own Processing apps. Instead of changing the generated code every time you export your Android Mode project, we recommend using our project as a base and copying your generated Android code and libraries into it. Then modify the project build file and update the manifest to start the app with your sketch’s main Activity.

To see a more detailed description on how to use the Remote Display APIs, read our developer documentation. We are eager to see what Processing artists can do with this code in their projects.

Posted by Angana Ghosh, Lead Product Manager, Google Fit

To help users keep track of their physical activity, we recently updated the Google Fit app with some new features, including an Android Wear watch face that helps users track their progress throughout the day. We also added data types to the Google Fit SDK and have new partners tracking data (e.g. nutrition, sleep, etc.) that developers can now use in their own apps. Find out how to integrate Google Fit into your app and read on to check out some of the cool new stuff you can do.

Distance traveled per day

The Google Fit app now computes the distance traveled per day. Subscribe to it using the Recording API and query it using the History API.

Calories burned per day

If a user has entered their details into the Google Fit app, the app now computes their calories burned per day. Subscribe to it using the Recording API and query it using the History API.
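
Both of these daily metrics follow the same subscribe-then-query pattern. Here's a minimal sketch (assuming a connected GoogleApiClient named client and a startMillis/endMillis time range):

import java.util.concurrent.TimeUnit;
import com.google.android.gms.fitness.Fitness;
import com.google.android.gms.fitness.data.DataType;
import com.google.android.gms.fitness.request.DataReadRequest;
import com.google.android.gms.fitness.result.DataReadResult;

// Recording API: ask Google Fit to keep collecting these types in the background.
Fitness.RecordingApi.subscribe(client, DataType.TYPE_DISTANCE_DELTA);
Fitness.RecordingApi.subscribe(client, DataType.TYPE_CALORIES_EXPENDED);

// History API: read the data back, bucketed into per-day totals.
DataReadRequest request = new DataReadRequest.Builder()
    .aggregate(DataType.TYPE_DISTANCE_DELTA, DataType.AGGREGATE_DISTANCE_DELTA)
    .aggregate(DataType.TYPE_CALORIES_EXPENDED, DataType.AGGREGATE_CALORIES_EXPENDED)
    .bucketByTime(1, TimeUnit.DAYS)
    .setTimeRange(startMillis, endMillis, TimeUnit.MILLISECONDS)
    .build();
DataReadResult result =
    Fitness.HistoryApi.readData(client, request).await(1, TimeUnit.MINUTES);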

Nutrition data from LifeSum, Lose It!, and MyFitnessPal

LifeSum and Lose It! are now writing nutrition data, like calories consumed, macronutrients (proteins, carbs, fats), and micronutrients (vitamins and minerals) to Google Fit. MyFitnessPal will start writing this data soon too. Query it from Google Fit using the History API.

Sleep activity from Basis Peak and Sleep as Android

Basis Peak and Sleep as Android are now writing sleep activity segments to Google Fit. Query this data using the History API.

New workout sessions and activity data from even more great apps and fitness wearables!

Endomondo, Garmin, Daily Burn, Basis Peak and the Xiaomi miBand are new Google Fit partners that will allow users to store their workout sessions and activity data. Developers can access this data with permission from the user, and it will also be shown in the Google Fit app.

How are developers using the Google Fit platform?

Partners like LifeSum and Lose It! are reading all-day activity data to help users keep track of their physical activity in their favorite fitness app.

Runkeeper now shows a Google Now card to its users encouraging them to “work off” their meals, based on their meals written to Google Fit by other apps.

Instaweather has integrated Google Fit into a new Android Wear watch face that they’re testing in beta. To try out the face, first join this Google+ community and then follow the link to join the beta and download the app.

We hope you enjoy checking out these Google Fit updates. Thanks to all our partners for making it possible! Find out more about integrating the Google Fit SDK into your app.

Posted by Leon Nicholls, Developer Programs Engineer and Antonio Fontan, Software Engineer

At Google I/O 2015 we announced the new Google Cast Remote Display APIs for Android and iOS that make it easy for mobile developers to bring graphically intensive apps or games to Google Cast receivers. Now you can use the powerful GPUs, CPUs and sensors of the mobile device in your pocket to render both a local display and a virtual one to the TV. This dual display model also allows you to design new game experiences for the display on the mobile device to show maps, game pieces and private game information.

We wanted to show you how easy it is to take an existing high-performance game and run it on a Chromecast. So, we decided to port the classic Quake® III Arena open source engine to support Cast Remote Display. We reached out to id Software and they thought it was a cool idea too. When all was said and done, during our 2015 I/O session “Google Cast Remote Display APIs for Games” we were able to present the game in 720p at 60 fps!

During the demo we used a wired USB game controller to play the game, but we've also experimented with using the mobile device sensors, a Bluetooth controller, a toy gun and even a dance mat as game controllers.

Since you're probably wondering how you can do this too, here are the details of how we added Cast Remote Display to Quake. The game engine was not modified in any way, and the whole process took less than a day, with most of our time spent removing UI code not needed for the demo. We started with an existing source port of Quake III to Android, which includes some usage of kwaak3 and ioquake3 source code.

Next, we registered a Remote Display App ID using the Google Cast SDK Developer Console. There’s no need to write a Cast receiver app as the Remote Display APIs are supported natively by all Google Cast receivers.

To render the local display, the existing main Activity was converted to an ActionBarActivity. To discover devices and to allow a user to select a Cast device to connect to, we added support for the Cast button using MediaRouteActionProvider. The MediaRouteActionProvider adds a Cast button to the action bar. We then set the MediaRouteSelector for the MediaRouter using the App ID we obtained and added a callback listener using MediaRouter.addCallback. We modified the existing code to display an image bitmap on the local display.

To render the remote display, we extended CastPresentation and called setContentView with the game’s existing GLSurfaceView instance. Think of the CastPresentation as the Activity for the remote display. The game audio engine was also started at that point.

Next we created a service extending CastRemoteDisplayLocalService which would then create an instance of our CastPresentation class. The service will manage the remote display even when the local app goes into the background. The service automatically provides a convenient notification to allow the user to dismiss the remote display.

Then we start our service when the MediaRouter onRouteSelected event is called by using CastRemoteDisplayLocalService.startService and stop the service when the MediaRouter onRouteUnselected event is called by using CastRemoteDisplayLocalService.stopService.
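
Sketched out, the route callbacks look something like this (a hedged sketch: the Activity name, App ID constant, notification settings and service callbacks are placeholders for the full startService parameters):

import android.support.v7.media.MediaRouter;
import com.google.android.gms.cast.CastDevice;
import com.google.android.gms.cast.CastRemoteDisplayLocalService;

private final MediaRouter.Callback mediaRouterCallback = new MediaRouter.Callback() {
  @Override
  public void onRouteSelected(MediaRouter router, MediaRouter.RouteInfo route) {
    CastDevice castDevice = CastDevice.getFromBundle(route.getExtras());
    CastRemoteDisplayLocalService.startService(
        QuakeActivity.this,          // hypothetical Activity name
        PresentationService.class,   // our CastRemoteDisplayLocalService subclass
        REMOTE_DISPLAY_APP_ID,       // App ID from the Developer Console
        castDevice,
        notificationSettings,        // NotificationSettings with a PendingIntent back into the game
        serviceCallbacks);           // Callbacks fired once the remote display session starts
  }

  @Override
  public void onRouteUnselected(MediaRouter router, MediaRouter.RouteInfo route) {
    CastRemoteDisplayLocalService.stopService();
  }
};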

To see a more detailed description on how to use the Remote Display APIs, read our developer documentation. We have also published a sample app on GitHub that is UX compliant.

You can download the code that we used for the demo. To run the app you have to compile it using Gradle or Android Studio. You will also need to copy the "baseq3" folder from your Quake III game to the “qiii4a” folder in the root of the SD card of your Android mobile device. Your mobile device needs to have at least Android KitKat and Google Play services version 7.5.71.

With 17 million Chromecast devices sold and 1.5 billion touches of the Cast button, the opportunity for developers is huge, and it’s simple to add this extra functionality to an existing game. We're eager to see what amazing experiences you create using the Cast Remote Display APIs.

QUAKE II © 1997 and QUAKE III © 1999 id Software LLC, a ZeniMax Media company. QUAKE is a trademark or registered trademark of id Software LLC in the U.S. and/or other countries. QUAKE game assets used under license from id Software LLC. All Rights Reserved

QIII4A © 2012 n0n3m4. GNU General Public License.

Q3E © 2012 n0n3m4. GNU General Public License.

Originally posted to the Google Apps Developer blog

The Google Calendar API allows you to create and modify events on Google Calendar. Starting today, you can use the API to also attach Google Drive files to Calendar events to make them—and your app—even more useful and integrated. With the API, you can easily attach meeting notes or add PDFs of booking confirmations to events.

Here's how you set it up:

1) Get the file information from Google Drive (e.g. via the Google Drive API):

GET https://www.googleapis.com/drive/v2/files

{
 ...
 "items": [
  {
   "kind": "drive#file",
   "id": "9oNKwQI7dkW-xHJ3eRvTO6Cp92obxs1kJsZLFRGFMz9Q,
   ...
   "alternateLink": "https://docs.google.com/presentation/d/9oNKwQI7dkW-xHJ3eRvTO6Cp92obxs1kJsZLFRGFMz9Q/edit?usp=drivesdk",
   "title": "Workout plan",
   "mimeType": "application/vnd.google-apps.presentation",
   ...
  },
  ...
 ]
}

2) Pass this information into an event modification operation using the Calendar API:

POST https://www.googleapis.com/calendar/v3/calendars/primary/events?supportsAttachments=true

{
  "summary": "Workout",
  "start": { ... },
  "end": { ... },
  ...
  "attachments": [
   {
      "fileUrl": "https://docs.google.com/presentation/d/9oNKwQI7dkW-xHJ3eRvTO6Cp92obxs1kJsZLFRGFMz9Q/edit?usp=drivesdk",
      "title": "Workout plan",
      "mimeType": "application/vnd.google-apps.presentation"
   },
   ...
  ]
}

Voilà!
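
If you use the Google APIs Client Library for Java, the same call looks roughly like this (a sketch: service is an authorized Calendar client, and driveFile is the Drive v2 file fetched in step 1):

import com.google.api.client.util.DateTime;
import com.google.api.services.calendar.model.Event;
import com.google.api.services.calendar.model.EventAttachment;
import com.google.api.services.calendar.model.EventDateTime;
import java.util.Collections;

Event event = new Event()
    .setSummary("Workout")
    .setStart(new EventDateTime().setDateTime(new DateTime("2015-07-31T10:00:00-07:00")))
    .setEnd(new EventDateTime().setDateTime(new DateTime("2015-07-31T11:00:00-07:00")))
    // Reuse the alternateLink, title and mimeType returned by the Drive API.
    .setAttachments(Collections.singletonList(new EventAttachment()
        .setFileUrl(driveFile.getAlternateLink())
        .setTitle(driveFile.getTitle())
        .setMimeType(driveFile.getMimeType())));

service.events().insert("primary", event)
    .setSupportsAttachments(true)  // same flag as the supportsAttachments URL parameter
    .execute();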

You don’t need to do anything special to see existing attachments; they are now always exposed as part of an event:

GET https://www.googleapis.com/calendar/v3/calendars/primary/events/ja58khmqndmulcongdge9uekm7

{
 "kind": "calendar#event",
 "id": "ja58khmqndmulcongdge9uekm7",
 "summary": "Workout",
 ...
 "attachments": [
  {
   "fileUrl": "https://docs.google.com/presentation/d/9oNKwQI7dkW-xHJ3eRvTO6Cp92obxs1kJsZLFRGFMz9Q/edit?usp=drivesdk",
   "title": "Workout plan",
   "mimeType": "application/vnd.google-apps.presentation",
   "iconLink": "https://ssl.gstatic.com/docs/doclist/images/icon_11_presentation_list.png"
  },
  ...
 ]
}

Check out the guide and reference in the Google Calendar API documentation for additional details.

For any questions related to attachments or any other Calendar API features, you can reach out to us on StackOverflow.com using the tag #google-calendar.

Posted by Ranjith Jayaram, Product Manager

If you’re looking to drive usage and grow a mobile app, you’re probably testing out referrals, recommendations, and the user onboarding experience. These product flows are resource-intensive to design, build, and optimize. What if you could use a set of tools that help your users share your app, and get more of the right people to download and use your app? What if you could craft a more personalized onboarding experience in your new user’s journey?

Now in beta, App Invites lets mobile app developers increase their reach, deep link new users to custom experiences, and tap into their users’ device and Google-wide contacts as a source of referrals. It's available for both iOS and Android app developers. We’re launching with the UrbanSitter, Yummly, The CW, Coinbase and PicsArt apps.

Here’s what some of our early partners had to say:

  • For Andrea Barrett, co-founder and VP of Product at UrbanSitter, “App Invites gives our members the ability to easily share favorite sitters with their friends and Google contacts. As a service targeting busy parents, our user growth thrives on social recommendations and word-of-mouth referrals, so Google’s app invites are a natural fit for us.”
  • Sharing is an important part of TV network The CW’s app growth strategy. “Tools that help fans of our shows recommend The CW app to their contacts and friends are important. App Invites lets specific users share their favorite shows with selected friends; it's the next evolution of ‘word of mouth’. We’re integrating App Invites into our episode sharing capabilities, so that fans can speak to each other about The CW app.” - Zach Mannon, Director of Digital Media at Warner Brothers Television
  • For PicsArt, their fast growth to 250M installs has been driven by word-of-mouth. “Google’s new App Invites will accelerate our organic growth even further, giving people the opportunity to proactively invite their friends to join our mission of beautifying the world!" - Arusiak Kanetsyan, Director of Content and Communication
  • Yummly integrated app invites to expand their user base and generate awareness of their app, by allowing people to suggest the app to those who love to cook or are interested in food. “We see this expanding beyond just inviting new users to join our app. In the future, we hope to use this to share different meal ideas and have the opportunity to share your shopping list with family members or even invite friends over for dinner. With the power of Google and personalized app invites, making a dinner everyone agrees on has never been easier.” -- Brian Witlin, Chief Operating Officer

App Invites is available on both Android and iOS. Here’s what’s in the beta toolkit (a minimal sending sketch follows the list):

  1. Use app invites for expanded reach: Tap into SMS and email invites via your user’s phone and Google contacts.
  2. Make it easy for your users to send invites. We’ll recommend their closest contacts to share your app with, and suggest a preferred method of delivery.
  3. Send actionable invite cards: Include an install button right in an email invite.
  4. Faster Android install flows: Your new Android users can click App Invite and download your app straight from the Play Store, bypassing the browser windows that usually open in between.
  5. Create personalized onboarding flows: New users can get deep linked into a specific onboarding experience - for example, you can offer custom discount codes or content.
  6. Measure app invites using Google Analytics custom reports.
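
For the Android side, sending an invite boils down to launching an intent (a minimal sketch; the title, message, deep link and request code are illustrative):

import android.content.Intent;
import android.net.Uri;
import com.google.android.gms.appinvite.AppInviteInvitation;

private static final int REQUEST_INVITE = 17;

private void sendInvitation() {
  Intent intent = new AppInviteInvitation.IntentBuilder("Try this app")
      .setMessage("Thought you'd like this - come join me!")
      // New users who install via this invite can land on a custom onboarding screen.
      .setDeepLink(Uri.parse("http://example.com/welcome?code=SAVE20"))
      .build();
  startActivityForResult(intent, REQUEST_INVITE);
}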

With App Invites, our goal is to take the hard work out of building user referral and onboarding flows and build a toolkit that works across platforms, so that you can focus on your core app experience. Visit https://developers.google.com/app-invites/ to get started. To learn about opportunities to re-engage app users using goo.gl deep links, check out this post.