Tech Blog: Engineering Best Practices

At Phunware, we’ve been building mobile apps since 2009. Along the way, we’ve compiled a large list of Engineering Best Practices. In this blog post, we share some of the most important ones we follow.

When defining a new feature for a mobile app, it’s important to follow best practices to ensure the feature is complete, stable, and easy to maintain.

Let’s use a new Leaderboard screen as an example. A less experienced manager may write user stories for the Engineering team asking them to satisfy acceptance criteria that ensure the proper UX is followed, the UI matches the designs, and the points are populated by the appropriate data source. But there is so much more to consider.

Feature Flags

It’s imperative that a mobile app has an external config file. This file typically contains various configuration settings, URLs, strings, etc. that an app needs before it launches. Phunware’s Content Management Engine is a great place for developers to create a JSON-based app config file.

Feature flags are an important component of any config file. Feature flags are simply set to true or false and determine whether a feature should be enabled or not. Using our Leaderboard screen example, we may not want to launch the Leaderboard feature until the first of the month. We can go live with the app release, but keep the flag set to false until we’re ready for users to experience it in production. 

This is also helpful if an issue occurs and the data populating the Leaderboard is corrupt. Rather than delivering a poor user experience, we can temporarily disable the Leaderboard until the issue is resolved.
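
As a minimal sketch, checking such a flag in Swift might look like this (the config shape and key names are illustrative, not Phunware’s actual CME schema):

import Foundation

// Hypothetical JSON app config with a feature flags section.
struct AppConfig: Decodable {
    struct FeatureFlags: Decodable {
        let leaderboardEnabled: Bool
    }
    let featureFlags: FeatureFlags
}

func showLeaderboardIfEnabled(configData: Data) throws {
    let config = try JSONDecoder().decode(AppConfig.self, from: configData)
    if config.featureFlags.leaderboardEnabled {
        // Present the Leaderboard entry point.
    } else {
        // The feature ships dark until the flag flips to true.
    }
}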

Externalized Strings & Images

The app config file is also a great place to externalize text strings and image URLs.

Let’s say there’s a typo in our Leaderboard screen or a Product Manager simply wants to change the copy. It’s much quicker and easier to update the text in the config file than to make the changes in code, submit to the stores, wait for approval, and then try to get all our users on the latest app version.

At Phunware, we actually take this a step further and externalize strings for each language. For example, we may have a strings_en key for English strings and a strings_es key for Spanish strings. We serve the appropriate text depending on the user’s language settings.

Externalizing image URLs is also helpful when we want to change images on-the-fly. We’re always uploading new images to Phunware’s Asset Manager and updating URLs.
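
As a minimal sketch (assuming a config shape with the strings_en and strings_es keys described above), selecting the right string table might look like:

import Foundation

// Hypothetical per-language string tables from the app config.
struct StringsConfig: Decodable {
    let strings_en: [String: String]
    let strings_es: [String: String]

    // Pick the table matching the user's language, falling back to English.
    func localizedStrings(for locale: Locale = .current) -> [String: String] {
        locale.languageCode == "es" ? strings_es : strings_en
    }
}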

Analytics

After launching a great feature, we’re going to want to know how it performs. Are users visiting the Leaderboard screen? Are they interacting with the filters?

Analytics is often an afterthought. If we train ourselves to write a corresponding analytics ticket whenever we write a feature ticket, we’ll find that our coverage is comprehensive.

Phunware Analytics is a great tool for capturing app launches, unique users, retention cohorts, screen views, and custom event analytics.

Error Handling

So we wrote user story tickets for the Leaderboard screen and the developers have finished implementing it. But what happens when the API goes down and returns a 500 error? Will the app provide an informative error message, a generic one, or simply get into a bad state?

By writing an error handling ticket we can define the behavior we would like when something goes wrong. At Phunware, we like to use a mix of specific and generic error messages.

For the Leaderboard example, it may be more appropriate to display a message such as “Unable to load Leaderboard data. Please try again later” rather than “An unexpected error has occurred”. However, the latter might be best as a catch-all for any situation where a specific error message wasn’t implemented.
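
A minimal sketch of this mixed strategy (the error type and copy here are illustrative):

// Map known failures to specific copy, and everything else to a catch-all.
enum LeaderboardError: Error {
    case loadFailed
}

func userMessage(for error: Error) -> String {
    switch error {
    case LeaderboardError.loadFailed:
        return "Unable to load Leaderboard data. Please try again later."
    default:
        return "An unexpected error has occurred."
    }
}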

Deep Links

Chances are, if the new Leaderboard is doing well, someone is going to ask if they can send a push notification promoting the Leaderboard which, when tapped, sends users to the Leaderboard screen.

If we considered deep links when writing the initial user stories then we’re probably covered. These days there are many places that may link into an app. Deep links can come from push notifications, app share URLs, emails, or even websites redirecting to the mobile app.

Considering deep links when implementing a new screen saves the time and overhead of having to do the work in a follow up release.
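
A minimal sketch of routing such a link (the myapp scheme and leaderboard host are hypothetical):

import Foundation

// Returns true if the URL was recognized and routed.
func handleDeepLink(_ url: URL) -> Bool {
    guard url.scheme == "myapp" else { return false }
    switch url.host {
    case "leaderboard":
        // Navigate to the Leaderboard screen.
        return true
    default:
        return false
    }
}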

Offline Caching

There are times when our users have a poor network connection. Perhaps they are on a train or their WiFi is down. Ideally, we anticipate that this may occur and design a good user experience for when it does.

At Phunware, we make sure to cache as much content as possible. If the user launches the app without an internet connection we’ll still display the images and content that were available the last time they launched the app.

While it’s possible some of the content is outdated, this is a better experience than showing a blank screen.

Displaying a banner at the top indicating that the user doesn’t appear to have a connection also helps inform them that they are seeing something slightly different than they would with a good connection.
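
A minimal sketch of that last-known-content cache (the file name is illustrative):

import Foundation

// Persist the last good response and fall back to it when offline.
struct LeaderboardCache {
    private let fileURL = FileManager.default
        .urls(for: .cachesDirectory, in: .userDomainMask)[0]
        .appendingPathComponent("leaderboard.json")

    func save(_ data: Data) {
        try? data.write(to: fileURL, options: .atomic)
    }

    func loadLastKnown() -> Data? {
        try? Data(contentsOf: fileURL)
    }
}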

Unit Tests

We try to cover as much of our code as possible with unit tests and write them when developing new features.

Unit tests allow developers to be confident the feature works as expected. We set up our build jobs to run unit tests whenever new builds are generated, so if a developer introduces a regression a few months down the road, we catch it right away. This frees up our QA team to focus on edge-case issues rather than discovering breaking changes.
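
As a minimal sketch, a test like this (assuming a hypothetical highest-points-first sorting rule for the Leaderboard) runs on every build and flags a regression immediately:

import XCTest

final class LeaderboardTests: XCTestCase {
    func testEntriesAreSortedByPointsDescending() {
        let entries = [("ann", 10), ("bo", 30), ("cy", 20)]
        let sorted = entries.sorted { $0.1 > $1.1 }
        XCTAssertEqual(sorted.map { $0.0 }, ["bo", "cy", "ann"])
    }
}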

Documentation

So we wrote the user stories the Engineering team needed to implement the Leaderboard screen. Everything has been externalized, deep links have been tested, analytics are in place, and unit tests are all passing. Now it’s time to update the documentation.

Keeping documentation up to date is very important as codebases and feature sets are always changing. Ensuring we have proper documentation allows team members to quickly look up that deep link URL scheme or the analytics event that fires when a user toggles something in the Leaderboard.

In addition to documentation, this is also a great time to update submission checklists and QA test plans, since we’ll want to make sure the new Leaderboard is tested with each new release.

Store Guidelines

Our final best practice to follow is keeping up to date with Google’s and Apple’s app store review guidelines. We check these weekly because we never know when new guidelines will be announced.

It’s critical to know these before the feature is completed and the app is submitted. There’s nothing worse than getting rejected because we violated a guideline. At that point any deadline we had to launch the app went out the window.

For example, there’s a guideline requiring that any app that allows users to create accounts must also provide a mechanism for users to delete their account. If we knew this when writing the user story for Sign Up and Log In, then we’re covered. If we found out the hard way, then we’ve lost precious time because it may be another sprint or two before the Engineering team can deliver that new flow.

Luckily, we followed the other best practices, so we can simply disable the feature with its flag for now!

Dev Blog: Barcode Scanning on iOS

In this tutorial you will learn how to build a barcode scanner that can scan machine-readable codes (QR, Codabar, etc.). You will also learn about the various approaches and which one might be best for your use case.

There are many ways to build a code scanner on iOS. Apple’s Vision Framework introduced additional options. We will first go over the classic tried and true method for creating a code scanner, then we will go over the new options. We will carefully consider the pros and cons for each approach.

1. The Classic Approach

Throughout the years, most scanners on iOS have likely taken the following approach.

First, AVFoundation is used to set up a video capture session, highlighted in gray in the diagram above. Then an AVCaptureMetadataOutput object is hooked up to the video session’s output. AVCaptureMetadataOutput is then set up to emit barcode information, which is extracted from an AVMetadataObject (highlighted in blue).
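
A condensed sketch of that wiring (preview layer, permission checks, and error handling are omitted for brevity):

import AVFoundation

final class ClassicScanner: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    let session = AVCaptureSession()

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureMetadataOutput()
        session.addOutput(output)
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.qr, .ean13]  // formats to detect

        session.startRunning()
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        // The payload is extracted from the machine-readable code object.
        if let code = metadataObjects.first as? AVMetadataMachineReadableCodeObject {
            print(code.stringValue ?? "")
        }
    }
}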

Pros:

  • When it comes to scannable code formats, there aren’t any formats that are exclusive to the newer approaches. See Apple’s documentation for a full list of supported code formats.
  • Minimum deployment target is iOS 6. This approach will likely accommodate any OS requirements that you may have.
  • This approach is tried and true. This means there are plenty of code examples and Stack Overflow questions.

Cons:

  • The maximum number of codes that can be scanned at a time is limited. For 1D codes we are limited to one detection at a time. For 2D codes we are limited to four detections at a time. See Apple’s documentation for more details.
  • Unable to scan a mixture of code types. For example, a barcode and a QR code can’t be scanned in one go; instead, we must scan them individually.
  • The lack of machine learning may cause issues when dealing with problems like a lack of focus or glare on images.
  • Developers have reported issues when supporting a variety of code types on iOS 16. A solution could be to use one of the newer approaches for your users on iOS 16 and above.

2. AVFoundation and Vision

For the following approach the basic idea is to feed an image to the Vision Framework. The image is generated using an AVFoundation capture session, similar to the first approach. See Apple’s documentation for an example implementation.

Notice the three Vision Framework classes in the diagram above (in blue). The entry point to the Vision Framework is the VNImageRequestHandler class. We initialize an instance of VNImageRequestHandler using an instance of CMSampleBufferRef.

Note: VNImageRequestHandler ultimately requires an image for Vision to process. When initialized with CMSampleBufferRef, the image contained within the CMSampleBufferRef is utilized. In fact, there are other initialization options like CGImage, Data, and even URL. See Apple’s documentation for the full list of initializers.

VNImageRequestHandler performs a Vision request using an instance of VNDetectBarcodesRequest. VNDetectBarcodesRequest is a class that represents our barcode request and returns an array of VNBarcodeObservation objects through a closure.

We get important information from VNBarcodeObservation, for example:

  • The barcode payload which is ultimately the data we are looking for.
  • The symbology which helps us differentiate observations/results when scanning for various types of codes (barcode, QR, etc) simultaneously.
  • The confidence score which helps us determine the accuracy of the observation/result.

In summary, it took three steps to set up Vision:

  1. Initialize an instance of VNImageRequestHandler.
  2. Use VNImageRequestHandler to perform a Vision request using an instance of VNDetectBarcodesRequest.
  3. Set up VNDetectBarcodesRequest to return our results, an array of VNBarcodeObservation objects.
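
Put together, a minimal sketch of those steps (assuming sample buffers arrive from an AVFoundation capture output, and using the iOS 14+ CMSampleBuffer initializer):

import CoreMedia
import Vision

func detectBarcodes(in sampleBuffer: CMSampleBuffer) {
    // Steps 2 & 3: the request returns VNBarcodeObservation results in its closure.
    let request = VNDetectBarcodesRequest { request, _ in
        guard let observations = request.results as? [VNBarcodeObservation] else { return }
        for observation in observations {
            // Payload, symbology, and confidence, as described above.
            print(observation.payloadStringValue ?? "",
                  observation.symbology,
                  observation.confidence)
        }
    }
    // Step 1: initialize the handler with the sample buffer.
    let handler = VNImageRequestHandler(cmSampleBuffer: sampleBuffer)
    try? handler.perform([request])
}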

Pros:

  • Computer Vision and Machine Learning algorithms – The Vision Framework is constantly improving. In fact, at the time of writing Apple is on its third revision of the barcode detection algorithm.
  • Customization – Since we are manually hooking things up we are able to customize the UI and the Vision Framework components.
  • Ability to scan a mixture of code formats at once. This means we can scan multiple codes with different symbologies all at once.

Cons:

  • Minimum deployment target of iOS 11; keep in mind that using the latest Vision Framework features will raise the minimum deployment target.
  • Working with new technology can have its downsides. It may be hard to find tutorials, Stack Overflow questions, and best practices.

3. DataScannerViewController

If the second approach seemed a bit too complicated, no need to worry. Apple introduced DataScannerViewController, which abstracts the core of the work we did in the second approach. It isn’t limited to scannable codes; it can also scan text. This is similar to what Apple did with UIImagePickerController, in the sense that it’s a drop-in view controller that abstracts various common processes into a single UIViewController class. Apple provides a short article that introduces the new DataScannerViewController class and walks through the required setup and configuration.
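
A minimal sketch of presenting it (assumes iOS 16+ and a supported device; the DataScannerViewController.isSupported check is omitted):

import UIKit
import VisionKit

func presentScanner(from viewController: UIViewController) {
    let scanner = DataScannerViewController(
        recognizedDataTypes: [.barcode(symbologies: [.qr, .ean13])],
        isHighlightingEnabled: true
    )
    viewController.present(scanner, animated: true) {
        try? scanner.startScanning()
    }
}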

Pros:

  • Easy to use and setup.
  • Low maintenance, Apple is in charge of maintaining the code.
  • Can also scan text, not exclusive to machine readable codes.

Cons:

  • Minimum deployment target of iOS 16.
  • Only available on devices with the A12 Bionic chip and later.
  • Limited control over the UI; even though the built-in UI looks great, sometimes we may require something more complex.

Conclusion

We went over the various ways to scan machine-readable codes on iOS and explored the pros and cons of each approach. Now you should be ready to use this knowledge to build or improve a barcode scanner.

Who knows, you may even choose to take a hybrid approach in order to take advantage of the latest and greatest that Apple has to offer while gracefully downgrading for users on older iOS devices.

Dev Blog: Swift Regex

Introduction

A regular expression (regex) is a sequence of characters that defines a search pattern which can be used for string processing tasks such as find/replace and input validation. Working with regular expressions in the past using NSRegularExpression has always been challenging and error-prone. Swift 5.7 introduces a new set of APIs allowing developers to write regular expressions that are more robust and easy to understand.

Regex Literals

Regex literals are useful when the regex pattern is static. The Swift compiler can check for any regex pattern syntax errors at compile time. To create a regular expression using a regex literal, simply wrap your regex pattern in the slash delimiters /…/

let regex = /My flight is departing from (.+?) \((\w{3}?)\)/

Notice the above regex literal also has captures defined in the regex pattern using the parentheses (…). A capture allows information to be extracted from a match for further processing. After the regex is created, we then call wholeMatch(of:) on the input string to see if there’s a match against the regex. A match from each capture will be appended to the regex output (as tuples) and can be accessed by element index. .0 would return the whole matched string, and .1 and .2 would return matches from the first and second captures, respectively.

let input = "My flight is departing from Los Angeles International Airport (LAX)"

if let match = input.wholeMatch(of: regex) {
    print("Match: \(match.0)")
    print("Airport Name: \(match.1)")
    print("Airport Code: \(match.2)")
}
// Match: My flight is departing from Los Angeles International Airport (LAX)
// Airport Name: Los Angeles International Airport
// Airport Code: LAX

You can also assign a name to each capture using the (?<capture_name>…) syntax, that way you can easily reference the intended match result, as in the example below:

let regex = /My flight is departing from (?<name>.+?) \((?<code>\w{3}?)\)/

if let match = input.wholeMatch(of: regex) {
    print("Airport Name: \(match.name)")
    print("Airport Code: \(match.code)")
}
// Airport Name: Los Angeles International Airport
// Airport Code: LAX

Regex

Along with regex literals, the Regex type can be used to create a regular expression when the regex pattern is dynamically constructed. Search fields in editors are a good example of where dynamic regex patterns may be needed. Keep in mind that the Regex type will throw an error at runtime if the regex pattern is invalid. You can create a Regex by passing the regex pattern as a String. Note that an extended string literal #”…”# can be used so that escaping backslashes within the regex is not required.
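
A minimal sketch, reusing the flight pattern from above:

// Building the flight example at runtime. Regex(_:) throws if the
// pattern is invalid, so this must run where `try` is allowed.
let pattern = #"My flight is departing from (.+?) \((\w{3})\)"#
let regex = try Regex(pattern)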

Regex Builder

Another great tool for creating regular expressions is called regex builder. Regex builder allows developers to use domain-specific language (DSL) to create and compose regular expressions that are well structured. As a result, regex patterns become very easy to read and maintain. If you are already familiar with SwiftUI code, using regex builder will be straightforward.

The following input data represents flight schedules consisting of 4 different fields: flight date, departure airport code, arrival airport code, and flight status.

let input =
""" 
9/6/2022   LAX   JFK   On Time
9/6/2022   YYZ   SNA   Delayed
9/7/2022   LAX   SFO   Scheduled
"""

let fieldSeparator = OneOrMore(.whitespace)


let regex = Regex { 
    Capture {
        One(.date(.numeric, locale: Locale(identifier: "en-US"), timeZone: .gmt)) 
    } 
    fieldSeparator
    Capture { 
        OneOrMore(.word) 
    } 
    fieldSeparator
    Capture { 
        OneOrMore(.word)
    }
    fieldSeparator
    Capture { 
        ChoiceOf {
            "On Time"
            "Delayed"
            "Scheduled"
        }
    }
}

Quantifiers like One and OneOrMore are regex builder components allowing us to specify the number of occurrences needed for a match. Other quantifiers are also available such as Optionally, ZeroOrMore, and Repeat.

To parse the flight date, we could have specified the regex pattern using a regex literal like /\d{1,2}\/\d{1,2}\/\d{4}/ and parsed the date string manually. In fact, there’s a better way. Luckily, regex builder supports many existing parsers provided by the Foundation framework, such as DateFormatter, NumberFormatter and more, for developers to reuse. Therefore, we can simply use a date parser for the flight date.

Each field in the input data is separated by 3 whitespace characters. Here we can declare a reusable pattern and assign it to a fieldSeparator variable. Then, the variable can be inserted to the regex builder whenever a field separator is needed.

Parsing the departure/arrival airport code is straightforward. We can use the OneOrMore quantifier and word as the type of character class since these airport codes consist of 3 letters.

Finally, ChoiceOf lets us define a fixed set of possible values for parsing the flight status field.

Once we have a complete regex pattern constructed using regex builder, calling matches(of:) on the input string would return enumerated match results:

for match in input.matches(of: regex) {
    print("Flight Date: \(match.1)")
    print("Origin: \(match.2)")
    print("Destination: \(match.3)")
    print("Status: \(match.4)")
    print("========================================")
}
// Flight Date: 2022-09-06 00:00:00 +0000
// Origin: LAX
// Destination: JFK
// Status: On Time 
// ======================================== 
// Flight Date: 2022-09-06 00:00:00 +0000 
// Origin: YYZ 
// Destination: SNA 
// Status: Delayed 
// ======================================== 
// Flight Date: 2022-09-07 00:00:00 +0000 
// Origin: LAX 
// Destination: SFO 
// Status: Scheduled 
// ========================================

Captures can also take an optional transform closure, which allows captured data to be transformed into a custom data structure. We can use the transform closure to convert the captured value (a Substring) from the flight status field into a custom FlightStatus enum, making it easier to perform operations like filtering on the transformed type.

enum FlightStatus: String {
    case onTime = "On Time"
    case delayed = "Delayed"
    case scheduled = "Scheduled"
}

let regex = Regex { 
    ...
    Capture { 
        ChoiceOf {
            "On Time"
            "Delayed"
            "Scheduled"
        }
    } transform: {
        FlightStatus(rawValue: String($0))
    }
}
// Status: FlightStatus.onTime

Final Thoughts

Developers who want to use these new Swift Regex APIs may wonder which API they should adopt when converting existing code that uses NSRegularExpression or when writing new code that requires regular expressions. The answer is, it really depends on your requirements. Each of the Swift Regex APIs has its own unique advantage. Regex literals are good for simple, static regex patterns that can be validated at compile time. The Regex type is better suited for regex patterns that are constructed dynamically at runtime. When working with a large input data set requiring more complex regex patterns, regex builder lets developers build regular expressions that are well structured, easy to understand and maintain.

Dev Blog: What Developers Should Know About the Notification Permission in Android 13


Android 13 introduces a new runtime permission, android.permission.POST_NOTIFICATIONS, which apps will need to obtain to display some types of notifications. How does this change the ability of apps to post notifications? I’ll attempt to answer that and more in this post. My own research found answers that surprised me.

Why does an app need POST_NOTIFICATIONS permission?

Figure 1: Android 13 system dialog for notifications permission.

The POST_NOTIFICATIONS permission only exists on Android 13 (the permission value “android.permission.POST_NOTIFICATIONS” is only available in code when an app compiles to API 33). When the app is running on devices with Android 12 and lower, POST_NOTIFICATIONS permission is not needed (and, in fact, should not be used; more on this later). On Android 13, some notifications can still be displayed without this permission, such as notifications for foreground services or media sessions as described in the documentation. On Android 13, you can think of this permission as equivalent to the app system setting for enabling notifications, except that you can now ask the user to enable notifications with a permission prompt instead of sending them to the system settings screen.

How can my app check if it has POST_NOTIFICATIONS permission?

Since POST_NOTIFICATIONS is a runtime permission, you would think the obvious way to check for it is to call checkSelfPermission with the POST_NOTIFICATIONS permission. But this does not work as expected on pre-Android 13 devices: there, checkSelfPermission(POST_NOTIFICATIONS) will always return that the permission is denied, even when notifications have been enabled in the app system settings. So, don’t call checkSelfPermission(POST_NOTIFICATIONS) if the app is not running on Android 13. Calling areNotificationsEnabled() is still the way to check that the user has enabled notifications for your app. To put it another way, only on Android 13 will checkSelfPermission(POST_NOTIFICATIONS) and areNotificationsEnabled() give you the same answer as to whether the app has notifications enabled.

How can my app get POST_NOTIFICATIONS permission?

First, even apps that do not ask for POST_NOTIFICATIONS permission (such as apps that have not yet been updated to API 33 to know about this permission) may still obtain it. If an app is already installed, and has notifications enabled, and the device updates to Android 13, the app will be granted the permission to continue to send notifications to users. Similarly, if a user gets a new device with Android 13 and restores apps using the backup and restore feature, those apps will be granted POST_NOTIFICATIONS permission, if notifications were enabled.

For newly installed apps, if an app targets API 32 or lower, the system shows the permission dialog (see Figure 1) the first time your app starts an activity and creates its first notification channel. This is why you will see the permission dialog for apps that have not yet been updated for Android 13.

But as a developer, I wanted to add an explicit request for the POST_NOTIFICATIONS permission to apps. Here’s the code I used:

private val requestPermissionLauncher =
    registerForActivityResult(
        ActivityResultContracts.RequestPermission()
    ) { isGranted: Boolean ->
        onNotificationPermission(isGranted)
    }
…
requestPermissionLauncher.launch(POST_NOTIFICATIONS)

Like checkSelfPermission(), this did not work the way I expected. On pre-Android 13 devices, requesting the POST_NOTIFICATIONS permission will always return PERMISSION_DENIED without displaying the system dialog. Also, if the app targets API 32 or lower, requesting the POST_NOTIFICATIONS permission will always return PERMISSION_DENIED without displaying the system dialog, even on devices with Android 13. So, to request the POST_NOTIFICATIONS permission at runtime:

  • Only request it on Android 13 or later
  • Your app must target API 33 or later

Do I need to update my app?

Yes, you should update your app if you don’t want the app to lose the ability to display notifications. Because of the situations described above where an app can get the POST_NOTIFICATIONS permission even when no code asks for it, you may be tempted to procrastinate just a little longer before handling this new permission. But remember the auto-reset permissions for unused apps feature introduced with Android 11 and later rolled out to earlier versions. This feature applies to runtime permissions so it applies to the new POST_NOTIFICATIONS permission. Expect that an app will lose this permission as well if it is not used for some time, so it will need to request it to get it back.

Navigating Permission Changes in iOS 14


When it launches this fall, iOS 14 will bring several new permission changes for requesting access to the user’s location, advertising ID, photos, and local network. This blog breaks down the upcoming changes and provides recommendations on how best to handle these new privacy-related permission prompts.

Location Permission Changes

Apple is continuing with its commitment to give app users more control over their data and privacy. Last year, with the release of iOS 13, Apple gave users the option to decide if the app should have access to their location only once, only while using the app, or always. 

This year, with the release of iOS 14, Apple will build upon that and also allow users to decide whether the app should have access to their precise location or just their approximate location.

New: Precise Location Toggle

When an app requests the user’s location, the user will be presented with a permission prompt asking for location access with the same options as iOS 13: Allow Once, Allow While Using App, or Don’t Allow. 

Like with previous versions of iOS, the title of the permission prompt is controlled by Apple, but the app developer configures the subtext. The subtext is intended to provide the user with an explanation of why the app is requesting this permission. 

What’s new in iOS 14 is the user’s ability to toggle precise location on and off. The precise setting is enabled by default, which means the app will get the user’s fine location. If the user disables this, then the app will only get the user’s approximate location. In our tests, the approximate location may return location coordinates for a user up to 2 miles away. 

New: Temporary Request for Precise Location

Another change is the app’s ability to temporarily request precise location if the user previously only allowed approximate accuracy. This is a one-time permission request that, if granted, only lasts during the duration of the app session. According to Apple, “This approach to expiration allows apps to provide experiences that require full accuracy, such as fitness and navigation apps, even if the user doesn’t grant persistent access for full accuracy.”
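
A minimal sketch of triggering that one-time request with Core Location (the “Wayfinding” purpose key is hypothetical and must match an entry in the app’s NSLocationTemporaryUsageDescriptionDictionary):

import CoreLocation

let manager = CLLocationManager()
if manager.accuracyAuthorization == .reducedAccuracy {
    // Prompts the user once; full accuracy lasts for the app session.
    manager.requestTemporaryFullAccuracyAuthorization(withPurposeKey: "Wayfinding")
}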

Background Location Permission

App developers may need the user’s location in the background to support features such as Geofence notifications. Same as in iOS 13, Apple doesn’t allow this option on the first request but instead allows the app developer to request this permission at a later time. If your app requested Always Allow permission, then this prompt will be displayed automatically the next time the user launches the app, but typically not the same day the initial prompt was displayed.

Once an app has received the user’s location in the background a significant number of times, Apple will inform the user and ask them if they want to continue allowing this. This is also unchanged from iOS 13. 

New: Updated Location Settings

Users can adjust their location settings in the iOS Settings app by navigating to Privacy → Location Services → App Name.

Users will have the option to adjust their location access to Never, Ask Next Time, While Using the App, or Always.

If a user receives a location permission prompt and selects Allow Once, their location setting will be Ask Next Time, prompting them to make a selection again the next time the app requests their location.

What’s new in iOS 14 is the Precise Location toggle, which allows users to switch between precise and approximate location.

Impact

The most significant impact of these changes will be on apps that require a precise location, such as navigation apps or apps that use geofence notifications. Given that an approximate location could put the user miles away, the precise location option is required for these apps. 

As mentioned earlier, the app has the option to temporarily request precise location from a user who has previously only granted approximate location. This request can be triggered when the user begins a task that requires fine location, such as wayfinding. 

However, there isn’t an explicit user action to trigger this temporary permission request when it comes to geofence notifications, so the temporary precise location prompt won’t help us here.  

In addition, geofence notifications require the Always Allow background location selection, so apps that promote this feature will feel the impact most.

Recommendations

  • Don’t request the user’s location until you need it.
  • Include a usage description clearly explaining why you need the user’s location.
  • Don’t request Always Allow permission unless you have a feature that requires the user’s location when the app is closed or backgrounded.
  • If you require precise location, but the user has only granted approximate location, use a one-time temporary precise location request.
  • If you require Always Allow + Precise location settings for Geofences, but the user hasn’t granted this, then to increase user acceptance, include a custom alert or screen informing the user of the benefits of allowing this and providing instructions on how they can change it in iOS Settings, with a button that deep links them there.
  • Remember, if the user chooses Don’t Allow, you won’t be able to request this permission again.

IDFA Permission Changes

The IDFA, or Identifier for Advertisers, as we know it is about to change. Ad agencies have relied on this device identifier for years to track users across apps and websites to learn their habits and interests so that they can target them with relevant ads.

This was made more difficult with the release of iOS 10, when users could enable a Limit Ad Tracking setting, which would return all zeroes for this identifier. Before that, the only thing a user could do was reset their identifier value, but this was seldom used.

New: IDFA Prompt

iOS 14 brings the strongest changes to the IDFA yet, which may effectively kill it as the primary way advertisers track users. Rather than defaulting to having the IDFA available, developers will now have to prompt the user to allow access to the IDFA. 

The wording in the permission prompt will undoubtedly lead to a majority of users declining this permission: “App would like permission to track you across apps and websites owned by other companies.“

Like the Location Permission prompt, the IDFA prompt’s title is controlled by Apple, but the app developer configures the subtext. Developers will have to come up with a usage description convincing enough to persuade users to allow themselves to be tracked.

According to Apple, “The App Tracking Transparency framework is only available in the iOS 14 SDK. This means that if you haven’t built your app against iOS 14, the IDFA will not be available and the API will return all zeros.”

However, on September 3, 2020, Apple extended the deadline to 2021, by stating, “To give developers time to make necessary changes, apps will be required to obtain permission to track users starting early next year.“
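
A minimal sketch of requesting the prompt with the App Tracking Transparency framework (an NSUserTrackingUsageDescription entry in Info.plist supplies the prompt’s subtext):

import AppTrackingTransparency
import AdSupport

ATTrackingManager.requestTrackingAuthorization { status in
    if status == .authorized {
        // The IDFA is only meaningful once the user grants authorization.
        let idfa = ASIdentifierManager.shared().advertisingIdentifier
        print(idfa)
    }
}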

New: Updated IDFA Settings

Also new in iOS 14 is a toggle in iOS Settings that, when disabled, prevents app developers from ever prompting the user for permission to use their IDFA. A user can find this in the iOS Settings app under Privacy → Tracking; it applies globally to all apps.

Impact

The most significant impact will be on the ad industry. Without a guaranteed way of tracking users across apps and websites, advertisers will need to rely on less reliable ways of tracking users. Since getting the user’s IDFA was never guaranteed, advertisers already have fallback methods for tracking users. Such methods include fingerprinting, where a collection of other information about the user, such as IP address, device model, and rough location, is used to verify that they are the same user. Another option is to use sampling, since there will still be some users who allow themselves to be tracked. For example, if 5% of tracked users installed the app through a particular install ad, one can presume that about 5% of all users can be attributed to that campaign.

Recommendations

  • Don’t request the user’s IDFA if your use case can be satisfied with the IDFV (Identifier for Vendor) instead. The IDFV is similar to the IDFA in the sense that it’s a unique identifier for the user. However, each app developer will be assigned a different IDFV per user, so this doesn’t help in tracking users across apps and websites by other developers. Since there are no privacy concerns, there is no permission prompt needed to obtain the IDFV and the user has no way to disable this.
  • Include a usage description clearly explaining why you’d like to track the user across apps and websites
  • Consider a custom prompt in advance of the official IDFA permission prompt to provide your users with more context before the scary system prompt is presented.
  • If a user declines the IDFA permission and you need to track them outside your app, use probabilistic methods such as fingerprinting or sampling.
  • Remember that if the user chooses Ask App Not to Track or if they disable the ability to prompt for this permission in Settings, then you won’t be able to request this permission. The best you can do at that point is to detect that they declined this permission, show some custom prompt, and direct them to the Settings app to enable the permission there.

Photo Permission Changes

Apple has required apps to ask permission to access the user’s camera or photos since iOS 8. However, this was an all-or-nothing permission, giving the developer access to all photos. New in iOS 14 is the ability for users to choose whether they want to grant access to all photos or only specific photos.

New: Select Specific Photos

The initial photo permission prompt will ask the user if they would like to grant access to one or more specific photos, grant access to all photos, or decline this permission. A user who is simply trying to upload one specific photo may choose only to grant access to that photo. 

If a user only grants access to specific photos, the next time the app requests the photo permission, the user will receive a slightly different permission prompt. The new prompt will ask them if they would like to allow access to more photos or keep the current selection of photos they’ve previously allowed. 
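
A minimal sketch of the iOS 14 authorization call, which adds a .limited status for the selected-photos case:

import Photos

PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    switch status {
    case .authorized:
        break // full photo library access
    case .limited:
        break // only the user's selected photos are visible
    default:
        break // denied, restricted, or not determined
    }
}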

New: Updated Photo Settings

Users can adjust their photo settings in the iOS Settings app by navigating to Privacy → Photos → App Name. Users can choose from the following options: Selected Photos, All Photos, or None.

If Selected Photos is chosen, then an option to Edit Selected Photos appears. Tapping this presents a Photo Picker, which includes search functionality, the ability to view albums, and the ability to view previously selected photos. 

Note: The permission prompts and settings options only refer to photos. However, the same applies to videos.

Impact

This new privacy change should have minimal impact on apps that require the user to grant permission in order to upload specific photos or videos. The biggest impact will be on apps requiring permission to the entire camera roll, such as Google Photos. 

Recommendations

  • Don’t request photo access until the user is performing an action that requires this, such as uploading a photo or video.
  • Include a usage description clearly explaining why your app requires this permission.
  • Remember that these permission changes apply to videos as well.
  • Remember that if the user chooses Don’t Allow, you won’t be able to request this permission again. The best you can do at that point is to detect that they declined this permission, show some custom prompt, and direct them to the Settings app to enable the permission there.

Local Network Permission Changes

There are many legitimate reasons an app might need to use the local network. For example, it may connect to a printer, search for nearby players for a game, or control the lights in a home. 

At the same time, there are also less legitimate reasons that apps use the local network. They could be collecting information about the devices on the local network to create a “fingerprint,” which allows them to infer that a user is at home, even without the user granting location permission.

In iOS 13, Apple required apps to request permission for access to Bluetooth. Now in iOS 14, they are doing the same for the local network. If your app communicates with devices over the user’s home WiFi, for example, then it is operating over the local network and will trigger this new permission prompt.

There are exceptions for system-provided services such as AirPrint, AirPlay, AirDrop, or HomeKit. These system services handle device discovery without exposing the full list of devices to the app, so they are exempt from triggering this permission prompt.

Any other network connections outside the local network (e.g., Web Services, APIs, or other connections to the internet) are not impacted and do not require permission.

New: Local Network Prompt

When an app tries to connect to the local network, it will trigger the Local Network permission prompt, even if it is only viewing available devices.

Impact

Many applications use the local network for use cases other than the system services previously mentioned. We’ve found that most streaming apps trigger this permission prompt upon launch, likely because they support Google Cast. There may be apps that have Analytics SDKs that collect this type of information. Those apps will also display this prompt upon app launch. 

Recommendations

  • Add logic to defer this permission prompt until the user performs an action that requires it, such as searching for nearby players or casting a video.
  • Include a usage description clearly explaining why your app needs to use the local network.
  • Remember that if you change nothing before the iOS 14 release date and your app uses the local network, this permission prompt will be one of the first things users see when they launch your app on iOS 14.
  • Remember that if the user chooses Don’t Allow, you won’t be able to request this permission again. The best you can do at that point is to detect that they declined this permission, show some custom prompt, and direct them to the Settings app to enable the permission there.

Other Privacy Changes

New: Microphone/Camera Indicator

iOS 14 will display a colored dot in the status bar, indicating the current app is actively using the microphone or camera. Be careful not to use any low-level camera/microphone APIs unless the user is performing an action to capture audio or video. 

New: Pasteboard Alerts

iOS 14 will display a banner at the top of the screen, indicating the current app has just extracted the contents of the user’s Pasteboard (also known as clipboard). Some apps use the pasteboard to detect copied URLs to surface the right information when the user moves to their native app. 

Be careful with any Analytics SDKs you include in your app that may be collecting this user data.

More Info

For a quick and easy reference to the iOS 14 permission changes discussed in this blog, download our Location Cheat Sheet.


At Phunware, our Engineering team is dedicated to staying up-to-date with the latest changes from Apple WWDC and Google I/O. If you’re a Product Manager looking for a Location or Analytics SDK built by a team that understands these privacy-related changes, then visit our Products page for a complete list of our software products, solutions, and services.

SwiftUI: A Game Changer

Last month at WWDC 2019, Apple released a heap of information to continue building on their software platforms. This year’s event was jam-packed with new features such as user profiles on tvOS, a standalone App Store on watchOS and dark mode on iOS. Also announced were the stunning Mac Pro, a powerhouse of a machine that can tackle extreme processing tasks, and the Pro Display.

Apple has a recurring theme of releasing mind-blowing features, but nothing was more exciting than the announcement of SwiftUI. As Apple’s VP of Software Engineering, Craig Federighi, announced the new UI toolkit, it felt like a metaphorical bomb dropping in the middle of the room!

Shortly after a quick SwiftUI overview, the keynote was over. Developers were left excited, stunned and filled with hundreds of questions about the new UI framework. It felt like the only thing missing from the SwiftUI announcement was the iconic “One More Thing” introduction slide Steve Jobs was known for using.

This blog explains what SwiftUI is, the benefits of using SwiftUI compared to the current UI programming method, and how SwiftUI handles data management.

SwiftUI and Declarative Programming

Let’s take a step back and look at what makes this UI toolkit exciting. SwiftUI lets developers build the designs for their apps in a new declarative way. Native iOS developers have only known how to build and maintain their UI through imperative programming. Imperative programming requires the developer to maintain every UI state themselves and update each item to keep it in sync with their data models. As your UI elements increase, so does the complexity of your logic management, leading to state problems.

With declarative programming, the developer sets the rules that each view should follow and the framework makes sure those guidelines are enforced. As the user interacts with your UI and your data model changes, the view will rebuild itself to reflect those changes automatically. This vastly reduces code complexity and allows developers to create robust user interfaces with fewer lines of code. Other development frameworks, such as React Native and Flutter, have already been using this declarative UI paradigm, and developers love how quickly they can put together the UI and how this produces easy-to-read code.

But the declarative framework is only part of the story. SwiftUI brings even more enhancements to iOS programming, such as live previews in Xcode, drag-and-drop programming and cross-platform development.

Overview of SwiftUI

In order to display the simplicity and beauty of SwiftUI, I think it’s worth seeing a small sample of code. Let’s think about a single-view app that contains a table view. This is a view that iOS developers have programmed countless times. You immediately think of adding a UITableView through Interface Builder or programmatically, then assigning its data source and delegate to your ViewController. You then need to add the required data source and delegate functions to fill the content of the table view. Before you know it, this simple table view is up to 30 lines of code.

Here’s the Swift code for a basic table view that displays a list of country names:

class MasterViewController: UITableViewController {
    var countries: [Country] = fullCountryList
 
    override func viewDidLoad() {
        super.viewDidLoad()
    }
 
    // MARK: - Table View
    override func numberOfSections(in tableView: UITableView) -> Int {
        return 1
    }
 
    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        return countries.count
    }
 
    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
 
        let country = countries[indexPath.row]
        cell.textLabel?.text = country.name
        return cell
    }
}

Now we can take a look at the code needed to create that same table in SwiftUI:

struct MyTableView : View {
    @State var countries: [Country] = fullCountryList
 
    var body: some View {
        List(countries) { country in
            Text(country.name)
        }
    }
}

Believe it or not, the part of that code that actually displays the table view is the 3 lines inside the body computed variable, and that includes the closing bracket. The List struct infers the row count and adjusts its cells to display the text.

You’ll notice that MyTableView is of type View. In SwiftUI, a View is a struct that conforms to the View protocol, rather than a class that inherits from a base class like UIView. This protocol requires you to implement the body computed variable, which simply expects a View to be returned. Views are lightweight values that describe how you want your UI to look and SwiftUI handles actually displaying UI on the screen.

Using Xcode 11 and SwiftUI, you now have the canvas on the right panel, which shows you a live preview of your code. This preview is created by the PreviewProvider block of code that is automatically added with each new View you create. The beauty of this preview is that it refreshes itself as you make changes to your code, without having to build and run with each change.

This will surely decrease development time, as you no longer have to compile your entire project to check minor UI adjustments while working to make your app pixel-perfect to the design specs.

Data Management with SwiftUI

This only scratches the surface of what SwiftUI brings to iOS development. SwiftUI is easy to use but there are advanced features that allow you to take your app to the next level. Developers will want to dive deeper into how data is managed within SwiftUI. To keep your data and UI in sync, you will need to decide which views will maintain the “source of truth” for your app and which views will simply be passed as reference data.

Let’s imagine we’re developing a media player and working on the Player screen. This will have many UI elements, but we’ll simplify it to the play/pause button and a progress view.

Here’s a rough model:

Here you have the PlayerView with smaller SwiftUI views to maintain the PlayButton and ProgressView. Each SwiftUI view will need the isPlaying attribute to know how to update its own UI state, but if each view is maintaining its own value, this could cause state problems.

Instead, we want there to be a “master” isPlaying attribute that all the SwiftUI views can read and react to. Here’s a better model:

The parent PlayerView will hold the master isPlaying attribute and the child views will only reference this variable. When the user interacts with the child UI elements to manipulate the isPlaying boolean, those changes will make their way through the views that are associated with the variable.

Let’s take a look at what this looks like in our code:

struct PlayerView : View {
    let episode: Episode
    @State private var isPlaying: Bool = false
 
    var body: some View {
        VStack {
            Text(episode.title).foregroundColor(isPlaying ? .white : .gray)
 
            PlayButton()
        }
    }
}

This SwiftUI PlayerView is a vertical StackView that has a Text label with the show title and a PlayButton View.

Swift 5.1 will introduce property wrappers, which allow SwiftUI to use the keywords @State and @Binding to add additional logic to your view’s variables. In the code above, the PlayerView is the owner of the isPlaying attribute, so we indicate this with the @State keyword.

struct PlayButton : View {
    @Binding var isPlaying: Bool
 
    var body: some View {
        Button(action: {
            self.isPlaying.toggle()
        }) {
            Image(systemName: isPlaying ? "pause.circle" : "play.circle")
        }
    }
}

Now looking at the PlayButton code, we have the isPlaying boolean here as well, but we added the @Binding keyword to tell this View that this variable is bound to a @State attribute from a parent view.

When a parent view calls a child view, it can pass the state variable to the binding variable as a parameter into the View using the “$” prefix:

struct PlayerView : View {
    let episode: Episode
    @State private var isPlaying: Bool = false
 
    var body: some View {
        VStack {
            Text(episode.title).foregroundColor(isPlaying ? .white : .gray)
 
            PlayButton(isPlaying: $isPlaying)
        }
    }
}

By doing this, when a binding variable is changed by some user interaction, the child view sends that change through the entire view hierarchy up to the state variable so that each view rebuilds itself to reflect this data change. This ensures that all your views maintain the same source of truth with your data models without you having to manage each view manually.

This is a high level introduction to data management with SwiftUI. I encourage you to dig further into this topic by watching the WWDC tech talk, Data Flow Through SwiftUI.

Start Working with SwiftUI

The best way to grow your knowledge of SwiftUI and learn its more advanced functions is to start using it to build an app. The great news is that you don’t have to build an entire app from scratch in order to use SwiftUI. Apple provided classes and protocols that allow you to integrate newly designed SwiftUI views into your existing projects.
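
For example, UIHostingController wraps a SwiftUI view so an existing UIKit flow can present it (a minimal sketch from inside an existing view controller; episode is assumed to be a model value already in scope):

import SwiftUI
import UIKit

// Push the SwiftUI PlayerView onto an existing UIKit navigation stack.
let hostingController = UIHostingController(rootView: PlayerView(episode: episode))
navigationController?.pushViewController(hostingController, animated: true)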

So the next feature you work on for your iOS, watchOS or tvOS project, consider developing one of the views in SwiftUI and integrate it into your project.

If you want to keep digging into SwiftUI, check out the WWDC tech talks and tutorials.

Here at Phunware, our architects and developers stay up-to-date with the latest changes from Apple WWDC and Google I/O. If you’re interested in joining the Phamily, check out our latest job openings. We’re currently looking for Android and iOS software engineers!

The Power of Machine Learning on a User Device


Until recently, using machine learning inside your products was not a small task. It required a data center with servers running all the time: dedicated space, memory and bandwidth. Now we can run machine learning directly on a user’s device to build new, empowering features.

Today, we’re showing you how easy it can be to run your own machine learning on a user device. In our step-by-step tutorial, we’re going to go from getting your data, to training your model on a Mac, to running an iOS app with your newfound powers. Read on for instructions!

Rise of Accessibility for Machine Learning

New tools are making machine learning opportunities more and more accessible. Apple has CoreML, a powerful framework optimized for Apple hardware. And Google has TensorFlow Lite models that are made to fit on phones. Both Apple and Google, at their respective annual conferences, dedicated a significant amount of time talking about how they’ve benefitted from moving machine learning to users’ devices, and how they’re empowering developers on their platforms to do the same. With machine learning on your device, you could add these features through your app:

  • Voice control
  • Facial recognition
  • Offline chatbots to assist with FAQs or onboarding
  • Deciphering text from signs for accessibility
  • Scanning and storing text from business cards or important documents
  • Translating text
  • Recognizing objects like cars and identifying their make/model/type
  • Convenient typing predictions
  • Keyboards that autocomplete your writing in the style of a famous author
  • Never-before-seen image filters
  • Tagging photos and videos according to who or what is in them
  • Organizing emails and messages by what is most important to you

Advantages of On-Device Machine Learning

  1. It’s scalable. As your app’s user base grows, you don’t have to worry about extra server traffic or Internet connections as points of failure. You don’t need to add memory and storage, and users avoid bandwidth issues because they don’t have to ping the Internet all the time.
  2. It’s fast. You aren’t hindered by network latency, and you’re running on hardware that is optimized for machine learning.
  3. It’s private. Your users can rest assured that the information being analyzed stays private. You are not handling their data; everything happens on their devices, at their behest.

That said, there are still costs associated with machine learning. For example, creating the models that will be used on device still depends on massive amounts of quality data and high-powered machines. Yet even these capabilities are becoming more readily available and easier to use.

Interested in seeing just how easy it can be? Follow our tutorial below!

Before Getting Started

  • It will be helpful to know a tiny bit of iOS development, including how to run an app on the simulator through Xcode.
  • Also, familiarity with Swift Playgrounds is helpful but not required.
  • Other than that, we’ll take you through the machine learning process one step at a time.

You can find the full code you’ll be writing at the end of this blog post.

Step 1: Getting the Data

This tutorial focuses on a kind of machine learning called natural language processing (NLP), which essentially means making sense of words. Specifically, we will be doing sentiment analysis: taking a word or phrase and deciding what feeling is associated with it. Great use cases for this functionality include marketing analysis of customer feedback, evaluating tester interviews for product design, and getting the lay of the land with comments left in user reviews of a product.

Let’s say you want to use sentiment analysis to organize or display messages in your new messaging app or your new email client. You can group messages by tone, or color-code them to give the user a heads up of what’s coming, or help them decide what they should answer right away, or whatever else you can imagine as a helpful feature. (And again, we can do all this by offloading the processing power and smarts to the user’s device without compromising other features users want, like end-to-end encryption.)

First though, you’ll need to get the data. Ours will come as a CSV. Most major spreadsheet programs can open a CSV, so you can easily see what the data looks like.

DOWNLOAD SAMPLE CSV

As with any data, we want to be transparent about where we got our information. I’ve cleaned up the linked dataset, but the basics of it come courtesy of work done for this paper:

Maas, A., Daly, R., Pham, P., Huang, D., Ng, A. and Potts, C. (2011). Learning Word Vectors for Sentiment Analysis: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies. [online] Portland, Oregon, USA: Association for Computational Linguistics, pp.142–150. Available at: http://www.aclweb.org/anthology/P11-1015.

This dataset is basically the CSV form of a simple spreadsheet with two columns.

  • One is labeled “sentiment” and contains values of either “Positive” or “Negative”. You may see this in other datasets as 1 for positive and 0 for negative, but for our purposes the labels need to be formatted as words instead of integers.
  • The other column is the text of the review, and it is labeled “review” at the top. And there are 25,000 reviews! Go ahead and import this into a spreadsheet to see what it looks like.

This type of machine learning is known as classification and we’ll be making a classifier. The reviews are your “x” inputs, or features, and the “Negative”/“Positive” values – known as labels – are like the “y” values you get as output. Your target prediction is a “Negative” or “Positive” value.

Alright. So if you have downloaded the data, now it’s time to write some code to train the model.

Step 2: Training the Model

Training a model means giving our program a lot of data so that it learns what patterns to look for and how to respond. Once the model is trained, it can be exported as a file to run on a device. That means you’re not taking all those gigabytes of training data with you.

It’s sort of like pouring lots of water over a material to make a sculpture that has the right shape. Our training data is the water. The sculpture is the model. It’s what we’ll use once it is trained and in the right shape.

For this example, we’ll use an Xcode Playground, which is like a blank canvas that runs code and is very useful for experimenting.

  1. Open up Xcode, preferably Xcode 10.2 or later. Your version of iOS should be at least iOS 11. In Xcode go to File > New > Playground. Use macOS as the template, and choose “Blank” from the options. Then click “Next.”
  2. Now it will ask you where to save the project and what to call it. I called mine “CreateMLTextClassifier”.
  3. Save your Playground. It will open with some boilerplate code. Delete all of that code.

The full code for the playground is available at the end, but we’ll also take you step-by-step.

First we’ll import the frameworks we’ll need at the very top. Add this:

import CreateML
import Foundation
import PlaygroundSupport

Then we’ll create a function that will do the actual magic. Below your import statements, write:

func createSentimentTextClassifier() {
 
}

Now we’ll fill out this function. Write everything in between the brackets until told otherwise. The first thing to write inside the brackets is:

// Load the data from your CSV file
let fileUrl = playgroundSharedDataDirectory.appendingPathComponent("MovieReviewTrainingDatabase.csv")

So we have this line, but in order to make it actually work, we’ll need to set up a folder with our CSV in the right location. What’s happening here is that the Playground looks for a folder called “Shared Playground Data”. So go ahead and make a folder with that name in your “Documents” directory, and then add the “MovieReviewTrainingDatabase.csv” file to that folder. Now the Playground can find it!
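If you want to double-check the location, this optional snippet (an extra convenience, not required by the tutorial) prints the directory the Playground will search:

import PlaygroundSupport

// Prints something like /Users/<you>/Documents/Shared Playground Data
print(playgroundSharedDataDirectory.path)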

Back to coding. Below the fileUrl line you just wrote, add:

guard let data = try? MLDataTable(contentsOf: fileUrl) else {
return
}

This takes the CSV file and converts it to a table format that the program knows how to handle better for machine learning.

Next, below the “guard let data …” lines you wrote, write:

// Split the data for training and testing
let (trainingData, testingData) = data.randomSplit(by: 0.8, seed: 5)

This will give you data for training and testing. The model will train on 80 percent of what’s in the CSV (that’s what the 0.8 means), and the other 20 percent will be used later. The classifier goes over the training data again and again; the testing data, which the classifier has never seen, then tells us how well the model would do in the real world.

As a side note, it’s possible to train your machine learning model so many times on the same data that you “overfit” your model. This means it’s great at working with the training data, but it may not be great at generalizing outside that data. Imagine a facial recognition system that easily identifies my face, but when shown a new face it cannot recognize that it is even a face because it had only ever seen my face. Sort of like that.

Now, below the “trainingData, testingData” lines you wrote, write:

// Make the model
guard let sentimentClassifier = try? MLTextClassifier(trainingData: trainingData, textColumn: "review", labelColumn: "sentiment") else {
return
}

This creates the untrained classifier and gets it ready with the trainingData we made earlier. Create ML already provides MLTextClassifier, which is specifically meant for this kind of use. We tell it that our text lives in the column of our spreadsheet/CSV with “review” written at the top, and that the labels we’re trying to predict live in the “sentiment” column.

Now below the previous lines write:

// Training accuracy percentage
let trainingAccuracy = (1.0 - sentimentClassifier.trainingMetrics.classificationError) * 100
print("Training accuracy: \(trainingAccuracy)")

This will let us know during training how accurate our model is getting. It should start around 50 percent (no better than guessing) and then grow into the high 90s.

Now below the previous lines write:

// Validation accuracy percentage
let validationAccuracy = (1.0 - sentimentClassifier.validationMetrics.classificationError) * 100
print("Validation accuracy: \(validationAccuracy)")

This tells us how our validation is going. We have already divided the data between training and testing. Within the training portion, there is another split between training and validation: the model trains for a while, and before going over another cycle of training it checks itself against the fresh, held-out validation data. It’s yet another standard step that helps avoid overfitting and other such problems.

Now below the previous lines write:

// Testing accuracy percentage
let evaluationMetrics = sentimentClassifier.evaluation(on: testingData)
let evaluationAccuracy = (1.0 - evaluationMetrics.classificationError) * 100
print("Evaluation accuracy: \(evaluationAccuracy)")

This finally tells us how accurate our model is on the testing data after all of our training. It’s the closest thing to a real-world scenario.

Now below the previous lines write:

// Add metadata
let metadata = MLModelMetadata(author: "Matthew Waller", shortDescription: "A model trained to classify the sentiment of messages", version: "1.0")

This is just metadata saying who made the model, a description, and the version.

And the last part of the function, below the previous lines, is:

// Export for use in Core ML
let exportFileUrl = playgroundSharedDataDirectory.appendingPathComponent("MessageSentimentModel.mlmodel")
try? sentimentClassifier.write(to: exportFileUrl, metadata: metadata)

This exports the model so we can drop it in for use in our app.

Now that you’ve made your function, you’re ready to run it!

Below the brackets of the function write:

createSentimentTextClassifier()

Now run the Playground! It may automatically run, or you can press the play icon in the lower left corner.

You should see things like the training, validation, and evaluation accuracy pop up in the console. After everything was parsed and analyzed, my training took 8 seconds. My training accuracy was 100.0, and validation and test data evaluation were at around 88 and 89 percent, respectively.

Not a bad result! Even this tutorial on deep learning, a subset of machine learning, using a modest LSTM (“Long Short-Term Memory”) neural net got about 87 percent accuracy on the test data.

With less than 50 lines of code and about 8 seconds of training, we’ve analyzed 25,000 movie reviews and exported a machine learning model for use. Pretty awesome.

Step 3: Putting Machine Learning to Work

It’s time to get the app ready to use our new model.

I’ve made a skeletal app where we can enter some text and then automatically evaluate it as positive or negative. With that basic feature up and running, you can imagine entering text from any source, knowing how to classify it, and then presenting it in the right way for the convenience of your user. (And in the future, if you have the labeled data, you could do things like determine whether something is or is not important, or divide text into more categories than just “Positive” or “Negative”.) The project is available on GitHub.

VIEW GITHUB PROJECT

Once you’ve cloned or downloaded the project, open it in Xcode. Then open a Finder window for the Shared Playground Data folder you created, and drag and drop the “MessageSentimentModel.mlmodel” file you created through the Playground into the Xcode project just below the ViewController.swift file.

When it asks you how you want to import it, check all the checkboxes and select “Create Groups” from the radio button options.

Now you’re ready to add the code to make the model work.

Go to the ViewController.swift file, and below “sentimentLabel” add:

let sentimentModel = MessageSentimentModel()

Next, uncomment the code in “checkImportanceTapped(_ sender: UIButton)”.

Start with this guard statement:

guard let languageModel = try? NLModel(mlModel: sentimentModel.model) else {
return
}

This wraps our model in an even easier-to-use framework so that we can take the user’s input and update the text of the sentimentLabel in one line, like so:

sentimentLabel.text = languageModel.predictedLabel(for: text)

And it’s as simple as that!

Now let’s run it.

If we type in “I’m doing well,” we get the label “Positive” at the bottom. So far so good!

And “I had a really bad day” is …

And now, we’re off to the races! Play around with it yourself!

I hope you’ve enjoyed this demonstration and primer on machine learning, and can imagine the potential of running AI on device. At Phunware, we’re always working toward better quality code. That means figuring out how to apply the latest technologies (such as on-device machine learning) to challenging, often high-profile projects. In fact, Phunware’s Knowledge Graph uses machine learning and proprietary algorithms to curate over five terabytes of data every day from approximately one billion active devices each month. This data is then used to provide intelligence for brands, marketers and media buyers to better understand their customers, engage and acquire new customers, and create compelling user experiences.

Feel free to reach out with any questions about the myriad possibilities around mobile (or any sized screen) in this field or others. Thank you for reading!

Interested in joining the Phamily? Check out our latest job openings. We’re currently looking for Android and iOS software engineers!

Full Playground code:

import CreateML
import Foundation
import PlaygroundSupport

func createSentimentTextClassifier() {
    // Load the data from your CSV file
    let fileUrl = playgroundSharedDataDirectory.appendingPathComponent("MovieReviewTrainingDatabase.csv")

    guard let data = try? MLDataTable(contentsOf: fileUrl) else {
        return
    }

    // Split the data for training and testing
    let (trainingData, testingData) = data.randomSplit(by: 0.8, seed: 5)

    // Make the model
    guard let sentimentClassifier = try? MLTextClassifier(trainingData: trainingData, textColumn: "review", labelColumn: "sentiment") else {
        return
    }

    // Training accuracy percentage
    let trainingAccuracy = (1.0 - sentimentClassifier.trainingMetrics.classificationError) * 100
    print("Training accuracy: \(trainingAccuracy)")

    // Validation accuracy percentage
    let validationAccuracy = (1.0 - sentimentClassifier.validationMetrics.classificationError) * 100
    print("Validation accuracy: \(validationAccuracy)")

    // Testing accuracy percentage
    let evaluationMetrics = sentimentClassifier.evaluation(on: testingData)
    let evaluationAccuracy = (1.0 - evaluationMetrics.classificationError) * 100
    print("Evaluation accuracy: \(evaluationAccuracy)")

    // Add metadata
    let metadata = MLModelMetadata(author: "Matthew Waller", shortDescription: "A model trained to classify the sentiment of messages", version: "1.0")

    // Export for use in Core ML
    let exportFileUrl = playgroundSharedDataDirectory.appendingPathComponent("MessageSentimentModel.mlmodel")
    try? sentimentClassifier.write(to: exportFileUrl, metadata: metadata)
}

createSentimentTextClassifier()

The post The Power of Machine Learning on a User Device appeared first on Phunware.

]]>
http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/the-power-of-machine-learning-on-a-user-device/feed/ 0
Phunware Team Takeaways from Google I/O 2018 http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/phunware-takeaways-google-io-2018/ http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/phunware-takeaways-google-io-2018/#respond Wed, 30 May 2018 16:01:10 +0000 http://127.0.0.1/blog/phunware-takeaways-google-io-2017-copy/ The world was watching earlier this month as Google CEO Sundar Pichai demonstrated a world first: a very realistic phone call made by a Google Assistant, booking a hair salon appointment on behalf of its “client”. While this moment quickly made headlines, it was only the beginning of three days of debuts, announcements and presentations […]

The post Phunware Team Takeaways from Google I/O 2018 appeared first on Phunware.

]]>
The world was watching earlier this month as Google CEO Sundar Pichai demonstrated a world first: a very realistic phone call made by a Google Assistant, booking a hair salon appointment on behalf of its “client”. While this moment quickly made headlines, it was only the beginning of three days of debuts, announcements and presentations across the world of Google.

I asked the team to weigh in on the highlights from this year while the excitement is still fresh in our minds. From new features to on-site sessions, we’ve covered quite a bit of ground. Here’s what you need to know, from our team to yours, about the future of Android as shown at Google I/O 2018.


The new Material Sketch plugin, demonstrated.

“I enjoyed the inspirational sessions this year, especially ‘Designing for Inclusion: Insights from John Maeda and Hannah Beachler.’ Seeing two leaders in the design field talk about their experiences and take on the industry was motivational. I am also excited about the new material theming as part of Material Design 2.0, as it enables us to push Android design further and help brands better align their apps with their brand guidelines.”

—Ivy Knight, Senior UX/UI Designer


Slices, demonstrated.

“I am really excited about the Navigation library and Slices. Navigation will eliminate a ton of brittle code that we commonly write for Android apps, and I am looking forward to updating Phunware’s App Framework to integrate with it. Slices is really interesting, as it will help our users re-engage with apps that they may have forgotten about. It also enables some really cool use cases such as searching for a doctor’s name and being able to offer the user routing straight to that doctor in a hospital app.”

—Nicholas Pike, Software Architect Android, Product VS


Alex Stolzberg & Nicholas Pike

“I was really excited about the new WorkManager, which makes it easy to perform background processes. You can also easily chain and sequence jobs to keep the code very clean for a large number of processes, rather than having a cumbersome nested callback structure, reducing the possibility of bugs when writing features or making changes later on.”

—Alex Stolzberg, Software Engineer Android, Product
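To illustrate the chaining Alex describes, here is a rough Kotlin sketch using the current androidx WorkManager artifact (CompressWorker and UploadWorker are hypothetical workers, not from the talk):

import android.content.Context
import androidx.work.OneTimeWorkRequestBuilder
import androidx.work.WorkManager
import androidx.work.Worker
import androidx.work.WorkerParameters

class CompressWorker(ctx: Context, params: WorkerParameters) : Worker(ctx, params) {
    override fun doWork(): Result = Result.success()  // compress files here
}

class UploadWorker(ctx: Context, params: WorkerParameters) : Worker(ctx, params) {
    override fun doWork(): Result = Result.success()  // upload the results here
}

fun enqueueChain(context: Context) {
    val compress = OneTimeWorkRequestBuilder<CompressWorker>().build()
    val upload = OneTimeWorkRequestBuilder<UploadWorker>().build()

    // The jobs run in sequence, with no nested callbacks.
    WorkManager.getInstance(context)
        .beginWith(compress)
        .then(upload)
        .enqueue()
}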


L to R, Nicholas Pike, Jon Hancock and Ian Lake. Ian is a former Phunware employee turned Googler who stays involved both with his former coworkers and the larger developer community.

“I’m very excited that Google is taking an opinionated stance on development architectural patterns. Writing apps for Android has been a wild west for years, and having some direction and guidance directly from Google will result in new Android developers entering the field with a greater understanding of how to build complete, stable apps. When those new developers find their first jobs, they’ll be more likely to be comfortable and ready to contribute quickly.”

—Jon Hancock, Software Architect Android


Ram Indani and Romain Guy, an Android Graphics Team Manager, at the in-person app review session.

“I really liked the app review sessions. It shows that Google cares about the applications being developed and is willing to work with the community to improve them. Feedback received from the reviewers is valuable, and they ensured that the Googler reviewing the app had expertise in the apps they were reviewing.”

—Ram Indani, Software Engineer Android


L to R, Alex Stolzberg, Ram Indani, Nicholas Pike and Duong Tran.

“I am excited about what the new Cast Application Framework has to offer. Some of the benefits of the new framework include simpler code implementation and ad integration, as well as enhanced performance and reliability. Also, new features such as Google Assistant voice commands are automatically included. I was amazed by the Cast Application Framework team’s willingness to work with developers to create unique solutions for our clients’ business requirements, such as providing a custom framework and unique branding options.”

—Duong Tran, Software Engineer Android


Want to stay up to date on the latest and greatest in mobile news? Subscribe to our monthly newsletter!

SUBSCRIBE TO THE NEWSLETTER

The post Phunware Team Takeaways from Google I/O 2018 appeared first on Phunware.

]]>
http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/phunware-takeaways-google-io-2018/feed/ 0
Going Worldwide: 7 App Localization Tips for Android Devs http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/going-worldwide-7-tips/ http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/going-worldwide-7-tips/#respond Wed, 09 May 2018 13:00:30 +0000 http://127.0.0.1/?p=29662 (Originally published on May 25, 2017) Here at Phunware, we are dedicated to making accessible and beautiful apps for an international audience, spanning all languages and locales. We also promote continuous learning for our developers and sharing our knowledge with the developer community. With that purpose in mind, I’d like to pass along a few […]

The post Going Worldwide: 7 App Localization Tips for Android Devs appeared first on Phunware.

]]>
(Originally published on May 25, 2017)

Here at Phunware, we are dedicated to making accessible and beautiful apps for an international audience, spanning all languages and locales. We also promote continuous learning for our developers and sharing our knowledge with the developer community. With that purpose in mind, I’d like to pass along a few tips and tricks I learned at DroidCon Boston 2017 that make it easier to adapt your Android apps to reach more users worldwide.

1. Apply automatic right-to-left (RTL) layout support

Languages like Arabic, Hebrew and Persian are written from right to left, which requires a different layout from left-to-right languages like English and Spanish. With the newer Android SDKs, you can skip the step of providing those RTL layouts separately.

MinSDK 17+ automatically updates layouts for RTL with the following:

  • In your AndroidManifest.xml, specify supportsRtl = true.

    • With this setting, Android will update layouts for RTL automatically when the system language calls for it.
  • Use start and end in layout attributes rather than left and right to get the appropriate automatic RTL changes.
  • Remember to add margins on both sides / ends of your layouts.

MinSDK 19+ automatically handles RTL mirroring of vector icons with the autoMirrored attribute:

  • Define which icons should or should not be mirrored with RTL (for example, the search icon).
  • Reference the Material Design docs for suggestions on what should or should not be mirrored.

2. Prevent grammar issues by using strings.xml with placeholders for arguments instead of concatenating strings

Because grammar is different from language to language, developers cannot assume sentence structure. Rather than concatenating strings together, use placeholders in strings.xml for arguments (Ex: %1$s, %2$d) so your translation service can specify the grammar properly; a short example follows the list below. Also, make sure your translation service understands that they should leave these placeholder values untouched.

  • To help translators understand placeholder values:
    • Specify a placeholder argument name (id="color").
    • Provide a placeholder example (example="blue").
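For example, a short Kotlin sketch (order_status and the surrounding names are hypothetical):

import android.content.Context
import android.widget.TextView

// R.string.order_status is a hypothetical resource, defined roughly as:
//   <string name="order_status">%1$s ordered %2$d items</string>
// Translators can reorder %1$s and %2$d to fit their language's grammar.
fun bindOrderStatus(context: Context, statusView: TextView, customerName: String, itemCount: Int) {
    statusView.text = context.getString(R.string.order_status, customerName, itemCount)
}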

3. Use <plurals> to handle one-to-many results

This little trick will save you time and hassle (it’s also a suggested Android practice), and it makes for cleaner code. Here’s how it looks:
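A minimal Kotlin sketch (song_count is a hypothetical resource name):

import android.content.res.Resources

// res/values/plurals.xml would define:
//   <plurals name="song_count">
//       <item quantity="one">%d song found</item>
//       <item quantity="other">%d songs found</item>
//   </plurals>
fun songCountLabel(resources: Resources, count: Int): String =
    resources.getQuantityString(R.plurals.song_count, count, count)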

  • Warning: some languages do not have the concept of plurals. You will have to adjust your plural definitions for those languages accordingly.

4. Speaking of strings, avoid using Spannables to format strings that will be localized

Again, since sentence structure and grammar can change from language to language, the placement of the formatted part of the string might not necessarily be where you’d expect. If you must use a Spannable, don’t use hardcoded indices to format characters (bold, italic, etc.)—you might just BOLD something that makes no sense at all. Instead, programmatically find the parts of the string to format the characters.

Instead of Spannables, you can use:

  • HTML formatting in strings.xml (ex: <b>Hello</b>)
  • Html.fromHtml(String text)

5. Use the “German Test” to check text bounds for truncation or bad layouts

Sometimes, localized text can extend beyond the bounds of your layouts—not good. To check for this, use German. It’s a useful test language for this issue because English-to-German translations result in text expansion of up to 20%, with compound words replacing multiple-word English phrases. At the same time, German uses relatively few special characters, so you’ve got a fairly “pure” test for text bounds.

6. Use the Fastlane Screengrab tool to streamline localization QA

This new tool automates the capture and collection of screenshots across each localized screen in your app, uploading each one to a folder where QA can easily compare and verify each version. Here’s how to use it:

  • First, write espresso tests to go through each screen in your app.
  • Then, set up Fastlane Screengrab to take a snapshot of each screen the tests go through and upload to a folder (it can take in several languages, and run against many devices).
  • Finally, compare and verify screenshots.


(Image source: Fastlane GitHub.)

7. Use Fastlane Screengrab and Supply to localize on the Google Play Store

Gather the appropriate screenshots with Fastlane Screengrab, then use Fastlane Supply to push up your store metadata, screenshots and .apks quickly and easily. Use Timed Publishing mode so you can review and make changes before final upload. And don’t forget the Google Play character limits for your app listing. (You might want to create a script to count characters and verify that they are within the store limits.)

Finally, here are a few reminders for any developers working on app internationalization and localization:

  • Many languages use special characters that don’t appear in English, so make sure the fonts that you support can handle any special characters needed (not all of them can).
  • Default strings must always be defined in the values/strings.xml file.
  • Watch out for special characters in your strings.xml that must be escaped (Ex: \', \").
  • Keep an eye out for these important Lint warnings:
    • Extra translation (Too many translations)
    • Incomplete translation (Missing translations)
    • Inconsistent number of placeholders (more placeholder arguments in one translation versus another)

I enjoyed sharing these tips with the rest of the Phunware development team and I hope they’ll prove just as useful for you. Want to join us? Phunware is always looking for curious and creative developers who want to work at a company where mobile is top priority. Check out our open positions and let’s get busy changing the world.

This blog post was made with the permission of Phil Corriveau (Intrepid), who presented the class Bonjour, Monde: Optimizing Localization at DroidCon Boston 2017.

Want to learn more? Subscribe to our newsletter for monthly updates on mobile technology, strategy and design.

SUBSCRIBE TO OUR NEWSLETTER

The post Going Worldwide: 7 App Localization Tips for Android Devs appeared first on Phunware.

]]>
http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/going-worldwide-7-tips/feed/ 0
Android Data Binding with RecyclerViews and MVVM: a Clean Coding Approach http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/android-clean-coding-approach/ http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/android-clean-coding-approach/#comments Mon, 08 Jan 2018 20:58:17 +0000 http://127.0.0.1/?p=31716 When users open an Android app, what they see is the result of Android developers assigning data from various inputs (databases, the internet, etc.) to elements of the app user interface. Until 2015, the process of assigning (or “binding”) data toxx UI elements was tedious and potentially messy. During its I/O developer conference that year, […]

The post Android Data Binding with RecyclerViews and MVVM: a Clean Coding Approach appeared first on Phunware.

]]>
When users open an Android app, what they see is the result of Android developers assigning data from various inputs (databases, the internet, etc.) to elements of the app user interface. Until 2015, the process of assigning (or “binding”) data to UI elements was tedious and potentially messy. During its I/O developer conference that year, however, Google demonstrated its Data Binding Library, which gave developers the ability to streamline and clean up the process in many ways.

When the Library was Beta-released later that fall, I was eager to learn more about Android data binding and its applications, but it was still in its infancy and Google’s disclaimer warned against trusting it in any released app. Fast forward two years to today, and the Android Data Binding Library has matured considerably. The disclaimer is now gone, and I recently began exploring data binding in my daily development work.

Like any good Android developer, one of my main goals is to write clean code, code that “never obscures the designer’s intent but rather is full of crisp abstractions and straightforward lines of control,” as author Grady Booch put it. I have found that using data binding with the Model-View-ViewModel (MVVM) architectural pattern and RecyclerView accomplishes many of the objectives of clean coding, including reducing the requirement for boilerplate code, facilitating code decoupling and improving readability and testability—not to mention reducing development time.

Unfortunately, Google’s examples of using data binding in Android apps are rather simplistic and lack detail. So let’s explore the necessary steps to set up a project with data binding, a RecyclerView and MVVM—and write clean code in the process.

A Quick MVVM Primer / Refresher

MVVM is an architectural pattern that was created to simplify user interface programming. Google appears to be encouraging the use of MVVM for data binding. In fact, the Architecture Components of its Data Binding Library are modeled on the MVVM pattern.

The main components of MVVM are the Model, View and ViewModel, and its structure essentially supports two-way data binding between the latter two.

  • The View defines the user interface structure, layout and design and consists of views, layouts, scroll listeners and so on. It also notifies the ViewModel about different actions.
  • The ViewModel serves as the intermediary between the View and the Model. It provides data to the View via bindings and handles View logic. It calls methods on the Model, provides the Model’s data to the View and notifies the View of updates.
  • The Model is the data domain model and the source of application logic and rules. It provides data to the ViewModel and can update the ViewModel using notification mechanisms such as data access objects, models, repositories and gateways.

As you can see, the View knows about the ViewModel and the ViewModel knows about the Model. The Model, however, doesn’t know about the ViewModel and the ViewModel doesn’t know—or care—about the View. This separation enables each component to grow independently, and this design pattern makes the user interface distinct from the corresponding business logic. The result is easier application development, testing and maintenance.

Data Binding with MVVM and RecyclerView

Follow the steps below to set up Android data binding using MVVM and RecyclerView.

1. Update the Gradle File(s)

The first step in adding data binding to a project is changing the module’s build.gradle file(s). Recent updates to the Android Data Binding Library have enabled easier data binding by adding a data binding closure to the Android closure, and because data binding is included in Google’s Application and Library plugins, you no longer need to add a dependency. Instead, use the following closure:
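In a module’s build.gradle, that closure looks like this (Groovy syntax from this era; newer Android Gradle Plugin versions moved the switch under buildFeatures):

android {
    // ...other module configuration...
    dataBinding {
        enabled = true
    }
}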

2. Prepare Your Tags

To use data binding in Layout Files, you must wrap the normal View Groups or Views in <layout> tags containing data tags with variables for bindable methods and binding adapters. Bindable methods are typically referenced with app:data="@{viewModel.data}", where the “viewModel” variable is the ViewModel, set on the binding (more on that later).

To reference the bindable method annotated with @Bindable, you only need to specify viewModel.data. You can still access methods not annotated with @Bindable by using the full method name, such as viewModel.getData. As seen below, to set up a RecyclerView with data binding, just add a method reference from which to acquire the data.

Activity Layout File

Disclaimer: Some attributes, namespaces, etc. have been omitted to highlight how to use data binding.
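A minimal sketch of what the activity layout can look like (package names, IDs and the app:adapter/app:data pairing are assumptions for illustration, using support-library-era widgets):

<layout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto">

    <data>
        <variable
            name="viewModel"
            type="com.example.DataItemViewModel" />
    </data>

    <android.support.v7.widget.RecyclerView
        android:id="@+id/recycler_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        app:adapter="@{viewModel.adapter}"
        app:data="@{viewModel.data}" />
</layout>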

RecyclerView Adapter Item Layout File

Disclaimer: Some attributes, namespaces, etc. have been omitted to highlight how to use data binding.
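And a similar sketch for the item layout (the item variable and its title property are hypothetical):

<layout xmlns:android="http://schemas.android.com/apk/res/android">

    <data>
        <variable
            name="item"
            type="com.example.DataItem" />
    </data>

    <TextView
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:text="@{item.title}" />
</layout>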

3. Set Up the ViewModel

The way you set up and use data binding is similar for both activities and fragments. Depending on the application’s need for the context, UI and lifecycle, you can reference the ViewModel by inflating and binding the View with the data binding library or by inflating it independently and binding to it with the library.

Next, call the appropriate ViewModel methods from the UI. One way to instantiate the binding is to use the DataBindingUtil’s setContentView method. Calling the binding’s setViewModel method sets the ViewModel variable reference, named “viewModel,” as depicted here:
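Sketched in Kotlin, that setup might look like this (ActivityMainBinding is generated from the activity layout; all names are illustrative):

import android.databinding.DataBindingUtil
import android.os.Bundle
import android.support.v7.app.AppCompatActivity
import android.support.v7.widget.LinearLayoutManager

class MainActivity : AppCompatActivity() {

    private lateinit var binding: ActivityMainBinding
    private val viewModel = DataItemViewModel()

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        initBinding()
        initRecyclerView()
    }

    // Individual methods per topic, per the tip below.
    private fun initBinding() {
        binding = DataBindingUtil.setContentView(this, R.layout.activity_main)
        binding.viewModel = viewModel  // the generated setter for the "viewModel" variable
    }

    private fun initRecyclerView() {
        binding.recyclerView.layoutManager = LinearLayoutManager(this)
    }
}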

Clean Coding Tip: Separate concerns and increase readability by providing individual methods for topics such as binding and RecyclerView initialization.

4. Implement the Adapter

When implementing the Adapter, the ViewModel (or item) needs to be set on the ViewHolder, along with binding and unbinding of the View. A lot of online examples don’t show unbinding the View, but it should be done to prevent problems.
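One possible shape for that Adapter (ItemDataBinding is generated from the item layout; this is a simplified illustration, not production code):

import android.support.v7.widget.RecyclerView
import android.view.LayoutInflater
import android.view.ViewGroup

class DataAdapter : RecyclerView.Adapter<DataAdapter.ViewHolder>() {

    private val items = mutableListOf<DataItem>()

    fun setData(newItems: List<DataItem>) {
        items.clear()
        items.addAll(newItems)
    }

    override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): ViewHolder {
        val binding = ItemDataBinding.inflate(LayoutInflater.from(parent.context), parent, false)
        return ViewHolder(binding)
    }

    override fun onBindViewHolder(holder: ViewHolder, position: Int) = holder.bind(items[position])

    override fun getItemCount(): Int = items.size

    // Unbind when the view is recycled; skipping this is a common source of problems.
    override fun onViewRecycled(holder: ViewHolder) = holder.unbind()

    class ViewHolder(private val binding: ItemDataBinding) : RecyclerView.ViewHolder(binding.root) {

        fun bind(item: DataItem) {
            binding.item = item
            binding.executePendingBindings()
        }

        fun unbind() = binding.unbind()
    }
}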

5. Notify the Adapter for Data Set Changes

In this ViewModel, the data (items) are made available via the method getData(). When you need to notify the Adapter for data set changes, call notifyPropertyChanged(int) instead of calling notifyChange() (which would notify changes for all of the properties and likely cause issues).

6. Update the Method

This binding adapter method is the other part of the glue to update data in the Adapter. In the MVVM pattern chart, the ViewModel notifies the View of property changes by calling this method. Attribute data is referenced as app:data="@{viewModel.data}" and ViewModel.data references method getData, annotated with @Bindable. When combined with the call to notifyPropertyChanged(BR.data), this reference calls the RecyclerViewDataBinding.bind(RecyclerView, DataAdapter, List), annotated with @BindingAdapter({"app:adapter", "app:data"}).
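A sketch of that binding adapter method, written as an instance method so it can live on the custom binding component set up in steps 7 and 8 (DataAdapter is the adapter sketched earlier):

import android.databinding.BindingAdapter
import android.support.v7.widget.RecyclerView

class RecyclerViewDataBinding {

    // Fires when both app:adapter and app:data are bound on a RecyclerView.
    @BindingAdapter("app:adapter", "app:data")
    fun bind(recyclerView: RecyclerView, adapter: DataAdapter, data: List<DataItem>) {
        if (recyclerView.adapter == null) {
            recyclerView.adapter = adapter
        }
        adapter.setData(data)
        adapter.notifyDataSetChanged()
    }
}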

Disclaimer: Although some readers may disagree with having an adapter reference in the ViewModel, this ViewModel provides notifications to the view. The components can be unit tested individually with JUnit and Mockito and together with integration / UI tests.

DataItemViewModel : BaseObservable
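A simplified sketch of what this class can look like (BR.data is generated by the data binding compiler from the @Bindable annotation; the data source feeding updateItems is out of scope here):

import android.databinding.BaseObservable
import android.databinding.Bindable
import android.view.View

class DataItemViewModel : BaseObservable() {

    val adapter = DataAdapter()

    private val items = mutableListOf<DataItem>()

    // Referenced from the layout as viewModel.data.
    @Bindable
    fun getData(): List<DataItem> = items

    fun updateItems(newItems: List<DataItem>) {
        items.clear()
        items.addAll(newItems)
        // Notify only this property; notifyChange() would fire every binding.
        notifyPropertyChanged(BR.data)
    }

    // Wired up via android:onClick="@{viewModel::onClick}" (see step 10).
    fun onClick(view: View) {
        // Handle the tap here, with full access to the ViewModel's data.
    }
}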

Model
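And the Model, reduced here to a plain data holder for illustration:

data class DataItem(val title: String)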

7. Set the Default Component

To reuse data binding code among multiple classes, set your data binding component as the default component as shown below.
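For example, registered once at application startup (the App class here is illustrative and would need to be declared in the manifest):

import android.app.Application
import android.databinding.DataBindingUtil

class App : Application() {

    override fun onCreate() {
        super.onCreate()
        // All generated bindings will now resolve instance binding adapters
        // through AppDataBindingComponent by default.
        DataBindingUtil.setDefaultComponent(AppDataBindingComponent())
    }
}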

Clean Coding Tip: Provide a custom Data Binding Component class so you can abstract binding methods from ViewModels and isolate them for testability. Consider mocking the component class for better testing of the component classes.

8. Set Your Data Binding Class Accessor Methods

The data binding library requires classes using the @BindingAdapter annotation to have associated “get” accessor methods.

AppDataBindingComponent : android.databinding.DataBindingComponent
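A sketch of that component (the DataBindingComponent interface is generated by the data binding compiler, with one accessor per class declaring instance binding adapters):

class AppDataBindingComponent : android.databinding.DataBindingComponent {

    private val recyclerViewDataBinding = RecyclerViewDataBinding()

    // Generated binding code calls this accessor to find the adapter instance.
    override fun getRecyclerViewDataBinding(): RecyclerViewDataBinding = recyclerViewDataBinding
}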

9. Set the Adapter on RecyclerView

This is where you can set the Adapter on RecyclerView and where adapter updates occur.

10. Click Event Handling

When a click event results in handling data accessible in the ViewModel, the best approach is to set the onClick attribute on the View in the bindable layout with android:onClick="@{viewModel::onClick}" specified for the View. The ViewModel must have an onClick(View) method implemented to handle the click event.

Tips for Keeping Your Code Clean

Some final tips from the trenches for Android data binding:

  • Making extra calls to notifyPropertyChanged(BR.data) or notifyChanged() can lead you down a path of producing bugs, including duplicated data.
  • There is a timing bug with the databinding library and use of ViewModels, extending BaseObservable, where calling notifyPropertyChanged(int) or notifyChanged() results in no action taking place. This occurs because the OnPropertyChangedCallback hasn’t been added yet. Until the bug is fixed, consider using this temporary fix: Add an OnPropertyChangedCallback to the ViewModel for handling the corresponding action. It may help to read the generated data binding classes to better understand the problem.
  • Debugging data binding build issues can be tricky. The error messages don’t provide a clear understanding as to what the issues may be. Sometimes an issue may be due to an incorrect object type passed into a binding adapter method. Other times, an issue may be caused by using data binding methods prior to setting the ViewModel.

DOWNLOAD SOURCE FROM GITHUB

At Phunware, we’re always working for better quality code. That means figuring out how to apply the latest technologies (such as data binding) to challenging, often high-profile projects. Interested in joining the Phamily? Check out our latest job openings and don’t forget to subscribe to our newsletter.

SUBSCRIBE TO THE NEWSLETTER

The post Android Data Binding with RecyclerViews and MVVM: a Clean Coding Approach appeared first on Phunware.

]]>
http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/android-clean-coding-approach/feed/ 5
No More Norman Doors: The Importance of Design Thinking in Enterprise App Development http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/importance-design-enterprise-app-development/ http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/importance-design-enterprise-app-development/#comments Thu, 07 Dec 2017 18:09:06 +0000 http://127.0.0.1/?p=31394 Have you ever encountered a door whose usability signals are so poor that signage is needed to clarify how the door works? A glass door with a vertically-oriented grab handle, for example: does it open inward or outward? Users are left guessing and often frustrated or embarrassed when they inevitably choose wrong. These confusing doors […]

The post No More Norman Doors: The Importance of Design Thinking in Enterprise App Development appeared first on Phunware.

]]>
Have you ever encountered a door whose usability signals are so poor that signage is needed to clarify how the door works? A glass door with a vertically-oriented grab handle, for example: does it open inward or outward? Users are left guessing and often frustrated or embarrassed when they inevitably choose wrong.

Source: The Far Side by Gary Larson, 1980

These confusing doors are called Norman doors after Don Norman, a cognitive scientist and usability engineer who was inspired by many experiences with bad design to produce the seminal work, The Design of Everyday Things. Norman argued that good design is intuitive design—design that doesn’t require conscious thought to be usable.

When The Design of Everyday Things was released in 1988, the idea of user-centered design and applying design thinking to all areas of life was revelatory. Design thinking forces brands to ask the right questions, find the best solutions and implement the best approach to get results. Here’s what it looks like in its simplest form:

  • Ask questions and empathize
  • Understand and define
  • Evaluate ideas
  • Iterate

Although design thinking has been proven since 1988 to be a repeatable problem-solving approach in everything from business systems to software engineering, some business leaders still see design as a magical process, a superficial add-on or a way to use leftover budget dollars. But design should never be an afterthought. Without it at the heart of a project, usability problems will keep users away. Let’s explore how design thinking can be applied to enterprise mobile apps.

Applying Design Thinking to Enterprise Mobile Apps

App Strategy: Identifying Areas of Need

Good enterprise app design is not about making things look nice. It’s about creating a positive user experience and solving real problems for real people. Well-designed apps remove frustration along the user journey, resolving pain points and anticipating issues users may not even realize they could have. These apps are able to anticipate user needs because design thinking was central to their development process.


If you want your app to solve real problems for real users, do your research. Ask users what they need help with. If you already have an app, review existing engagement metrics to identify areas for improvement—where users abandon the app, for example, or which features they never seem to use. Analyze competitor apps to see where yours excels or falls short.

If you’re building a new app, conduct user testing or interviews with target users first. Get to know them and their needs, preferences, pain points and behaviors so you can make good design decisions and improve the user experience at every opportunity for iteration. No matter how clever your designers and engineers are, real users have a way of uncovering use cases you would never think of and demonstrating how usability can be improved.

Your app’s success depends on how engaged your users are. Learn more in this eBook: Sticky Notes: How to Re-Engage Your Users Like a Boss.

GET THE eBOOK

Solving Problems with Enterprise App Design

Once you uncover the problems your target users are experiencing, it’s time to use mobile to solve them. This part of the process can be messy, but it’s where you dig in as a team and experiment with strategies and ideas—even some that might seem crazy at first. Sketch, whiteboard, diagram, discuss, challenge and revise to uncover solutions worth trying. That’s where the magic happens.

It’s a good idea to look for an app development partner with expertise in Apple’s Human Interface Guidelines and Google Material Design to create workflows that are useful and compliant. App design must feel fresh and timely without being too trendy—and therefore doomed to fall out of style. Choose a partner that will build for beauty, function and longevity.

The next step is to use the findings and wireframes from your discovery process to form educated hypotheses about how your app’s design can be the solution to your customers’ problems. Choose forward-looking technologies that will provide deployment flexibility and extend the life of the finished app. And test, test, test—not just to identify and fix bugs right before launch, but throughout the design and engineering process (and even after launch). Continual testing allows you to keep refining the user experience and uncover unanticipated issues before they cause bigger problems.

Iteration and Improvement

Good design recognizes constraints like budget and business goals as essential considerations for finding the best and most appropriate solutions. Long gone are the days of hero design, when eureka-like moments of inspiration seemed to appear out of thin air. The more effective process is working as a team with your development partner and end users to create prototypes and iteratively improve upon your initial ideas.

Experience proves that the best solutions come from testing and getting feedback from business leaders, designers, engineers, testers and end users at every stage to confirm that prototypes effectively solve the right problems. You may choose an app development partner that has already found effective solutions to common problems in your industry, but make sure they also routinely iterate and improve on even the smallest details for the best outcome.

How Design Thinking Can Help Your Organization

The American Marketing Association has said that customer experience is the new battleground and the customer journey will take precedence from today forward. Customer experience can now make or break a company, and word spreads among consumers in near-real time. Because your app is likely to be a central part of your customers’ experience with your brand, it’s essential to apply human-centered design thinking to the app lifecycle, from start to finish.

In other words, don’t put any metaphorical Norman doors between your users and their goals. Even minor moments of frustration add up to a negative overall experience. Instead, put the time and effort into thinking through how your app will work in the real world to provide the smoothest, most effective user experience possible.

To learn more about best practices at every stage of the mobile lifecycle, check out the eBook Mobile First: Harnessing the App Lifecycle for Transformative Business Success.

DOWNLOAD THE EBOOK

The post No More Norman Doors: The Importance of Design Thinking in Enterprise App Development appeared first on Phunware.

]]>
http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/importance-design-enterprise-app-development/feed/ 1
From Design to Dev: the Code Less Traveled http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/design-dev-code-less-traveled/ http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/design-dev-code-less-traveled/#respond Wed, 27 Sep 2017 17:13:29 +0000 http://127.0.0.1/?p=30297 Nobody defines themselves by only one thing. I’m really into design, for instance, but I’ve also always been interested in computers and what makes them tick. These somewhat divergent interests led me to take a winding road to my eventual career, yet they both contribute to my daily success. I went to The University of […]

The post From Design to Dev: the Code Less Traveled appeared first on Phunware.

]]>
Nobody defines themselves by only one thing. I’m really into design, for instance, but I’ve also always been interested in computers and what makes them tick. These somewhat divergent interests led me to take a winding road to my eventual career, yet they both contribute to my daily success.

I went to The University of Texas at Austin to study design, but halfway through college, I realized I was more interested in implementation than design alone. I’d worked with basic HTML and CSS since high school.

Mobile had grown immensely since I started college, so I started teaching myself Android development. I used the Commonsware book and took one Intro to Comp Sci class, but mostly I wore out my Google muscles reading everything I could find to help me learn how to code. Because my degree program was very open-ended, I ended up with an Android dev internship at a small start-up, planning to move into UX design.

After I completed my BFA, I joined the Phunware Phamily as a UI/UX designer, but I was still coding for fun on nights and weekends. Eventually, I switched to development completely, which is what I’m doing now—I’m a software engineer specializing in Android OS. I find that my background in design really enhances my development work, and my self-taught path has been its own kind of advantage. I’m living proof that many different routes can lead you to a successful coding career, if you’ve got the passion for it and the tenacity. Here are a few things I learned along the way.

The Value of Having a Design Background as a Developer

I personally find it a lot easier to design things and then code them. Because I know what the back-end can and can’t do, I can design with that in mind—rather than focusing only on aesthetics. I know where I can push boundaries and where it’d be better to stick to what the OS gives us.

There’s definitely a creative side to coding, but you also have to think like a computer. It’s more about knowing what’s available to you, being able to read through dev documentation and being able to say, “okay, will this take days or weeks to do, or can I build off of this?”

Here are a few examples of how a blended design / coding perspective helps:

  • When I’m starting a new project or updating an existing app, I review all of the newest features coming out on the OS so we can design accordingly. For example, Marshmallow 6.0 completely changed the way we ask for permissions. Rather than the user okaying everything at the beginning, we now have to ask ourselves, “When and where will the user be prompted for this permission?” Otherwise, we’ll be asking for camera permissions in the calendar and that’s no good. This was a big change—and we definitely have to take it into account when designing a UI for compatibility.
  • Material design is ever-evolving. It started off very colorful and not very textured, and now it’s edging toward more texture. Understanding the OS style shifts will help you mimic them in a way that feels natural. (It also helps you spot opportunities to buck the trend and go for your own thing.)
  • Moving forward, augmented reality (AR) is pushing the idea of digital objects existing in “real” space—the concept of voice-activated user interfaces also reflects this idea. I’m a huge sci-fi nerd, so I have a very particular vision of what the future should look like (hello, Minority Report and Iron Man’s suit). Pokémon GO opened people’s eyes to the possibilities of AR, and I really love the idea of having floating UIs accessible to anybody anywhere.

How to Kick Down the Door and Keep Growing

Let’s be real—it’s challenging to get into a development role without a Computer Science degree. Here are a few tips to help smooth your path.

  • Be ready to prove yourself. Back up your credentials with strong examples of your work on GitHub or Bitbucket.
  • Join a local professional organization (or three or four). I’d like to give a little shout out to Women Who Code. It’s a great tech community that offers a lot of scholarship opportunities to go to conferences (where women are often underrepresented), and those conferences are invaluable for face-to-face discussion and learning. I also recommend joining your local Android or iOS developer group and getting active. Don’t be afraid to get involved and give a talk—it’s a great way to network, build your confidence and learn as you teach. I recently gave a talk at the Austin Android Developer Meetup about animations, which drew on my experience as both a designer and a developer. Find a topic you’re interested in and run with it.
  • Look for strong influencers in the dev community to follow on social. On the Android side, I learn a lot from following Chiu-Ki Chan (@chiuki) and Christina Lee (@RunChristinaRun). You can see what I’m up to at lorajk.com and @loraj_k.

The truth is, taking the scenic route into a coding career has ultimately increased the value I bring to the Phunware Phamily. My colleagues appreciate my design background as well as the MacGyver-like way I taught myself to code. Both add expertise and depth to my team and help us deliver better results for our customers.

Have you walked the code less traveled? Phunware is always looking for creative, dedicated and interesting people to join the Phamily. Check out our latest career opportunities—and enjoy the journey!

Want to learn more about our Android dev team? My colleague Ivy Knight shared our key takeaways from the 2017 Google I/O developer conference (including some really terrible Android puns).

READ IVY’S TAKEAWAYS

The post From Design to Dev: the Code Less Traveled appeared first on Phunware.

]]>
http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/design-dev-code-less-traveled/feed/ 0
Renew or Re-Do? What to Do with an Aging Mobile App http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/renew-or-re-do-aging-mobile-app/ http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/renew-or-re-do-aging-mobile-app/#respond Thu, 31 Aug 2017 21:25:19 +0000 http://127.0.0.1/?p=30058 For most organizations, building a mobile app is not a small endeavor. It takes a lot of time, energy and money to get your app out into the world. So what do you do when that mobile app isn’t delivering the results you hoped for? What if your goals have changed or your users are […]

The post Renew or Re-Do? What to Do with an Aging Mobile App appeared first on Phunware.

]]>
For most organizations, building a mobile app is not a small endeavor. It takes a lot of time, energy and money to get your app out into the world. So what do you do when that mobile app isn’t delivering the results you hoped for? What if your goals have changed or your users are asking for functionality that wasn’t in your original strategy? Should you add enhancements to your existing app, or is it time to start fresh with something new? And how do you determine which approach is best?

Because we approach mobile as a lifecycle, the Phunware team thinks about these questions all the time. It’s never just “build it and forget about it.” Phase 1: Strategize comes up over and over as we go through the mobile app lifecycle. And many times, customers come to us with an existing app that’s starting to age. In this case, Phase 1 entails defining the specific feature sets and use cases that are needed to support the customer’s business goals now.

So what types of add-on features do folks with existing apps want the most?

Our 6 Most-Requested App Enhancements

  • Monetize the app with advertising
  • Differentiate the experience between anonymous and registered users (e.g., adding personalized and/or exclusive content for registered users)
  • Add indoor wayfinding or blue-dot navigation
  • Make the design more flexible
  • Add third-party integrations or technologies (e.g. media players, analytics, CRM, authentication)
  • Make the entire app portfolio more manageable

Interested in monetizing your app? Download our info sheet, 6 Smarter Questions to Ask Your Monetization Partners, for insightful questions to ask your potential partners.
DOWNLOAD THE eBOOK

6 Questions to Ask About Your Existing App

When deciding whether to add capabilities or start from scratch, the real issues lie in how the original app was built. Think of an app like a house—whether you want to add on a master bathroom or an entire second story, you have to start by making sure the existing structure is sound. If it’s not, that gorgeous master bathroom could come crashing down on your head.

Unless you’ve got an expert dev team in house, ultimately you’ll need a trusted development partner who can evaluate your existing app within the context of your desired enhancements.

Here are a few key questions to consider about your existing app:

  • Is the codebase organized and up-to-date?
  • Were any unconventional coding styles used?
  • What warnings and errors appear when inspecting the code?
  • Are all existing features working properly?
  • Is the architecture extensible? Can it support additional features and functionality?
  • Have any and all existing SDKs been implemented properly?

Carefully considering these questions might reveal some issues that need fixing before you can add enhancements. Though an app might work fine with a few errors or quirks in the architecture or codebase, you can run into real trouble when you start trying to build more on top of it. Really significant issues can multiply the development and testing efforts exponentially. At a certain point, it’s both financially and strategically wiser to start over.

How Do You Make a Smart Decision?

If your development partner spots any issues with your existing app, ask for two estimates: one for a “fix and enhance” solution and one for a complete rebuild.

  • The “fix and enhance” estimate should include a breakdown of the problems plus an estimate of the time and budget required to fix them, and any potential negative ramifications of performing these fixes. It should also include an estimate for adding any new requested features or capabilities.
  • The complete rebuild estimate should comprise building a new version of your existing app with better code and a more flexible and extensible infrastructure, along with the new features and capabilities you’re looking for.

Naturally, you’ll want to compare the two estimates. If the costs for a “fix and enhance” approach are lower than those for a complete rebuild—and they may well be—make sure you understand the possible ramifications. For example, if the original codebase has to be modified to accommodate a new feature or function, this modification might limit the app’s extensibility in the future or hinder some previously implemented integrations. Plus, the more “hot fixes” applied to the code, the greater the risk of failure.

In the long run, patching together a creaky old app could cost you more and more as time goes on and additional fixes are required. And if the user experience begins to suffer, you could be jeopardizing your audience and even your brand reputation. There may be good reason to go with a “fix and enhance” approach for now—just make sure you’re making the most strategic and well-informed decision you can.

Learn more about taking the strategic approach to mobile application lifecycle management in this comprehensive eBook: Mobile First: Harnessing the App Lifecycle for Transformative Business Success.

DOWNLOAD THE eBOOK

The post Renew or Re-Do? What to Do with an Aging Mobile App appeared first on Phunware.

]]>
http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/renew-or-re-do-aging-mobile-app/feed/ 0
3 Must-Haves for Effective App Content Management http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/3-must-haves-effective-app-content-management/ http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/3-must-haves-effective-app-content-management/#respond Thu, 15 Jun 2017 18:37:38 +0000 http://127.0.0.1/?p=29697 Let’s face it: when you’re looking for the right partner to help you create a mobile app and win across the application lifecycle, app content management is probably the last thing you’re thinking about. How you’re going to upload and manage text, images and other assets is about the farthest thing from the “bells and […]

The post 3 Must-Haves for Effective App Content Management appeared first on Phunware.

]]>
Let’s face it: when you’re looking for the right partner to help you create a mobile app and win across the application lifecycle, app content management is probably the last thing you’re thinking about. How you’re going to upload and manage text, images and other assets is about the farthest thing from the “bells and whistles” that tend to excite people when they think about mobile technology.

But if you’re thinking that way, you may well be missing something. App content management is really about creating the framework for the “guts” of your app—and doing it wisely is crucial to your success. So let’s take a look at three important content management features you should look for, and the surprising reasons why they matter.

1. Cloud-based syncing of assets.

When all of the assets in your app are hosted in the cloud, it’s much easier to push new or refreshed content without forcing an app store update—which users typically dislike. For example, let’s say a high-rise mixed-use property has a branded resident app. With cloud-synced content management, app admins can add an extra image to the app’s main carousel to promote an upcoming community event or a new property amenity. Without cloud syncing, if the app was programmed for only three carousel images, adding a fourth would force a code change and an app store update. For another example, cloud-synced content management would make it easy for a healthcare provider app to keep physician directories up to date, even across dozens of campuses.

Cloud-synced content management also protects the user experience in the event of lost internet connectivity. Content is cached on the user’s device while it’s connected to the internet, so it remains accessible even when the connection drops; new or changed content and other app updates sync frequently whenever the device reconnects. For example, if your app includes mapping and wayfinding and a user loses connectivity in the parking garage, the app will simply use the most recent cached version of maps and routes and re-sync the moment the user is back within range of Wi-Fi or beacons.
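
From the engineering side, that offline behavior usually boils down to a cache-first read with an opportunistic refresh. Here’s a minimal Kotlin sketch of the pattern, assuming a hypothetical ContentApi interface and cache file name (this is illustrative, not Phunware’s actual SDK interface):

    import java.io.File

    // Hypothetical remote source for cloud-synced content (illustrative only).
    interface ContentApi {
        fun fetchCarouselJson(): String // throws on network failure
    }

    class ContentRepository(private val api: ContentApi, cacheDir: File) {
        private val cacheFile = File(cacheDir, "carousel.json")

        // Cache-first read: always return something renderable,
        // refreshing the local copy whenever the network cooperates.
        fun loadCarousel(): String? = try {
            api.fetchCarouselJson().also { cacheFile.writeText(it) }
        } catch (e: Exception) {
            // Offline (say, in the parking garage): fall back to the
            // most recently cached copy, if one exists.
            if (cacheFile.exists()) cacheFile.readText() else null
        }
    }

Because the UI simply renders whatever loadCarousel() returns, adding that fourth carousel image is a server-side content change, not an app store release.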

2. Flexibility to add, delete or change fields.

When you load content into your app—an image, a chunk of text, etc.—you have the ability to add metadata fields that help define and categorize that particular element. The app platform uses these tags to serve different content to different people based on their profiles. Those tags also help the app platform capture data around whether, how and when any user engages with a particular piece of content.

That’s pretty standard. The real magic happens when you look at the data collection potential. If the content management user interface is truly flexible, you can add fields, delete fields or change the data types of your assets. This allows you to capture data on interactions you might not have planned for initially, or to serve different campaign needs. The more granular you can be on those events, the more specifically you can understand each user.

As an example, let’s say an activewear brand app wants to capture how much users are engaging with the running shoe category on their app. To set this up in content management, we might add custom fields to each running shoe asset that establish parameters for engagement levels. Users who interact with this content beyond a certain threshold might then be classified as “Running Gearheads” and targeted with content or sales messages that specifically match their interests. Those who don’t reach that threshold might be considered “light users” and receive different content / engagement.
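
To make the activewear example concrete, here’s a hedged Kotlin sketch of how flexible metadata fields might drive that kind of segmentation. The field names (category, engagementThreshold) and the segment labels are illustrative assumptions, not a real Phunware schema:

    // Illustrative only: a content asset whose metadata fields can be
    // added, removed or re-typed without a schema migration.
    data class ContentAsset(
        val id: String,
        val metadata: Map<String, Any> // e.g. "category", "engagementThreshold"
    )

    // Classify a user against the custom fields on a running shoe asset.
    fun classifyUser(asset: ContentAsset, interactions: Int): String {
        val category = asset.metadata["category"] as? String ?: "unknown"
        val threshold = asset.metadata["engagementThreshold"] as? Int ?: 10
        return if (category == "running-shoes" && interactions >= threshold) {
            "Running Gearhead" // target with gear-specific content and offers
        } else {
            "light user"       // serve broader engagement content
        }
    }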

Pro tip: It’s a great idea to build as much granularity as possible into your data collection (via tagging or adding metadata) from the outset, even if you think you don’t “need to know” that information. Later on, all of that data might just come in handy!

Curious about how data can help you in your user acquisition and engagement efforts? Download our eBook to learn more about how to harness the power of mobile data and turn it into smart strategy.

DOWNLOAD THE eBOOK

3. The ability to match schemas to your CRM.

Although contextual mobile data can be a huge complement to other customer data captured in your CRM (customer relationship management) solution, many brands prefer not to integrate the two platforms completely. But if you can set up the data fields in your mobile app content management platform according to the schema used in your CRM, you can merge the mobile data with your CRM data seamlessly, without needlessly exposing your entire CRM database. At the same time, you’d be helping your mobile app partner understand your needs better, so they can make smarter recommendations for further data collection.
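
In practice, schema matching can be as simple as a field-name mapping maintained alongside the CMS. A sketch, with entirely hypothetical field names on both sides:

    // Hypothetical mapping from app CMS field names to CRM column names.
    // Agreeing on this up front makes the eventual merge a rename,
    // not a re-modeling exercise.
    val cmsToCrmSchema = mapOf(
        "userEmail" to "Contact.Email",
        "lastScreenViewed" to "Contact.LastMobileScreen",
        "engagementSegment" to "Contact.MobileSegment"
    )

    // Translate one mobile record into CRM-ready shape, dropping any
    // fields the CRM schema doesn't know about.
    fun toCrmRecord(cmsRecord: Map<String, String>): Map<String, String> =
        cmsRecord.mapNotNull { (field, value) ->
            cmsToCrmSchema[field]?.let { crmField -> crmField to value }
        }.toMap()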

Ultimately, having a cloud-synced and highly flexible app CMS boils down to ease of use and better data. The more you can program into the content management user interface, the easier it will be for non-technical staff to use it successfully. At the same time, the more flexible the CMS, the better you can use it to support your data collection needs now and in the future. (Because, as we like to say, you can’t collect it if you don’t tag it.)

Effective content management is just one key to success in your mobile strategy. Want to know more about the rest of the app lifecycle? Download our eBook to learn about the mobile application lifecycle and how you can use it to reach your business goals.

DOWNLOAD THE eBOOK

Want to learn more questions you should be asking a potential mobile partner? Check out our Best Practices for Writing a Mobile RFP. If you’d like specific information on Phunware’s content management, visit our content management solution page.

The post 3 Must-Haves for Effective App Content Management appeared first on Phunware.

]]>
http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/3-must-haves-effective-app-content-management/feed/ 0
Phunware Team Takeaways from Google I/O 2017 http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/phunware-takeaways-google-io-2017/ http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/phunware-takeaways-google-io-2017/#respond Fri, 02 Jun 2017 20:34:18 +0000 http://127.0.0.1/?p=29675 Returning to work after an action-packed conference is always an effort, and for the Phunware team that attended Google I/O, that felt especially true. This year, our group of ten attendees included members from multiple departments including engineering and creative. We were delighted and excited by what we saw and learned. We were stoked to […]

The post Phunware Team Takeaways from Google I/O 2017 appeared first on Phunware.

]]>
Returning to work after an action-packed conference is always an effort, and for the Phunware team that attended Google I/O, that felt especially true. This year, our group of ten attendees included members from multiple departments including engineering and creative. We were delighted and excited by what we saw and learned.

We were stoked to interact with and learn from the Android community, and especially to see the current and potential uses for new Google Assistant features. Here are some takeaways from our favorite sessions, things we’re looking forward to and a little Android-related “phun.”

What Excited You Most at Google I/O 2017?

We asked our group to weigh in on the announcements and products they were most inspired and excited by at Google I/O 2017. Here’s what they had to say:

“Apart from the new Android announcements (like Kotlin and Android Architecture Components), I was most excited about the other conference attendees. Seeing so many passionate developers in one place really gets me inspired.”
– Dustin Tran, Software Engineer (DT)

“I was most excited about Kotlin and the new Android Architecture Components stuff, but I am also very interested in the Google Assistant API and writing apps for that platform. Android Things was also really cool to see in action.”
– Alex Stolzberg, Software Engineer (AS)

“I was most excited about the incredible community collaboration focus this year. So many of the announcements came about because the Android dev community asked for specific things. Google recognized that and invited non-Googlers from the community on stage for the first time ever.”
– Jon Hancock, Software Engineer (JH)

“I really enjoyed talking to some of the Google design team and going to the sessions on the Google Assistant.”
– Ivy Knight, UX / UI Designer (IK)

“I was really excited to be a part of such a huge conference—and to hang out with the California-based Phunware devs I only see every couple of years.”
– Sean Gallagher, Software Architect (SG)

“What was I most excited about at I/O? The amount of code and time we can save with Kotlin and the new Architecture Components.”
– Nick Pike, Software Architect (NP)

Want to stay up to date on the latest and greatest in mobile news? Subscribe to our monthly newsletter!
SUBSCRIBE TO THE NEWSLETTER

What Was the Most Impressive Session at I/O 2017?

Thanks to our ten-person Phunware team, we were able to attend a broad selection of the 150+ sessions offered at I/O this year. Which impressed us the most?

“I was most impressed by What’s New in Android, where we learned about many tools—like an official Android emulator with Google Play pre-installed, and Android Profiler, which allows precise, real-time app monitoring—that will make Android development much easier. Equally impressive were the Architecture Components sessions. Google has realized that developers often have to solve the same problems: making network calls that retrieve data across orientation changes, and caching / persisting that data. Now, they’re providing easier-to-use and standardized components to utilize when implementing these common use cases.”
– DT
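
For readers who didn’t make it to those sessions: the “orientation changes” problem DT mentions is exactly what the new ViewModel and LiveData components address. A minimal Kotlin sketch of the pattern, assuming today’s androidx packaging (at I/O 2017 these classes shipped under android.arch.lifecycle):

    import androidx.lifecycle.LiveData
    import androidx.lifecycle.MutableLiveData
    import androidx.lifecycle.ViewModel

    // The Activity is destroyed and recreated on rotation, but this
    // ViewModel (and the data it holds) is retained across that change.
    class SessionListViewModel : ViewModel() {
        private val _sessions = MutableLiveData<List<String>>()
        val sessions: LiveData<List<String>> get() = _sessions

        fun loadSessions() {
            if (_sessions.value == null) {
                // Placeholder for a real network call; it runs once,
                // not again on every orientation change.
                _sessions.value = listOf("What's New in Android", "Introduction to Kotlin")
            }
        }
    }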

“My favorite session was probably the Android Things talk about Developing for Android Things in Android Studio.”
– AS

“My favorite session was Introduction to Kotlin because of the sheer number of jaw-dropping moments.”
– JH

“Building Apps for the Google Assistant got me excited to try building an Assistant app myself. API.ai looks great.”
– IK

“My favorite session was the Office Hours during which we got some really good one-on-one time with Android NDK team devs. They answered a lot of tough questions. Not only were they helpful, they were also great folks!”
– SG

“Life is Great and Everything Will Be Ok, Kotlin Is Here! (Pretty self-explanatory, right?)”
– NP

How About the Best I/O 2017 Puns?

One of the best things about attending conferences like I/O is the inside jokes. In case you’re feeling left out, here are some of the Phunware team’s favorite (terrible) Android-related puns:

“An Android app walks into a bar. Bartender asks, ‘Can I get you a drink?’ The app says, ‘That was my _intent_!'”
– DT

“Ok Google, give me an Android-related pun…”
– AS

“Android puns just require too much Context.”
– JH

“Can’t wait to check out all the FABulous Materials at I/O.”
– IK

“Need some space to store your app data? Google just gave us lots of Room.”
– NP

Interested in joining the Phunware Android dev team and possibly heading to I/O yourself next year? Check out our open opportunities and apply today!

The post Phunware Team Takeaways from Google I/O 2017 appeared first on Phunware.

]]>
http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/phunware-takeaways-google-io-2017/feed/ 0
Dos and Don’ts for Conference Attendees http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/dos-and-donts-for-conference-attendees/ http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/dos-and-donts-for-conference-attendees/#comments Mon, 01 May 2017 17:41:04 +0000 http://127.0.0.1/?p=29270 Because the mobile world moves so fast, everybody at Phunware works hard to stay up-to-date on the latest trends and technologies. I’m sure it’s the same in your world. That’s what makes conferences so important. I’m serious! Conferences offer a lot more than just networking. They’re a really great way to take a deep dive […]

The post Dos and Don’ts for Conference Attendees appeared first on Phunware.

]]>
Because the mobile world moves so fast, everybody at Phunware works hard to stay up-to-date on the latest trends and technologies. I’m sure it’s the same in your world. That’s what makes conferences so important.

I’m serious! Conferences offer a lot more than just networking. They’re a really great way to take a deep dive into specific areas of your business—alongside some of the best and brightest—so you can get inspired and informed all at the same time.

Three weeks ago, Phunware Senior QA Engineer Pavan Kovurru and I went to the Selenium Conference (“SeConf” for short) in Austin. Selenium is an important testing tool that our Product QA team uses to make sure everything we develop at Phunware works properly. Pavan and I attended SeConf for the opportunity to upgrade our Test Automation knowledge and implement the latest tech into our codebase. But we also learned a lot that translates to any conference you might go to.

  1. DON’T miss the keynote speeches.
    These are usually presented by true visionaries, and the insights revealed illustrate why they are leaders in their field. For example, one of the keynotes at SeConf was presented by Jim Evans of Salesforce.com, who gave a really engaging personal history of his 25 years in the software industry. For many in the audience who weren’t even born when he began his career, Jim’s talk provided incredible perspective from someone who’s been deeply involved in the open-source Selenium project for years.
  2. DO treat every break as another opportunity to learn.
    Whether it’s time for coffee or lunch (or tacos, because it’s Austin), each break is a chance to meet someone and discover something new. During a break at SeConf, a fellow attendee opened my eyes to what it’s like to start a tech career as the only Automation Tester in a team of 15 developers—and the only female, to boot! (And I thought my job was tough…)

  3. DO divide and conquer whenever possible.
    If you’re not going solo, plan out the conference with your teammates so you can cover as many interesting talks as possible. Afterwards, you can share notes and discuss what you learned. This way, your team can cover more territory, which benefits everybody. Pavan and I definitely approached SeConf this way—and some of the most notable talks we attended were also recorded.

  4. DON’T multitask.
    This one may sound counterintuitive, but resist the urge to hop on your laptop and work on other stuff during talks. There is a reason why you chose to spend time at a conference. Your computer and work will still be there waiting for you after the talk. Make good use of that time to be fully present, so you can take everything in.
  5. DO remember that knowledge is like happiness—it’s meant to be shared.
    This may be the most important tip I can give you: share as much of what you’ve learned as you can with your entire team and your company. One of the things Pavan and I shared with the Phunware QA team was inspired by a panel given by the CTO of Applitools, in which he talked about using his company’s own tools to test its highly complex visual test automation service. Following the panel, we successfully completed a proof-of-concept using Applitools to compare how different devices render mobile pages across many versions of test builds. After sharing what we learned with our team, Pavan and I helped save everybody time and effort—and saved the company money.

In the end, the whole Product QA team benefitted from Pavan’s and my attendance at SeConf, so it was more than worth our time. Want to know where Phunware will turn up next? Check out our upcoming conferences and events. Hopefully, we can meet up over some coffee (or maybe a taco…).

The post Dos and Don’ts for Conference Attendees appeared first on Phunware.

]]>
http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/dos-and-donts-for-conference-attendees/feed/ 1
Best Practices for Writing a Mobile RFP http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/best-practices-writing-mobile-rfp/ http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/best-practices-writing-mobile-rfp/#respond Thu, 06 Apr 2017 22:17:30 +0000 http://127.0.0.1/?p=29126 Requests for proposal (RFPs) are an important part of any major custom mobile initiative. It’s common for large or complex businesses to use RFPs to solicit mobile solution ideas and strategies, particularly if they have a physical venue with which the application is supposed to interact (stadium, hospital, port, mixed-use city center, etc.). Businesses don’t […]

The post Best Practices for Writing a Mobile RFP appeared first on Phunware.

]]>
Requests for proposal (RFPs) are an important part of any major custom mobile initiative. It’s common for large or complex businesses to use RFPs to solicit mobile solution ideas and strategies, particularly if they have a physical venue with which the application is supposed to interact (stadium, hospital, port, mixed-use city center, etc.).

Businesses don’t ask for proposals for point solutions or products they can buy off the shelf. By their nature, RFPs are strategic—they’re for finding custom, long-term solutions that fit in with the soliciting company’s larger vision and goals. If you’re structuring an RFP, do it thoughtfully and carefully. The requirements you put forth can significantly impact the quality of the mobile solution (and partner) you end up with.

I have already discussed the importance of understanding mobile as an iterative cycle and suggested ways to structure your org and team for success across that cycle. Given that branded mobile application portfolios, built around richly defined mobile experiences, have the power to drive enterprise-wide digital transformation, let’s discuss how your procurement group can structure RFPs to facilitate that transformation.

If you plan for the full mobile lifecycle at the RFP stage, you will be able to find a mobile partner who doesn’t just build your app, but can help you acquire, engage and monetize your mobile audience in line with your business goals.

Mobile RFP Requirements for the “Strategize” Phase

In the mobile application lifecycle, the “Strategize” phase involves defining the amazing experience you want users to have with your app, then outlining use cases and feature sets.

If your project is complex enough to necessitate an RFP, it’s important to select a partner with proven success in similar, equally complex projects.

To uncover indicators of a bidder’s experience and capabilities, ask for examples of:

  • Custom flagship applications / application portfolios for known brands. This demonstrates credibility, along with creative and technical expertise.
  • Live event applications supporting live-streaming video and other real-time content. This demonstrates that the bidder has robust, stable technology and the ability to integrate with third-party providers.
  • Successful in-venue mobile experiences at stadiums, hospitals, etc. This demonstrates the technical capabilities required to integrate with a host of hardware providers to create a seamless, engaging mobile user experience. Bidders should be prepared to support location technologies including high- and low-density Wi-Fi and both physical and virtual beacons.

You will also want to define functionality and feature set requirements in this part of your mobile RFP. As applicable, request the following:

  • Simple app content management
  • Context-triggered user engagement via push notifications and mobile engagement
  • Indoor wayfinding and navigation
  • Video streaming
  • Campaign and app analytics
  • The flexibility to integrate with multiple third-party software providers (such as customer relationship management [CRM] platforms, electronic health record [EHR] systems, loyalty, commerce, etc.)

Mobile RFP Requirements for the “Create” Phase

Source: eMarketer, March 2017

During the “Create” phase of the mobile application lifecycle, you bring your app to life in the way that makes the most sense for your budget, timeline and in-house capabilities. Depending on your unique timeline and goals, you can expect a mobile solution to cost anywhere from $100K to $500K (if you’re licensing software), or $1M to $5M (if you’re building a completely custom application).

If these numbers seem high, think about how much you allocate to other channels (email marketing, field marketing, etc.). Now think about how much time your target customers spend in those channels relative to how much time they spend on mobile. This simple exercise in perspective will help you see how worthy of an investment you’re making.

A word to the wise: avoid the temptation to go with the cheapest technology option, which may come disguised as a so-called “write once, run anywhere” codebase. Non-native development is a shortsighted non-solution to a very real problem, and bidders who propose HTML-based or other non-native development will not be equipped to support you through the mobile lifecycle. Look for a bidder who proposes a native mobile solution, built to the design standards of each platform and leveraging the uniquely mobile capabilities of each operating system and device. (Here are those design standards, if you’re interested: Apple Human Interface Guidelines, tvOS Human Interface Guidelines, Google Material Design, Android TV Guidelines.)

Mobile RFP Requirements for the “Launch” Phase

During the “Launch” phase of the mobile app lifecycle, you work to get your app noticed and build the audience that will use it. This requires a partner with robust performance marketing and data science capabilities. Make sure that your RFP requests detailed information on how the bidder will support app discovery and user acquisition, backed by real-world examples of launch and audience building success with other apps.

Also make sure to request a thorough explanation of the bidder’s customer support approach during and after the launch period. The more complex the project, the more support you should expect and request.

Mobile RFP Requirements for the “Engage and Monetize” Phase

In the “Engage and Monetize” phase of the mobile application lifecycle, you map out how you intend to keep app users engaged and drive revenue through your mobile efforts. There are many options—from a simple download charge or subscription model for your app to in-app purchases, in-app advertising, driving sales and foot traffic via contextually relevant messaging and more. And all of it generates troves of valuable data that you’d be foolish not to leverage.

To identify a prospective partner who can appropriately support this phase, your mobile RFP should ask that bidders describe their background in supporting:

  • In-app monetization with various ad formats as well as a top-quality ad network (if relevant)
  • In-app purchases and loyalty program integrations
  • SMS-based marketing campaign planning and execution, as well as the features and capabilities of any contextual marketing tools offered
  • Leveraging contextual data to impact marketing, operations and other business-critical initiatives

As you can see, when you plan for all phases of your mobile application lifecycle early—particularly by addressing them up front in your RFP—you not only streamline the process, but also increase your chances of driving real returns on your mobile investment.

I hope you’ve enjoyed this series on how to position your organization for success across the mobile application lifecycle—in essence, get your head right, get your team right and get your RFP right so you can find the right mobile partner. The mobile space is inherently partner-driven and collaborative, but there’s nothing worse than being stuck with a mobile provider who can’t (or won’t) act as a strategic partner.

For more tips and insights, feel free to get in touch. In the meantime, why not learn about the ins and outs of the mobile app lifecycle? Download our eBook on the subject below.

DOWNLOAD THE eBOOK

The post Best Practices for Writing a Mobile RFP appeared first on Phunware.

]]>
http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/best-practices-writing-mobile-rfp/feed/ 0
App Development: Should You Do It In-House or Outsource It? http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/app-development-should-you-do-it-in-house-or-outsource-it/ http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/app-development-should-you-do-it-in-house-or-outsource-it/#comments Wed, 05 Aug 2015 14:25:38 +0000 http://127.0.0.1/?p=21010 It’s December 24th. I’m staring at this year’s hottest gift, the Barbie Dreamhouse™. There are hundreds of pieces spread across the living room and instructions written in eight languages, none of which make any sense to me. Time is not on my side either. My four-year-old will walk down the stairs in a few hours […]

The post App Development: Should You Do It In-House or Outsource It? appeared first on Phunware.

]]>
It’s December 24th. I’m staring at this year’s hottest gift, the Barbie Dreamhouse™. There are hundreds of pieces spread across the living room and instructions written in eight languages, none of which make any sense to me.

Time is not on my side either. My four-year-old will walk down the stairs in a few hours and I don’t think she’ll understand the concept of “assembly required” from Santa. So I dig in and prepare for a long night.

After watching several YouTube instructional videos and enlisting some family members (who worked for beer), we finish the project. It was stressful, time-consuming and certainly outside of my skillset. Seeing my daughter’s eyes light up in the morning was rewarding, but I’m not sure her reaction would have been any different if there had been an outsourcing option to build the pink monstrosity. She might have liked it better, in fact—we couldn’t get the elevator to work.

The in-house vs. outsource dilemma plagues businesses too, particularly when it comes to mobile development. Many companies are scrambling to generate mobile apps in-house, believing that it’s cheaper, easier, faster, more controllable and more efficient. If your organization is weighing this decision, consider the following.

Challenges of In-House App Development

Three main challenges arise when businesses attempt to keep all mobile app development in-house:

1. The skillset struggle is real.

Even with up to 15 developers working on their mobile app initiatives, 94 percent of organizations don’t have the necessary mobile development staff to tackle all of their needs. Almost half of software solutions architects and senior software developers say there’s a gap in the skills required for mobile development.

Android and iOS development require different and fairly complicated coding languages—Java for Android and Objective-C or Swift for iOS. The average Android developer can’t just switch over to coding for iOS without additional training or study. Creating apps for both platforms effectively means two development efforts and skillsets.

2. It’s expensive and time-consuming.

A bare-bones internal mobile development team might consist of a mobile designer, one or two developers, a project manager and a quality assurance (QA) engineer. Even if you already have some of these folks on staff, you likely need to hire at least one person. It can take weeks to get the HR process rolling and find the right person, and even more time to get them fully on board (average of 3-6 months).

Recruitment and hiring don’t just take time. They take money. Consider the cost of advertising job listings, hiring recruiters, performing background checks and covering relocation expenses—not to mention the developer’s six-figure salary and the cost of technology, licensing fees, software certificates and more.

3. Developing mobile apps in-house can be risky.

If you decide to keep all of your mobile app development in-house, how can you be sure your team’s skills are top-notch? Are you savvy enough to differentiate between a decent coder and a mobile expert? Most people aren’t.

Scalability can also become an issue with an in-house team. What if your project scope expands? As we’ve already established, it’s not so easy to just plug in an additional coder. Accountability can also present challenges. Without specific mobility expertise, decision-makers may struggle to identify the nature and root causes of any problems that arise, leaving the project stalled out without a plan for moving forward.

Mobile is everywhere. Stay up to date on the latest mobile news, trends and content with our monthly newsletter!

SUBSCRIBE TO THE NEWSLETTER

Advantages of Outsourcing App Development

Outsourcing your mobile app development to a firm that specializes in mobile can be a very strategic decision—one that saves you time, hassle, and money while yielding a better-quality product. Here are a few advantages of letting someone else handle your mobile app development:

  1. Fixed costs for a specific scope and delivery.
  2. Less lag time: An outside team can usually start immediately.
  3. Synergy: An established team will have a solid working relationship with each other and with the required technologies.
  4. Accountability: A good mobile firm will give you a solid contract and scope of work, with clearly defined responsibilities and terms. If a mistake or delay occurs, you will have a dedicated account rep to address the problem. There’s a lot less to worry about.
  5. Access to plug-and-play features and modules: Many app features and modules are relatively standard. It’s how you use them that makes the app unique and special. An experienced app development team will have an existing library of these standard products already tested and optimized. There’s no need to build every feature from scratch when you can simply customize a proven solution. This saves time and money, while ensuring performance.
  6. Greater experience and expertise: Because of their focus on mobility, an outsourced team will be on top of the latest trends and technologies. They can share best practices gained from extensive experience and ensure that your app is in line with your vision and your target audience. A dedicated mobile expert can remove the guesswork and put your company and its app in the best possible situation to succeed.
  7. Options: You can outsource part or all of your app development. You can split the work, outsourcing iOS development while keeping Android in-house (or vice versa). You can use outsourced staff augmentation to fill gaps in your in-house development strategy. Or you can outsource the app discovery process, letting third-party pros develop your roadmap.

Ultimately, this decision comes down to cost and risk. Businesses are under intense pressure to maintain a competitive presence in the mobile space, and it’s natural to want to keep mobile development in-house. But that often doesn’t make the best business sense. And your proverbial Barbie Dreamhouse might end up with a non-working elevator.

Ready to put mobile first and win? Download Mobile First: Harnessing the App Lifecycle for Transformative Business Success for a strategic and tactical model with actionable items for every stage of the process.

DOWNLOAD THE eBOOK

The post App Development: Should You Do It In-House or Outsource It? appeared first on Phunware.

]]>
http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/app-development-should-you-do-it-in-house-or-outsource-it/feed/ 1
The Perfect Moment for Enterprise Mobile Apps http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/perfect-moment-enterprise-mobile-apps/ http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/perfect-moment-enterprise-mobile-apps/#respond Thu, 11 Dec 2014 11:07:36 +0000 http://127.0.0.1/?p=16302 Enterprise technology is exciting again. (ZDNet even calls it “sexy.” Who knew? Well, actually, we did.) How did this happen? After a sleepy couple of decades, several factors aligned to create the perfect moment for an enterprise mobility revolution. First of all, the incredible proliferation of mobile devices has created an enormous sea change. Nearly […]

The post The Perfect Moment for Enterprise Mobile Apps appeared first on Phunware.

]]>
Enterprise technology is exciting again. (ZDNet even calls it “sexy.” Who knew? Well, actually, we did.)

How did this happen? After a sleepy couple of decades, several factors aligned to create the perfect moment for an enterprise mobility revolution.

First of all, the incredible proliferation of mobile devices has created an enormous sea change. Nearly one-third of employees would rather lose their wallets than lose their personal mobile devices. Smartphone and tablet users have become accustomed to easy access to new apps, great design, and an “it just works” user experience. At the same time, they’ve also become more comfortable with iterative improvements—so long as those updates happen seamlessly, without a lot of fuss and effort.

Behind the scenes, the maturity of cloud hosting and the flourishing open-source movement have made advanced functionality easier than ever to develop, access and support. Capabilities imported from social media are opening up new levels of collaboration and synergy.

Add it all up, and conditions are ripe for real innovation in the enterprise space.

BUT the enterprise is facing a huge bottleneck in developing mobile apps. A recent survey of enterprises in the U.S. and the U.K. found that 85 percent had a development backlog of at least one app—and 50 percent had more than 10 apps stuck in the pipeline.

The post The Perfect Moment for Enterprise Mobile Apps appeared first on Phunware.

]]>
http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/perfect-moment-enterprise-mobile-apps/feed/ 0
API, SDK—WTF? Understanding the Mobile App Alphabet Soup http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/api-sdk-wtf-understanding-mobile-app-alphabet-soup/ http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/api-sdk-wtf-understanding-mobile-app-alphabet-soup/#respond Thu, 04 Dec 2014 04:11:44 +0000 http://127.0.0.1/?p=16081 Photo by Kyle Mills Hall When it comes to digital and mobile technology, an awful lot of acronyms get tossed around. Everybody just nods and smiles, but many of us don’t really know what all that jargon really means…and nobody wants to raise their hand and say, “Hey, what IS an API, anyway? What about […]

The post API, SDK—WTF? Understanding the Mobile App Alphabet Soup appeared first on Phunware.

]]>
Photo by Kyle Mills Hall

When it comes to digital and mobile technology, an awful lot of acronyms get tossed around. Everybody just nods and smiles, but many of us don’t really know what all that jargon really means…and nobody wants to raise their hand and say, “Hey, what IS an API, anyway? What about an SDK?”

It’s understandable. If you don’t work with them every day, you don’t have much reason to know a lot about SDKs or APIs. We’re going to break them down for you in plain English, so at your next cocktail party or digital marketing meeting, you can throw around those acronyms like a boss (or at least a knowledgeable digital player).

What’s an API?

“API” stands for “application programming interface.” It’s an interface that specifies the way two applications or systems can interact with each other. It lays the ground rules for the conversation.

We all deal with APIs every day. For example, let’s say you’re hungry for pizza. When you open Yelp and search for “pizza,” the app sends a request to the back-end database, using the appropriate API. That request looks something like this: “getNearbyPOIForLatLong()”.

The Yelp API requires certain parameters in your request so that the back-end system knows which data to return—latitude and longitude for your location, the search term (in this case, “pizza”), and a search radius (how far to search around your location). Now the Yelp database knows what information to retrieve, the app can display that info on a map or as a list, and you can get your pineapple anchovy pizza with extra olives ASAP.
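
In code, that contract might look something like the sketch below. The function name mirrors the example above; everything else (parameter names, return type) is our own illustrative guess, not Yelp’s actual API:

    // Illustrative contract for a nearby-search API (not Yelp's real interface).
    data class PointOfInterest(val name: String, val latitude: Double, val longitude: Double)

    interface NearbySearchApi {
        // The API lays the ground rules: which parameters the caller must
        // supply, and what shape of data comes back.
        fun getNearbyPOIForLatLong(
            latitude: Double,  // your location
            longitude: Double,
            term: String,      // e.g. "pizza"
            radiusMeters: Int  // how far around you to search
        ): List<PointOfInterest>
    }

The app supplies your coordinates, the term “pizza” and a radius; the back end returns a list the app can render on a map or as search results.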

What’s an SDK?

“SDK” is short for “software development kit.” It refers to a set of pre-written code, documentation and programming tools that developers can use as the foundation for creating new software applications. A mobile SDK is a kit designed specifically for creating apps for mobile devices.

Let’s say your neighborhood Pizza Plaza wants to build an app that incorporates push notifications—those little messages that pop up on your smartphone’s lock screen—to tell customers when their order is ready or when they’re running a special deal on calzones.

To enable this communication, the Pizza Plaza developer(s) might use Phunware’s push notifications SDK. The SDK would contain everything the developer needs to deliver the right message to the right recipient at the right time. Using Phunware’s SDK would make adding push notifications pretty much plug-and-play, saving Pizza Plaza’s dev team a lot of time and hassle.

The value of the SDK is that Pizza Plaza does not have to build all of the services associated with the functionality (in this case, push notifications). Instead, Pizza Plaza can focus on making delicious pizzas and running its business—not on app development.
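
To see why that’s effectively plug-and-play, here’s a hedged Kotlin sketch of what integrating such an SDK usually looks like from the app side. PushSDK and its methods are hypothetical stand-ins, not Phunware’s real interface:

    // Hypothetical push-notification SDK surface (illustrative only).
    object PushSDK {
        fun initialize(appId: String, apiKey: String) { /* wires up the delivery service */ }
        fun registerDevice(userId: String) { /* associates this device with a user */ }
        fun subscribe(topic: String) { /* e.g. "order-status", "calzone-deals" */ }
    }

    fun onAppLaunch() {
        // A few lines of integration instead of building delivery,
        // targeting and scheduling infrastructure from scratch.
        PushSDK.initialize(appId = "pizza-plaza", apiKey = "YOUR_API_KEY")
        PushSDK.registerDevice(userId = "customer-42")
        PushSDK.subscribe(topic = "order-status")
    }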

Interested in learning more about these kinds of push notifications? Learn all about mobile marketing automation in our eBook: Mobile Marketing Automation: Why It Matters and How to Get Started.

DOWNLOAD THE eBOOK

What does all of this mean to you?

If you are considering building a mobile app (or having one built) for your business, SDKs and APIs are your friend. They can make the whole process much more efficient because a developer can use SDKs and APIs to add functionality without having to reinvent the wheel each time.

Here at Phunware, we can make you a custom mobile app from concept through completion. We also have pre-packaged SDKs that give your developers access to Phunware features. For example, our advertising SDK can be added to an existing app, giving it the ability to run ads from the Phunware Advertising network in part of the app’s real estate.

To sum it all up: an SDK helps you build the app. An API lets the app communicate with various web services to deliver really cool functionality. And Phunware is here to help. Get in touch to learn more!

CONTACT US

The post API, SDK—WTF? Understanding the Mobile App Alphabet Soup appeared first on Phunware.

]]>
http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/api-sdk-wtf-understanding-mobile-app-alphabet-soup/feed/ 0
The iOS 7 Survival Guide for Mobile Application Developers http://ec2-35-80-203-221.us-west-2.compute.amazonaws.com/ios-7-survival-guide-mobile-application-developers/ Thu, 26 Sep 2013 10:23:13 +0000 http://tapit-qa.enniscreates.com/?p=865 While much of iOS 7 may seem cosmetic and surface level, mobile application developers now have access to amazing new capabilities for enhancing user-experiences to keep people coming back for more…we created a ten page guide to help you capitalize on the opportunity. Download Phunware iOS 7 Survival Guide The iOS 7 Survival Guide for […]

The post The iOS 7 Survival Guide for Mobile Application Developers appeared first on Phunware.

]]>
While much of iOS 7 may seem cosmetic and surface-level, mobile application developers now have access to amazing new capabilities for enhancing user experiences and keeping people coming back for more…so we created a ten-page guide to help you capitalize on the opportunity.

Download Phunware iOS 7 Survival Guide

The post The iOS 7 Survival Guide for Mobile Application Developers appeared first on Phunware.

]]>