Sometimes we get asked by customers "what are the minimum network speed requirements for BlueWorx?" That’s a good question; unfortunately, the answer is that it depends on several factors, including:

  • The size of the Sync data:
  • The number and size of the Work Orders and Notifications – since these transfer in their entirety each sync. By size I mean the number of operations, technical objects, etc.
    • The number, size and stability of the Technical Objects and associated elements – since these generally transfer as delta changes, but it still takes time to work out what is needed. By size I mean the breadth of data maintained for each item in terms of the core asset records, classification characteristics, measurement points, long text, etc.
    • The use and size of material records, and by size I mean Plants, Storage Locations, Serial and Batch data.
    • The use of BlueWorx Crew
    • The use and transfer of documents
    • The use of GEO spatial maps, layers, offline content 
  • The variation in user profiles and the degree of difference this creates between sync speeds for different users
  • Network loads at different times and over different mediums: the availability of Wi-Fi versus cellular data and the performance variations of these. Sometimes an available Wi-Fi connection may be considerably slower than a cellular one, and performance (both cellular and Wi-Fi) often varies by location, even in the same Plant/locale
  • The SAP system loads during different time periods, and the variation of system specifications and performance between development, test and production SAP instances.
  • SAP Gateway and Web Dispatcher performance, or that of other reverse proxy/intermediary network appliances
  • Esri ArcGIS system response speeds - if used
  • The Neptune Software release
  • The BlueWorx release
  • The types of devices in use and their performance

Independent of these technical variables is the important basic question of ‘what’s acceptable to your users’ and when does the sync speed they experience begin to negatively impact their effective use of the tool? And that may be influenced by non-technical factors like previous application experience (SAP and mobile), work processes and pressures, industrial relations, etc. For example, if you do two syncs a day and carry on working while they process, then you’ll be more likely to accept a slower speed. Conversely, in other industries the need for speed in a more dynamic workplace might mean less tolerance for longer sync times.

While actual device and application trials at workplaces would provide the best metrics, that’s not always practical for a number of reasons. So another way to help get a sense of the impact of network performance is to simulate the connection speeds. A suggested approach is covered in the following section.

Simulation Testing

Using the Chrome browser in Developer Mode with the Network throttling option, you can simulate various network speeds. A suggested approach is:

  • Create a BlueWorx Performance Test Profile that simulates the data load likely to be experienced by the target users. This may vary by Region, Plant, Job Role, etc
  • Apply the Test Profile to a BlueWorx Test User
  • Run a Full Sync (BlueWorx application Home menu > hamburger menu lower right > Perform Sync) with the Test User in a Chrome browser on your desktop. We don’t use a device browser because it can’t go into the debug mode we use later in the test. Time a sync from Sync Start to Sync Completed. Note that while you can get the overall times from the BlueWorx Sync Logs in the admin app, using a stopwatch is just as easy. Re-run a few times to get an average.
  • Now repeat the timing from the previous step in the BlueWorx application on devices – i.e. iOS, Android, etc. – to get their average. Use the same connection as the browser – i.e. the same Wi-Fi at the same location.
  • Determine the relative difference between the browser and device application sync speeds – e.g. the browser syncs may be 20% faster. This is your Baseline.
  • Now, in the Chrome browser on the desktop, get the fastest connection possible (i.e. LAN), go into Chrome Developer Mode, Network view, and select the speed you want to simulate. As with the previous steps, run a series of Full Syncs, record the times and get an average.
  • Repeat all these steps, but run Delta Syncs instead of Full Syncs.

You can now see the relative performance variations between syncs at the various simulated network speeds. Factor in the percentage difference from your baseline testing. The lowest acceptable limit, as previously discussed, is something only you can assess.
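The averaging and baseline arithmetic above can be sketched in a few lines of Python. All of the timing values here are illustrative placeholders – substitute your own stopwatch readings:

```python
from statistics import mean

# Hypothetical Full Sync timings in seconds (replace with your own readings)
browser_full_syncs = [120, 118, 125]   # desktop Chrome, unthrottled
device_full_syncs = [150, 145, 148]    # BlueWorx app on device, same Wi-Fi

browser_avg = mean(browser_full_syncs)
device_avg = mean(device_full_syncs)

# Baseline: how much slower (or faster) the device is relative to the browser
baseline_factor = device_avg / browser_avg
print(f"Baseline: device syncs take {baseline_factor:.2f}x the browser time")

# Throttled browser runs (e.g. Chrome DevTools set to a simulated slow profile)
throttled_browser_syncs = [310, 295, 305]
throttled_avg = mean(throttled_browser_syncs)

# Estimated device sync time at the simulated speed, scaled by the baseline
estimated_device_time = throttled_avg * baseline_factor
print(f"Estimated device sync at simulated speed: {estimated_device_time:.0f}s")
```

The same calculation applies to your Delta Sync runs; only the input timings change.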

Note: The Chrome browser settings for network speeds allow you to add your own profiles. So you can run connection speed tests at your intended locations (e.g. factory floors, remote cell connections) using readily available internet speed test sites, and create your own network simulation speeds. If you do this, it’s important that it’s done from the actual workplace and not from admin locations at the Plant, as these are invariably faster.
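As a rough sketch of turning a speed-test reading into a custom throttling profile: Chrome’s custom profile dialog takes download/upload rates and latency, but the field names and the Mbit-to-kbit conversion below are assumptions – check the units against the DevTools dialog in your Chrome version:

```python
def to_devtools_profile(download_mbps: float, upload_mbps: float,
                        latency_ms: int) -> dict:
    """Convert a speed-test reading (Mbit/s) into the units a Chrome DevTools
    custom network profile typically expects (kbit/s and milliseconds)."""
    return {
        "download_kbps": round(download_mbps * 1000),
        "upload_kbps": round(upload_mbps * 1000),
        "latency_ms": latency_ms,
    }

# Hypothetical speed-test result measured on a factory-floor cellular link
profile = to_devtools_profile(download_mbps=4.2, upload_mbps=1.1, latency_ms=120)
print(profile)  # {'download_kbps': 4200, 'upload_kbps': 1100, 'latency_ms': 120}
```

Measuring at several workplace locations and keeping the slowest result as your custom profile gives you a worst-case simulation.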

Testing Limitations

This testing is about seeing the impact of degraded network performance. It's not a true benchmark test for data loads because:

  • Primary testing is undertaken on a desktop browser. The browser doesn’t take as much data as the mobile application, and different storage technologies are used. For example, a browser doesn’t store documents, confirmation history, or technical object long text; when BlueWorx runs in a web browser this information is retrieved from the connected SAP system on demand.
  • A large part of the sync job time can be the assembly and retrieval of data in SAP. That will vary according to the performance of the SAP instance and the load it’s under during testing.

If you want to undertake Device and Profile Sync performance testing, rather than Network performance testing and benchmarking, then take a look at the BlueWorx Admin Sync Logs. From there you can run tests on a variety of devices and export to Excel for ‘slice and dice’ reporting.