multi source

Published By: Salsify     Published Date: Mar 15, 2019
With over 10,000 SKUs across two divisions and multiple brands, seasonal and home decor manufacturer and distributor The Gerson Companies needed a better way to organize product market data and expand across hundreds of retailers. After investing in product experience management, the company was able to centralize product information in Salsify and empower its network of independent retailers with the data needed to sell successfully online. Featuring: Orin Borgelt, Chief Technology & Sales Officer. Learn more about the step-by-step approach The Gerson Companies team took to take control of their data and increase sales on the digital shelf:
• Build a centralized, flexible, and accessible source of product information to arm all divisions of Gerson with the most up-to-date product inventory.
• Meet requirements for retailers: The Gerson Companies uses Salsify to syndicate product information across the digital shelf for their B2C divisions.
• Develop new sales channels: Gerson uses Sa
Tags : 
    
Salsify
Published By: ServiceNow     Published Date: Mar 13, 2019
With the move to multicloud and DevOps comes complexity and accelerated development cycles. IT operations is hindered by the lack of visibility between clouds and on-premises resources. Multicloud capabilities need to enable operations to move to a proactive posture, gain end-to-end visibility, and operate at scale. Get the IDC brief to learn more.
Tags : 
    
ServiceNow
Published By: Group M_IBM Q119     Published Date: Mar 05, 2019
In this digital world, fast and reliable movement of digital data, including data of massive size over global distances, is becoming vital to business success across virtually every industry. The Transmission Control Protocol (TCP) that has traditionally been the engine of this data movement, however, has inherent bottlenecks in performance (Figure 1), especially for networks with high round-trip time (RTT) and packet loss, and most pronounced on high-bandwidth networks. It is well understood that these inherent “soft” bottlenecks are caused by TCP’s Additive-Increase/Multiplicative-Decrease (AIMD) congestion avoidance algorithm, which slowly probes the available bandwidth of the network, increasing the transmission rate until packet loss is detected and then exponentially reducing the transmission rate. However, it is less understood that other sources of packet loss, such as losses due to the physical network media that are not associated with network congestion, equally reduce the transmission rate.
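The AIMD behavior described above is easiest to see in a short simulation. The following is a minimal sketch for intuition only; the window unit, the one-update-per-RTT cadence, and the fixed per-round loss probability are simplifying assumptions, not how a real TCP stack is implemented:

    # Minimal AIMD (Additive-Increase/Multiplicative-Decrease) sketch.
    # Assumptions: congestion window measured in segments, one update per RTT,
    # loss modeled as a fixed probability per round. Not a real TCP stack.
    import random

    def simulate_aimd(rounds=100, loss_prob=0.01, add_step=1, mult_factor=0.5):
        cwnd = 1.0                   # congestion window (segments)
        history = []
        for _ in range(rounds):
            history.append(cwnd)
            if random.random() < loss_prob:
                cwnd = max(1.0, cwnd * mult_factor)  # multiplicative decrease on loss
            else:
                cwnd += add_step                     # additive increase per RTT
        return history

    if __name__ == "__main__":
        trace = simulate_aimd()
        print("final window:", trace[-1])

The point the abstract makes is visible in this toy model: any loss event, whether caused by congestion or by the physical medium, cuts the window sharply, so throughput on lossy, high-RTT paths stays well below the available bandwidth.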
Tags : 
    
Group M_IBM Q119
Published By: Nice Systems     Published Date: Feb 26, 2019
NICE WFM 7.0’s Forecaster unlocks a high level of transparency into interaction history, allowing you to centrally forecast, schedule and manage contacts between multiple locations and ensure that site- and enterprise-level objectives are met. With more than two thousand customers and two million users depending on its unparalleled ability to fine-tune the most precise forecasts, Forecaster allows you to plan and respond to the peaks and valleys of customer history through automatic collection of key historical data from all types of contact sources:
• Automatic call distributors (ACDs)
• Outbound dialers
• Multi-channel routing platforms
• Back-office employee desktops
Download today to learn more.
Tags : 
    
Nice Systems
Published By: Larsen & Toubro Infotech(LTI)     Published Date: Feb 18, 2019
The largest national multiline insurer had built a repository of insurance policies (P&C and life insurance) on microfilm and microfiche in the early ’90s as a preservation strategy. As this technology became outdated over time, the company was grappling with several issues:
• Risk of losing its only source of data for insurance policies and corresponding communication, and the need to improve data availability and the speed of claims evaluation
• Compliance issues, requiring WORM (write once, read many) storage compliant with FINRA regulations, with data encrypted at rest
• The total cost of digitization, which was not encouraging compared to the 10-12 years of support left to maintain the insurance policies
• The need for a low-cost, cloud-based, FINRA-compliant document management solution that could provide quick access to stored data
Download the full case study to learn how LTI’s e-Office solution enabled 50% TCO savings for the largest national multiline insurer.
Tags : 
    
Larsen & Toubro Infotech(LTI)
Published By: HERE Technologies     Published Date: Feb 12, 2019
Get higher quality, more accurate location data – and a safer, more profitable fleet – by choosing the right location services provider. The true value of a location platform comes from bringing together multiple data sources and presenting them in a meaningful way. Using a platform approach, you can help customers differentiate their service, increase margins and increase safety. So, in this guide, we cover the four key considerations for choosing a mapping and location service platform to ensure a high quality, accurate mapping service for you and your customers. Download the eBook
Tags : 
location data, transport & logistics, location services
    
HERE Technologies
Published By: Attunity     Published Date: Jan 14, 2019
This whitepaper explores how to automate your data lake pipeline to address common challenges, including how to prevent data lakes from devolving into useless data swamps and how to deliver analytics-ready data via automation. Read Increase Data Lake ROI with Streaming Data Pipelines to learn about:
• Common data lake origins and challenges, including integrating diverse data from multiple data source platforms, including lakes on premises and in the cloud.
• Delivering real-time integration with change data capture (CDC) technology that integrates live transactions with the data lake.
• Rethinking the data lake with a multi-stage methodology, continuous data ingestion and merging processes that assemble a historical data store.
• Leveraging a scalable and autonomous streaming data pipeline to deliver analytics-ready data sets for better business insights.
Read this Attunity whitepaper now to get ahead on your data lake strategy in 2019.
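As a rough illustration of the change-data-capture pattern described above, the sketch below polls a source table for rows changed since the last run and appends them to a date-partitioned data lake path. It is a generic, hypothetical example, not Attunity's product: the table name, the updated_at column, and the file layout are all assumptions.

    # Generic change-data-capture (CDC) sketch: poll for new/updated rows and
    # append them to a date-partitioned data lake directory as JSON lines.
    # The 'orders' table, 'updated_at' column, and paths are illustrative assumptions.
    import json
    import sqlite3
    from datetime import datetime, timezone
    from pathlib import Path

    def capture_changes(db_path, last_sync_iso, lake_root="datalake/orders"):
        conn = sqlite3.connect(db_path)
        conn.row_factory = sqlite3.Row
        rows = conn.execute(
            "SELECT * FROM orders WHERE updated_at > ?", (last_sync_iso,)
        ).fetchall()
        conn.close()

        if rows:
            now = datetime.now(timezone.utc)
            out_dir = Path(lake_root) / now.strftime("%Y-%m-%d")
            out_dir.mkdir(parents=True, exist_ok=True)
            out_file = out_dir / f"changes_{now:%H%M%S}.jsonl"
            with out_file.open("w") as f:
                for row in rows:
                    f.write(json.dumps(dict(row)) + "\n")
        # Return a new high-water mark so the next poll only picks up later changes.
        return max((r["updated_at"] for r in rows), default=last_sync_iso)

A production pipeline would typically read the database transaction log rather than poll, but the high-water-mark idea behind continuous ingestion is the same.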
Tags : 
data lake, data pipeline, change data capture, data swamp, hybrid data integration, data ingestion, streaming data, real-time data, big data, hadoop, agile analytics, cloud data lake, cloud data warehouse, data lake ingestion, data ingestion
    
Attunity
Published By: Okta APAC     Published Date: Dec 19, 2018
Multifactor authentication (MFA) is a critical security requirement for every organization, regardless of size and industry. But not every MFA solution is created equal. Your investment in MFA should be well-thought-out. Not only does your chosen MFA solution need to meet the requirements you have today, but it needs to enable and secure the future growth and evolution of your business. This evaluation guide will help you understand the key factors you need to consider when investing in MFA. Your MFA choice determines how well you can protect access to all your internal resources, as well as your ability to provide frictionless experiences for your internal employees and external customers. Your MFA choice also impacts your ability to grow your business and take advantage of emerging technologies and future innovations. As part of Okta’s identity-led security framework, Okta Adaptive MFA delivers on the key considerations that need to be part of your search for a solution that meets a
Tags : 
    
Okta APAC
Published By: BMC ASEAN     Published Date: Dec 18, 2018
Big data projects often entail moving data between multiple cloud and legacy on-premise environments. A typical scenario involves moving data from a cloud-based source to a cloud-based normalization application, to an on-premise system for consolidation with other data, and then through various cloud and on-premise applications that analyze the data. Processing and analysis turn the disparate data into business insights delivered through dashboards, reports, and data warehouses - often using cloud-based apps. The workflows that take data from ingestion to delivery are highly complex and have numerous dependencies along the way. Speed, reliability, and scalability are crucial. So, although data scientists and engineers may do things manually during proof of concept, manual processes don't scale.
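To make the "numerous dependencies" point concrete, here is a minimal sketch of a dependency-ordered workflow runner. It is a generic illustration under assumed step names, not BMC's scheduling product; a real scheduler would also handle retries, alerting, and scaling.

    # Minimal dependency-ordered workflow sketch: each step runs only after the
    # steps it depends on have finished. Step names and bodies are illustrative.
    from graphlib import TopologicalSorter

    def ingest():      print("ingest raw data from cloud source")
    def normalize():   print("normalize in cloud app")
    def consolidate(): print("consolidate with on-premise data")
    def analyze():     print("run analytics")
    def publish():     print("publish dashboards and reports")

    STEPS = {"ingest": ingest, "normalize": normalize,
             "consolidate": consolidate, "analyze": analyze, "publish": publish}

    # step -> set of steps it depends on
    DEPS = {"normalize": {"ingest"},
            "consolidate": {"normalize"},
            "analyze": {"consolidate"},
            "publish": {"analyze"}}

    for step in TopologicalSorter(DEPS).static_order():
        STEPS[step]()   # a real scheduler would also retry, alert, and scale out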
Tags : 
    
BMC ASEAN
Published By: Akamai Technologies     Published Date: Dec 05, 2018
DDoS attack size doubled in early 2018 after attackers discovered and employed a new, massive DDoS reflection and amplification method with the potential to multiply their attack resources by a factor of 500K. The attack vector, called memcached UDP reflection, uses resources freely exposed on the internet — no malware or botnet required.
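The quoted amplification factor is easiest to grasp as simple arithmetic: the factor is the ratio of response bytes reflected at the victim to request bytes sent by the attacker. The numbers below are illustrative assumptions, not measurements from the report.

    # Reflection/amplification arithmetic (illustrative numbers, not measured values).
    request_bytes = 15                   # tiny spoofed UDP request to an exposed server
    amplification_factor = 500_000       # upper-bound factor cited for memcached reflection
    response_bytes = request_bytes * amplification_factor

    attacker_bandwidth_bps = 10_000_000  # assume 10 Mbps of attacker upstream
    attack_traffic_bps = attacker_bandwidth_bps * amplification_factor
    print(f"{response_bytes:,} bytes reflected per request")
    print(f"{attack_traffic_bps / 1e12:.0f} Tbps theoretical ceiling of reflected traffic")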
Tags : 
    
Akamai Technologies
Published By: HERE Technologies     Published Date: Dec 05, 2018
Get higher quality, more accurate location data – and a safer, more profitable fleet – by choosing the right location services provider. The true value of a location platform comes from bringing together multiple data sources and presenting them in a meaningful way. Using a platform approach, you can help customers differentiate their service, increase margins and increase safety. So, in this guide, we cover the four key considerations for choosing a mapping and location service platform to ensure a high quality, accurate mapping service for you and your customers. Download the eBook
Tags : 
location data, transport & logistics, location services
    
HERE Technologies
Published By: effectual     Published Date: Dec 03, 2018
Multi-cloud and hybrid strategies add complexity. Nearly 60% of businesses say they're moving toward hybrid IT environments that integrate on-premises systems and public cloud resources and enable workloads to be placed according to performance, security and dependency requirements. Identifying the best execution venue is a key cloud hurdle.
Tags : 
cloud enablement, cloud optimization, digital transformation, cloud migration, cloud computing, cloud architecture
    
effectual
Published By: AWS     Published Date: Oct 26, 2018
Today’s organisations are tasked with analysing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organisations are finding that in order to deliver analytic insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store enormous amounts of data in a central location, so it’s readily available to be categorised, processed, analysed, and consumed by diverse groups within an organisation. Since data—structured and unstructured—can be stored as-is, there’s no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand.
Tags : 
data, lake, amazon, web, services, aws
    
AWS
Published By: Group M_IBM Q418     Published Date: Oct 02, 2018
Across enterprises of all kinds, data is multiplying rapidly in both quantity and variety. Across multi-cloud environments, new sources are exponentially increasing the growing stream of information, including the Internet of Things, social media, mobile devices, virtual reality implementations and optical tracking.
Tags : 
    
Group M_IBM Q418
Published By: HERE Technologies     Published Date: Sep 26, 2018
Get higher quality, more accurate location data – and a safer, more profitable fleet – by choosing the right location services provider. The true value of a location platform comes from bringing together multiple data sources and presenting them in a meaningful way. Using a platform approach, you can help customers differentiate their service, increase margins and increase safety. So, in this guide, we cover the four key considerations for choosing a mapping and location service platform to ensure a high quality, accurate mapping service for you and your customers. Complete the form to the right to receive the ebook today.
Tags : 
location data, transport & logistics, location services
    
HERE Technologies
Published By: Kronos     Published Date: Sep 24, 2018
Selecting and implementing a software solution like an ERP or workforce management solution that touches multiple parts of the organization is never easy. It is a significant investment of money, time, and resources. But in the end, the capability, efficiency, and savings of both time and money make the effort well worth it. Particularly in the public sector, the sooner you can achieve a return on any investment, the better.
Tags : 
    
Kronos
Published By: Workday     Published Date: Sep 19, 2018
Hoarding data isn’t doing much to help your financial services firm if you can’t easily combine data from multiple sources and quickly run analytics. But there is a way to turn those heaps of data into actionable insights to get clearer answers to your biggest questions and better drive your firm’s strategy. Read the blog to learn how to improve your back end to go from data hoarding to decision-making.
Tags : 
    
Workday
Published By: CA Technologies EMEA     Published Date: Sep 11, 2018
Reconstructing resource management tools to simplify tasks, drive collaboration and facilitate action. People who use technologies at home expect the same ease of use and intuitiveness for the tools they use in the workplace. But enterprise tools aren’t keeping up with the pace of change. Many project and portfolio management (PPM) solutions force resource managers to create entire workflows or navigate through multiple screens just to see the fundamental component of resource management—what their people are working on. Resource managers also cite shortcomings such as no simple way to perform everyday tasks, communicate in context with others, drill down into key information, narrow the field of resources and forecast financials and model outcomes.
Tags : 
    
CA Technologies EMEA
Published By: Quantum Corporation     Published Date: Sep 11, 2018
Virtualization is rapidly changing the way business IT operates, from small local businesses to multinational corporations. If you are reading this, chances are good that your company is already taking advantage of virtualization’s benefits. Virtualization means that a single underlying piece of hardware, such as a server, runs multiple guest operating systems to create virtual machines, or VMs, with each of them being oblivious to the others. An administrative application, such as VMware, manages the sharing process, allocating hardware resources, memory, and CPU time to each VM as needed. And all applications look at this software construct exactly as if it were a real, physical server — even the VM thinks it’s a real server! Virtualization makes good financial sense. It enables a single server to offer multiple capabilities that otherwise would require separate servers. It includes native high availability features, so you don’t have to use any more complex clustering tools. This ab
Tags : 
    
Quantum Corporation
Published By: Zendesk Ltd     Published Date: Sep 11, 2018
As companies increasingly look to provide a better experience for customers, offering support across multiple channels is becoming more popular than ever. According to the Aberdeen Group, companies doubled the number of channels they use to interact with customers between 2012 and 2017. But there’s a difference between providing support on a few channels and delivering a truly integrated omnichannel solution. Using the Zendesk Benchmark, our crowd-sourced index of customer service interactions from 45,000 participating organizations across 140 countries, we examined why companies are going omnichannel and what sets the companies using Zendesk for omnichannel support apart from everyone else.
Tags : 
    
Zendesk Ltd
Published By: Mitutoyo     Published Date: Aug 24, 2018
Today’s factory intelligence is the collaborative orchestration of people and machines. By blending intelligence from multiple sources, factories are truly becoming smarter. But it’s not just about the machines. Rather, people are improving machines through a true partnership that turns human experience into smarter machines. This shift marks a reorientation to a thought process that’s more natural to people. After decades of working with 2D blueprints, designs and dimensioned drawings, factory intelligence leverages 3D models that are easier for people to comprehend and use. Plus, many people find more satisfaction in intelligent factory work that challenges them to manage relationships among machines, rather than repeatedly working on a single piece in a line. In smarter factories, people direct machines to adjust for shifting customer requirements and market demands.
Tags : 
factory intelligence, 3d models, market demands, model-based enterprise, supply chain, precision metrology, cmm
    
Mitutoyo
Published By: Google     Published Date: Aug 23, 2018
"Cloud platforms are rewriting the way that companies work, serving as a vital foundation for digital transformation. Companies should brace for challenges that will need to be met as they transition from in-house systems to hybrid-cloud, multi-cloud, and public-cloud environments. Learn how an open-source strategy and consistent governance will help your company use multi-clouds to compete in the digital world. Download the Harvard Business Review Analytic Services report and find out more."
Tags : 
    
Google
Published By: Dell PC Lifecycle     Published Date: Aug 13, 2018
Your business may need to keep track of dozens of different initiatives—but that doesn’t mean you need dozens of separate storage solutions to get the job done. To reduce complexity, your business may consider storage solutions that can take care of multiple jobs at once without sacrificing performance. For example, if you operate a brick-and-mortar store and an online store, you should be able to retrieve customer data from both sources without compromising transactional database performance. The all-flash Dell EMC™ SC5020 storage array aims to be just such a solution.
Tags : 
    
Dell PC Lifecycle
Published By: Amazon Web Services     Published Date: Jul 25, 2018
What is a Data Lake? Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. A Data Lake allows an organization to store all of their data, structured and unstructured, in one centralized repository. Since data can be stored as-is, there is no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand. Download to find out more now.
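The "store as-is, no predefined schema" idea is often called schema-on-read. The sketch below is a toy illustration of that pattern, not any vendor's implementation: heterogeneous JSON records land in the lake untouched, and structure is imposed only when a question is asked. The paths and field names are assumptions made for the example.

    # Schema-on-read sketch: raw records land in the lake untouched; structure is
    # applied only at query time. Paths and field names are illustrative.
    import json
    from pathlib import Path

    LAKE = Path("datalake/raw/events")
    LAKE.mkdir(parents=True, exist_ok=True)

    def land(record: dict, name: str) -> None:
        """Store the record exactly as received; no schema enforced on write."""
        (LAKE / f"{name}.json").write_text(json.dumps(record))

    def total_revenue() -> float:
        """Apply structure at read time: only now do we decide which fields matter."""
        total = 0.0
        for path in LAKE.glob("*.json"):
            record = json.loads(path.read_text())
            total += float(record.get("amount", 0))  # tolerate records without 'amount'
        return total

    land({"type": "sale", "amount": 19.99, "channel": "web"}, "evt1")
    land({"type": "click", "page": "/pricing"}, "evt2")  # different shape, stored as-is
    print(total_revenue())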
Tags : 
    
Amazon Web Services