multi source

Published By: Adobe     Published Date: Mar 06, 2015
Responsive web design is an integral part of customer engagement in our multi-device world. But the additional costs and resources can be hard to justify for budget holders demanding proof of ROI.
Tags : 
responsive web design, customer engagement, roi, adobe
    
Adobe
Published By: Adobe     Published Date: Apr 30, 2015
Responsive web design is an integral part of customer engagement in our multi-device world. But the additional costs and resources can be hard to justify for budget holders demanding proof of ROI.
Tags : 
adobe, application development, delivery, roi, digital investments, responsive web design
    
Adobe
Published By: Adobe     Published Date: Sep 20, 2016
Download our new data-driven marketing guide, The Holistic Picture, to learn how multiple sources of data can be transformed into unified customer profiles. And see how you can use those profiles to create personalized experiences your customers will love.
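As an illustration of the kind of profile unification the guide describes, the sketch below merges hypothetical CRM and web-analytics extracts on a shared customer_id using pandas. The file contents, column names, and values are assumptions for the example, not Adobe's data model or APIs.

```python
import pandas as pd

# Hypothetical extracts from two separate sources (illustrative only).
crm = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "email": ["a@example.com", "b@example.com", "c@example.com"],
    "lifetime_value": [1200.0, 310.5, 87.0],
})
web = pd.DataFrame({
    "customer_id": [101, 103, 104],
    "last_visit": ["2016-09-01", "2016-09-12", "2016-09-15"],
    "pages_viewed": [14, 3, 22],
})

# An outer join keeps customers that appear in only one source,
# producing a single unified profile row per customer_id.
profiles = crm.merge(web, on="customer_id", how="outer")
print(profiles)
```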
Tags : 
adobe, data driven marketing, marketing, marketer, marketing guide, customer experience
    
Adobe
Published By: Akamai Technologies     Published Date: Dec 05, 2018
DDoS attack size doubled in early 2018 after attackers discovered and employed a new, massive DDoS reflection and amplification method with the potential to multiply their attack resources by a factor of 500K. The attack vector, called memcached UDP reflection, uses resources freely exposed on the internet — no malware or botnet required.
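For context on why reflection attacks are so damaging: the attacker spoofs the victim's address in a tiny request so that the exposed server sends a far larger response to the victim. The back-of-the-envelope sketch below uses illustrative byte counts, not figures from the Akamai report.

```python
# Illustrative amplification-factor arithmetic for a UDP reflection attack.
# The byte counts below are assumptions for the sake of the example.
request_bytes = 15            # tiny spoofed request sent to the exposed reflector
response_bytes = 750_000      # large cached value returned to the victim

amplification = response_bytes / request_bytes
print(f"Amplification factor: {amplification:,.0f}x")   # -> 50,000x

# Bandwidth the victim absorbs if the attacker can send 10 Mbit/s of requests:
attacker_mbps = 10
victim_mbps = attacker_mbps * amplification
print(f"Traffic reaching the victim: {victim_mbps / 1000:,.0f} Gbit/s")
```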
Tags : 
    
Akamai Technologies
Published By: Amazon Web Services     Published Date: Oct 09, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. A Data Lake allows an organization to store all of its data, structured and unstructured, in one centralized repository.
Tags : 
cost effective, data storage, data collection, security, compliance, platform, big data, it resources
    
Amazon Web Services
Published By: Amazon Web Services     Published Date: Jul 25, 2018
What is a Data Lake? Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. A Data Lake allows an organization to store all of its data, structured and unstructured, in one centralized repository. Since data can be stored as-is, there is no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand. Download to find out more now.
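A minimal sketch of the "store as-is" idea, assuming Amazon S3 as the lake's storage layer; the bucket name, local files, and key layout are hypothetical. Raw files from different sources are uploaded unchanged, with no schema imposed at write time.

```python
import boto3

# Assumes AWS credentials are configured and the bucket below already exists.
s3 = boto3.client("s3")
BUCKET = "example-data-lake"   # hypothetical bucket name

# Heterogeneous raw files land in the lake as-is; any schema is applied
# later, at analysis time (schema-on-read).
for local_path, key in [
    ("exports/orders.csv", "raw/erp/orders.csv"),
    ("exports/clickstream.json", "raw/web/clickstream.json"),
]:
    s3.upload_file(local_path, BUCKET, key)
    print(f"stored s3://{BUCKET}/{key} as-is")
```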
Tags : 
    
Amazon Web Services
Published By: Amazon Web Services, Inc     Published Date: Nov 21, 2013
Watch this on-demand webinar to learn the steps and available tools necessary to design, set up, and deploy the infrastructure to run a multi-server Microsoft SharePoint Server Farm on AWS. Solution Architect Ulf Schoo will cover how to architect for high availability and provision the relevant AWS services and resources to run SharePoint Server workloads at scale on the AWS Cloud. Also included with the video is the Microsoft SharePoint Server on AWS: Reference Architecture Whitepaper.
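The webinar itself walks through the reference architecture; purely as an illustration of one high-availability idea it covers, spreading farm servers across Availability Zones, the boto3 sketch below launches one instance in each of two zones. The AMI ID, instance type, and zone names are placeholders, not values from the whitepaper.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Placeholder values -- substitute a real SharePoint-ready Windows AMI and your own zones.
AMI_ID = "ami-00000000"
INSTANCE_TYPE = "m4.xlarge"

# One front-end server per Availability Zone, so the farm keeps serving
# requests if a single zone fails.
for zone in ["us-east-1a", "us-east-1b"]:
    ec2.run_instances(
        ImageId=AMI_ID,
        InstanceType=INSTANCE_TYPE,
        MinCount=1,
        MaxCount=1,
        Placement={"AvailabilityZone": zone},
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "Role", "Value": "sharepoint-web"}],
        }],
    )
```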
Tags : 
aws sharepoint, sharepoint hosting, sharepoint server, sharepoint server farm, enterprise sharepoint, sharepoint webinar, sharepoint cloud, sharepoint migration, sharepoint solutions, amazon sharepoint, aws whitepaper, sharepoint applications, sharepoint workloads, online sharepoint hosting
    
Amazon Web Services, Inc
Published By: Anaplan     Published Date: Apr 02, 2019
Connected organizations collaborate across business functions to dynamically steer business performance. Previous generations of planning software have fallen short of this vision, making collaboration difficult to achieve, scattering data across multiple sources, and providing inflexible planning models that require heavy IT support. This landscape motivated Anaplan to develop an innovative platform that enables Connected Planning across the entire enterprise. The FSN Innovation Showcase highlights three major innovations that support these objectives:
• Anaplan’s proprietary Hyperblock® technology
• The App Hub, a suite of 250+ industry-leading solutions
• Developments in machine learning and artificial intelligence
Tags : 
    
Anaplan
Published By: Anaplan     Published Date: Apr 09, 2019
Connected organizations collaborate across business functions to dynamically steer business performance. Previous generations of planning software have fallen short of this vision, making collaboration difficult to achieve, scattering data across multiple sources, and providing inflexible planning models that require heavy IT support. This landscape motivated Anaplan to develop an innovative platform that enables Connected Planning across the entire enterprise. The FSN Innovation Showcase highlights three major innovations that support these objectives:
• Anaplan’s proprietary Hyperblock® technology
• The App Hub, a suite of 250+ industry-leading solutions
• Developments in machine learning and artificial intelligence
Tags : 
    
Anaplan
Published By: Appcito     Published Date: Apr 09, 2015
Appcito CAFE (Cloud Application Front End) is an easy-to-deploy, unified and cloud-native service enabling cloud application teams to select and deploy enterprise-grade L4 to L7 application network services. The multi-cloud CAFE service is available for the OpenStack open-source cloud computing software platform. OpenStack is a multivendor ecosystem used to deploy Infrastructure-as-a-Service (IaaS) solutions. It allows users to bring compute, storage, and networking resources into private and public clouds through a set of open APIs.
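As a small illustration of those open APIs (not of Appcito's product), the sketch below uses the openstacksdk Python library to enumerate compute and network resources. The cloud name is an assumption and would come from a local clouds.yaml.

```python
import openstack

# Assumes a cloud named "mycloud" is defined in clouds.yaml with valid credentials.
conn = openstack.connect(cloud="mycloud")

# Compute and networking resources are reachable through the same open APIs.
for server in conn.compute.servers():
    print("server:", server.name, server.status)

for network in conn.network.networks():
    print("network:", network.name)
```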
Tags : 
openstack, cloud applications, load balancing, application security, cloud application delivery
    
Appcito
Published By: AppDynamics     Published Date: Sep 21, 2017
IDC’s research shows enterprises around the world are using multicloud strategies to optimize the performance of modern and existing legacy applications running on-premises, in public cloud services, and on legacy systems. In the early days of enterprise cloud adoption, many organizations focused their cloud strategies on enabling net-new cloud-native applications written to take advantage of dynamic cloud infrastructure and pay-as-you-go consumption-based cost models. Early success with these implementations is convincing more and more enterprises to expand their cloud footprint and to migrate existing applications to cloud in order to enhance end-user experiences, optimize cloud resource utilization and costs, and create a more flexible and agile business environment.
Tags : 
    
AppDynamics
Published By: AppDynamics     Published Date: Sep 25, 2017
IDC’s research shows enterprises around the world are using multicloud strategies to optimize the performance of modern and existing legacy applications running on-premises, in public cloud services, and on legacy systems. In the early days of enterprise cloud adoption, many organizations focused their cloud strategies on enabling net-new cloud-native applications written to take advantage of dynamic cloud infrastructure and pay-as-you-go consumption-based cost models. Early success with these implementations is convincing more and more enterprises to expand their cloud footprint and to migrate existing applications to cloud in order to enhance end-user experiences, optimize cloud resource utilization and costs, and create a more flexible and agile business environment.
Tags : 
    
AppDynamics
Published By: ASG Software Solutions     Published Date: Nov 05, 2009
Data centers need, now more than ever, effective workload automation that provides complete management-level visibility into the real-time events impacting the delivery of IT services. The traditional job scheduling approach, with its uncoordinated set of tools that often require reactive manual intervention to minimize service disruptions, is failing in today's complex world of IT, with its multiple platforms, applications, and virtualized resources.
Tags : 
asg, cmdb, bsm, itil, metacmdb, workload automation, wla, visibility, configuration management, metadata, lob, sdm, service dependency mapping, ecommerce, bpm, workflow, itsm, critical application
    
ASG Software Solutions
Published By: Attunity     Published Date: Jan 14, 2019
This whitepaper explores how to automate your data lake pipeline to address common challenges, including how to prevent data lakes from devolving into useless data swamps and how to deliver analytics-ready data via automation. Read Increase Data Lake ROI with Streaming Data Pipelines to learn about:
• Common data lake origins and challenges, including integrating diverse data from multiple source platforms and from lakes on premises and in the cloud.
• Delivering real-time integration with change data capture (CDC) technology that integrates live transactions with the data lake.
• Rethinking the data lake with a multi-stage methodology, continuous data ingestion, and merging processes that assemble a historical data store.
• Leveraging a scalable and autonomous streaming data pipeline to deliver analytics-ready data sets for better business insights.
Read this Attunity whitepaper now to get ahead on your data lake strategy in 2019.
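To make the CDC idea concrete (a generic sketch, not Attunity's implementation), the code below applies a stream of hypothetical insert/update/delete change events to an in-memory "current state" view while appending every event to a historical store, mirroring the merge-plus-history pattern described above.

```python
from collections import defaultdict

# Hypothetical change events as they might arrive from a CDC feed.
events = [
    {"op": "insert", "id": 1, "row": {"name": "Acme", "region": "EU"}},
    {"op": "update", "id": 1, "row": {"name": "Acme", "region": "US"}},
    {"op": "insert", "id": 2, "row": {"name": "Globex", "region": "APAC"}},
    {"op": "delete", "id": 2, "row": None},
]

current_state = {}            # analytics-ready "latest values" view
history = defaultdict(list)   # every change kept for historical analysis

for event in events:
    history[event["id"]].append(event)
    if event["op"] == "delete":
        current_state.pop(event["id"], None)
    else:                      # insert or update
        current_state[event["id"]] = event["row"]

print(current_state)    # {1: {'name': 'Acme', 'region': 'US'}}
print(len(history[2]))  # 2 changes retained even though the row was deleted
```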
Tags : 
data lake, data pipeline, change data capture, data swamp, hybrid data integration, data ingestion, streaming data, real-time data, big data, hadoop, agile analytics, cloud data lake, cloud data warehouse, data lake ingestion
    
Attunity
Published By: Aventail     Published Date: Aug 21, 2009
NAC is a multifaceted framework to thoroughly control who and what gets access to network resources, and help keep malware from entering the enterprise. Today, there are huge challenges to implementing as-yet immature NAC solutions on an enterprise-wide basis, including convoluted integration requirements, inadequate inspection capabilities, and weak policy management.
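Purely to illustrate the kind of policy decision a NAC framework automates (not Aventail's engine), the toy function below combines a user's role with device posture checks to return an allow, deny, or quarantine decision; all roles and checks are hypothetical.

```python
def nac_decision(user_role: str, antivirus_ok: bool, patched: bool) -> str:
    """Toy network-access decision based on identity and device posture."""
    if not (antivirus_ok and patched):
        return "quarantine"          # remediation VLAN until the device is healthy
    if user_role in {"employee", "admin"}:
        return "allow"
    if user_role == "contractor":
        return "allow-restricted"    # limited set of internal resources
    return "deny"

print(nac_decision("employee", antivirus_ok=True, patched=True))     # allow
print(nac_decision("contractor", antivirus_ok=True, patched=False))  # quarantine
```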
Tags : 
vpn, access control, security management, ssl, network access control, nac, policy management, security policies, malware, intrusion prevention, ssl, secure socket layer, vpn, virtual private network, vpns, virtual private networks, aventail
    
Aventail
Published By: AWS     Published Date: Nov 02, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store massive amounts of data into a central location, so it’s readily available to be categorized, processed, analyzed, and consumed by diverse groups within an organization. Since data - structured and unstructured - can be stored as-is, there’s no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand.
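A small sketch of the schema-on-read idea described above: the raw file sits in the lake exactly as it arrived, and a schema is applied only when a question is finally asked. The path and column names are assumptions for the example.

```python
import pandas as pd

# Hypothetical location of a raw, newline-delimited JSON export stored as-is.
# Reading s3:// paths with pandas requires the s3fs package to be installed.
RAW_PATH = "s3://example-data-lake/raw/web/clickstream.json"

# A schema is imposed only now, at read time, for this particular question.
events = pd.read_json(RAW_PATH, lines=True)
per_page = events.groupby("page")["session_id"].nunique()
print(per_page.sort_values(ascending=False).head(10))
```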
Tags : 
    
AWS
Published By: AWS     Published Date: Oct 26, 2018
Today’s organisations are tasked with analysing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organisations are finding that in order to deliver analytic insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store enormous amounts of data in a central location, so it’s readily available to be categorised, processed, analysed, and consumed by diverse groups within an organisation. Since data—structured and unstructured—can be stored as-is, there’s no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand.
Tags : 
data, lake, amazon, web, services, aws
    
AWS
Published By: Black Duck Software     Published Date: May 18, 2010
In this webinar Black Duck Software (www.blackducksoftware.com), together with representatives of SAP, will review the benefits open source offers to development organizations, the management challenges it presents, and approaches for addressing those challenges.
Tags : 
black duck software, open source, platform, sap, multi-source, code, source
    
Black Duck Software
Published By: Black Duck Software     Published Date: Oct 01, 2009
Over the past decade, a powerful new approach to development - open source software - has risen to prominence, dramatically increasing the opportunity to re-use existing software.
Tags : 
black duck software, open source, platform, sap, multi-source, code, source, intellectual property
    
Black Duck Software
Published By: Black Duck Software     Published Date: Feb 17, 2009
"Agile" software development is an increasingly popular development process for producing software in a flexible and iterative manner that can deliver value to the enterprise faster, reduce project risk and allow adaptation to changes more quickly.
Tags : 
black duck software, open source, platform, sap, multi-source, code, source, agile development
    
Black Duck Software
Published By: Black Duck Software     Published Date: Jul 16, 2010
This paper is for IT development executives looking to gain control of open source software as part of a multi-source development process. Today, many IT executives, enterprise architects, and development managers in leading companies have gained significant management control over the externally sourced software used by their application development groups. Download this free paper to discover how your development organization can do the same.
Tags : 
open source, development, architects, application development, software development
    
Black Duck Software
Published By: Blue Coat     Published Date: Nov 23, 2015
It’s time for Proactive Incident Response:
• Full logs in the SIEM, plus complete collection of packet data.
• Packet data indexed, easily searchable, and correlated with threat intelligence and other data.
By working with multiple sources of security data, teams detect threats early.
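As a generic illustration of correlating indexed packet metadata with threat intelligence (not Blue Coat's product), the sketch below flags flow records whose destination appears on a hypothetical known-bad IP list; all records and addresses are made up for the example.

```python
# Hypothetical indexed flow records extracted from captured packets.
flows = [
    {"ts": "2015-11-20T10:01:02", "src": "10.0.0.5", "dst": "203.0.113.7", "bytes": 5120},
    {"ts": "2015-11-20T10:01:09", "src": "10.0.0.8", "dst": "198.51.100.23", "bytes": 880},
]

# A threat-intelligence feed reduced to a set of known-bad destinations (illustrative).
bad_ips = {"203.0.113.7"}

# Cross-referencing the two sources surfaces suspicious flows early.
alerts = [f for f in flows if f["dst"] in bad_ips]
for alert in alerts:
    print("early detection:", alert["ts"], alert["src"], "->", alert["dst"])
```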
Tags : 
    
Blue Coat
Published By: BMC ASEAN     Published Date: Dec 18, 2018
Big data projects often entail moving data between multiple cloud and legacy on-premise environments. A typical scenario involves moving data from a cloud-based source to a cloud-based normalization application, to an on-premise system for consolidation with other data, and then through various cloud and on-premise applications that analyze the data. Processing and analysis turn the disparate data into business insights delivered through dashboards, reports, and data warehouses - often using cloud-based apps. The workflows that take data from ingestion to delivery are highly complex and have numerous dependencies along the way. Speed, reliability, and scalability are crucial. So, although data scientists and engineers may do things manually during a proof of concept, manual processes don't scale.
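To show why dependency handling matters in such pipelines (a generic sketch, not any BMC product), the code below runs hypothetical pipeline steps in topological order so each step starts only after everything it depends on has finished.

```python
from graphlib import TopologicalSorter   # Python 3.9+

# Hypothetical pipeline: each step maps to the set of steps it depends on.
pipeline = {
    "ingest_cloud_source": set(),
    "normalize": {"ingest_cloud_source"},
    "consolidate_on_prem": {"normalize"},
    "analyze": {"consolidate_on_prem"},
    "publish_dashboards": {"analyze"},
    "load_warehouse": {"analyze"},
}

def run(step: str) -> None:
    print(f"running {step} ...")   # real steps would invoke cloud or on-prem jobs

# Automation replaces the manual hand-offs used during a proof of concept.
for step in TopologicalSorter(pipeline).static_order():
    run(step)
```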
Tags : 
    
BMC ASEAN
Published By: Box     Published Date: Jan 16, 2015
The nature of the financial services industry places a myriad of international compliance requirements on a company's IT team, as well as an expectation by its customers of the highest levels of performance and reliability. To survive and thrive, businesses in the industry must not only keep pace with customer demand but gain competitive advantage. Those demands mean the IT team must be at the forefront of adopting emerging technologies. This is certainly true for Orangefield Columbus, which recently experienced significant growth in its multiple databases that led to serious performance degradation of its existing storage system. By focusing on a proactive data management storage array, Orangefield was able to eliminate resource contention. Download now and examine Orangefield's journey to find a solution that would meet, and exceed, its performance and capacity requirements.
Tags : 
nexgen, vmware, citrix, flash, financial services
    
Box