consolidation

Results 1–25 of 368
Published By: Illusive Networks     Published Date: Apr 10, 2019
During periods of rapid growth, your business is especially vulnerable to cyberattacks from both malicious insiders and external threat actors. Extended periods of IT change and consolidation can open seemingly minor security gaps that quickly become gaping holes for attackers to exploit. This quick read will enrich your internal dialogue about how to prepare for the elevated risk of high-impact cyberattacks.
Tags : 
m&a, mergers and acquisitions, business infrastructure, external threats, cyber attacks, vulnerability management, business it, it security, network security, cyber risk, deception technology, endpoint security, illusive networks, lateral movement, enterprise security
    
Illusive Networks
Published By: Oracle     Published Date: Jan 28, 2019
Oracle Private Cloud Appliance is a converged infrastructure system designed for rapid and simple deployment of private cloud at an industry-leading price point. Whether customers are running Linux, Microsoft Windows, or Oracle Solaris applications, Oracle Private Cloud Appliance supports consolidation of a wide range of mixed workloads in medium-to-large data centers. High-performance, low-latency Oracle Fabric Interconnect and Oracle SDN allow automated configuration of the server and storage networks. The embedded controller software automates the installation, configuration, and management of all infrastructure components at the push of a button. Customers need only enter basic configuration parameters and create virtual machines (VMs), manually or by using Oracle VM Templates, to get a full application up and running in a few hours. With Oracle Enterprise Manager, the Oracle Private Cloud Appliance is transformed into a powerful private cloud infrastructure that integrates…
Tags : 
    
Oracle
Published By: Gigaom     Published Date: Jan 15, 2019
This recorded one-hour webinar will show you the benefits of end-to-end solutions and storage consolidation.
Tags : 
nutanix, storage, storage infrastructure, storage consolidation, hybrid cloud, memory storage
    
Gigaom
Published By: BMC ASEAN     Published Date: Dec 18, 2018
Big data projects often entail moving data between multiple cloud and legacy on-premises environments. A typical scenario involves moving data from a cloud-based source to a cloud-based normalization application, to an on-premises system for consolidation with other data, and then through various cloud and on-premises applications that analyze the data. Processing and analysis turn the disparate data into business insights delivered through dashboards, reports, and data warehouses - often using cloud-based apps. The workflows that take data from ingestion to delivery are highly complex and have numerous dependencies along the way. Speed, reliability, and scalability are crucial. So, although data scientists and engineers may do things manually during proof of concept, manual processes don't scale.
Tags : 
    
BMC ASEAN
Published By: Vena Solutions     Published Date: Oct 29, 2018
Based on in-depth research and customer interviews, the annual Nucleus Research Value Matrix maps out the corporate performance management (CPM) market landscape, evaluating vendors on a matrix that contrasts usability and ease of use against features and depth of functionality. Read or download the 2018 edition to see the most up-to-date CPM landscape, find the best finance software solution for your needs, and learn why Vena led the pack in usability to land in the Leader quadrant for the third straight year.
Tags : 
nucleus research, cpm matrix, cpm technology value, cpm technology matrix 2018, excel replacement, financial planning, budgeting, forecasting, financial modeling, business modeling, finance what-if scenarios, financial close and consolidation, risk and audit management, erp data integration systems, adaptive insights, hyperion, anaplan, prophix, vena solutions
    
Vena Solutions
Published By: Group M_IBM Q418     Published Date: Sep 10, 2018
LinuxONE from IBM is an example of a secure data-serving infrastructure platform that is designed to meet the requirements of current-gen as well as next-gen apps. IBM LinuxONE is ideal for firms that want the following:
• Extreme security: Firms that put data privacy and regulatory concerns at the top of their requirements list will find that LinuxONE comes built in with best-in-class security features such as EAL5+ isolation, crypto key protection, and a Secure Service Container framework.
• Uncompromised data-serving capabilities: LinuxONE is designed for structured and unstructured data consolidation and optimized for running modern relational and nonrelational databases. Firms can gain deep and timely insights from a "single source of truth."
• Unique balanced system architecture: The nondegrading performance and scaling capabilities of LinuxONE — thanks to a unique shared memory and vertical scale architecture — make it suitable for workloads such as databases and systems of record.
Tags : 
    
Group M_IBM Q418
Published By: Dell EMC & Intel     Published Date: Sep 06, 2018
Datacenter improvements have thus far focused on cost reduction and point solutions. Server consolidation, cloud computing, virtualization, and the implementation of flash storage capabilities have all helped reduce server sprawl, along with associated staffing and facilities costs. Converged systems — which combine compute, storage, and networking into a single system — are particularly effective in enabling organizations to reduce operational and staff expenses. These software-defined systems require only limited human intervention. Code imbedded in the software configures hardware and automates many previously manual processes, thereby dramatically reducing instances of human error. Concurrently, these technologies have enabled businesses to make incremental improvements to customer engagement and service delivery processes and strategies.
Tags : 
    
Dell EMC & Intel
Published By: Dell EMC & Intel     Published Date: Sep 06, 2018
Until now, datacenter improvements have focused on cost reduction and point solutions. Server consolidation, cloud computing, virtualization, and the implementation of flash storage have all helped reduce server sprawl, along with the associated staffing and facilities costs. By combining compute, storage, and networking resources into a single solution, converged systems are particularly effective at lowering staffing and operating expenses. These software-defined systems require little human intervention. Code embedded in the software configures the hardware and automates many previously manual processes, considerably reducing the risk of human error. Together, these technologies have enabled businesses to make incremental improvements to customer engagement and service-delivery processes and strategies.
Tags : 
    
Dell EMC & Intel
Published By: Pure Storage     Published Date: Sep 04, 2018
Veritas' NetBackup software has long been a favorite for data protection in the enterprise, and it is now fully integrated with the market-leading all-flash data storage platform: Pure Storage. NetBackup leverages the FlashArray API for fast and simple snapshot management, and protection copies can be stored on FlashBlade for rapid restores and consolidation of file and object storage tiers. This webinar features architecture overviews as well as two live demos of the aforementioned integration points.
Tags : 
    
Pure Storage
Published By: SAS     Published Date: Aug 28, 2018
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics, and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. To help users prepare, this TDWI Best Practices Report defines data lake types, then discusses their emerging best practices, enabling technologies, and real-world applications. The report's survey quantifies user trends and readiness for data lakes.
Tags : 
    
SAS
Published By: Workday     Published Date: Aug 07, 2018
Just like a house needs a good foundation, a finance system needs the right technology. Workday took a fresh approach to technology to give finance a distinct advantage in analytics, close and consolidation, growth, and more. See how Workday helps finance become a strategic partner.
Tags : 
technology, analytics, strategic partner
    
Workday
Published By: BlackLine     Published Date: Aug 06, 2018
The biotechnology and pharmaceutical industry is among the most heavily regulated industries in the world, challenged by evolving regulations, complex compliance requirements and close regulatory scrutiny. At the same time, companies must address the market pressures of globalization, the use of predictive data analytics and digital technologies, and the industry’s ongoing consolidation. In this challenging environment, confidence in internal controls is crucial.
Tags : 
    
BlackLine
Published By: Hewlett Packard Enterprise     Published Date: Jul 18, 2018
How hyperconverged infrastructure can reduce costs and help align enterprise IT with business needs. Includes chapters on hyperconvergence and cloud, datacenter consolidation, ROBO deployment, test and development environments, and disaster recovery.
Tags : 
    
Hewlett Packard Enterprise
Published By: IBM     Published Date: Jun 29, 2018
LinuxONE from IBM is an example of a secure data-serving infrastructure platform that is designed to meet the requirements of current-gen as well as next-gen apps. IBM LinuxONE is ideal for firms that want the following:
• Extreme security: Firms that put data privacy and regulatory concerns at the top of their requirements list will find that LinuxONE comes built in with best-in-class security features such as EAL5+ isolation, crypto key protection, and a Secure Service Container framework.
• Uncompromised data-serving capabilities: LinuxONE is designed for structured and unstructured data consolidation and optimized for running modern relational and nonrelational databases. Firms can gain deep and timely insights from a "single source of truth."
• Unique balanced system architecture: The nondegrading performance and scaling capabilities of LinuxONE — thanks to a unique shared memory and vertical scale architecture — make it suitable for workloads such as databases and systems of record.
Tags : 
    
IBM
Published By: CA Technologies EMEA     Published Date: May 25, 2018
Project portfolio management (PPM) software suites have evolved significantly in just the last few years. No longer simply straightforward task-consolidation tools, they have become powerful enterprise solutions capable of everything from investment planning and management to collaboration and workflow automation across concurrent initiatives.
Tags : 
power, data, organization, project, portfolio, management
    
CA Technologies EMEA
Published By: CA Technologies EMEA     Published Date: May 25, 2018
Project portfolio management (PPM) software solutions have evolved enormously in recent years. Far removed from the simple task-consolidation tools they once were, they have become powerful enterprise solutions with a broad range of capabilities, from investment planning and management to collaboration and workflow automation across multiple concurrent initiatives.
Tags : 
unleash, power, data, your, enterprise
    
CA Technologies EMEA
Published By: Oracle     Published Date: Mar 22, 2018
Is your information technology (IT) organization pressured to get more work done with fewer people or on a constricted budget? Do you need to make IT a competitive asset rather than a cost center? Does your business struggle with slow software applications or data that's too often unavailable? If you answered "yes" to any of these questions, it's time to take a close look at Oracle Exadata, the world's fastest database machine, exclusively designed to run Oracle Database. It is the first database machine optimized for data warehousing, online transaction processing (OLTP), and database consolidation workloads as well as in-memory databases and database as a service (DBaaS).
Tags : 
    
Oracle
Published By: Pure Storage     Published Date: Mar 15, 2018
The all-flash array (AFA) market has undergone significant maturation over the past two years. A high percentage of customers have already committed to an "all flash for primary storage" strategy, and every customer interviewed for this study was among them. In 2017, AFAs will drive over 80% of all primary storage revenue. All of the established storage vendors have entered this space, and there are several start-ups with over $100 million in revenue. With this level of market maturation, multiple segments have developed within the primary flash array space. There are systems targeted for dedicated application deployment, there are systems specifically for web-scale applications, and there are systems intended for dense mixed workload consolidation. These latter systems are driving most of the AFA revenue, and they aspire to become the primary storage platforms of record for enterprises of all sizes. This study evaluates the suitability of 10 vendors' AFA platforms for dense mixed enterprise workload consolidation.
Tags : 
    
Pure Storage
Published By: SAS     Published Date: Mar 06, 2018
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics, and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. With the right end-user tools, a data lake can enable the self-service data practices that both technical and business users need. These practices wring business value from big data, other new data sources, and burgeoning enterprise data.
Tags : 
    
SAS
Published By: NetApp     Published Date: Mar 05, 2018
Read this vendor assessment from IDC and find out about the suitability of 10 vendors' AFA platforms for dense mixed enterprise workload consolidation. Discover the areas where the most differentiation between vendors was noted, including their strategies around NVMe and cloud-based predictive analytics.
Tags : 
netapp, database performance, flash storage, data management, cost challenges
    
NetApp
Published By: Host Analytics     Published Date: Mar 01, 2018
While Oracle Hyperion remains a market leader in EPM software, that role comes with a hefty price tag and hidden risks that are prompting companies that use it to re-evaluate their planning, budgeting, and consolidation needs. Whether used individually or together as an EPM suite, Hyperion Financial Management (HFM) and Hyperion Planning burden finance departments with a high cost of ownership, from server costs and consultants to the complex integration between products and their differing interfaces. As Oracle Hyperion users start to evaluate the cloud, they need to understand how the benefits of a cloud-based EPM solution compare to on-premises software. We've compiled this information to a) help you better understand the full costs and potential risks associated with Oracle Hyperion and b) offer guidance as you evaluate cloud-based options.
Tags : 
    
Host Analytics
Published By: Riverbed     Published Date: Feb 15, 2018
This Enterprise Management Associates® (EMA™) research summary, sponsored by Riverbed®, highlights some of the key findings of EMA’s landmark report, “Network Management Megatrends 2016: Managing Networks in the Era of the Internet of Things, Hybrid Cloud, and Advanced Network Analytics.” It examines several major areas of change and evolution affecting network management. These “megatrends” include hybrid cloud networking, the Internet of Things (IoT), advanced network analytics, network management outsourcing, and network management tool consolidation.
Tags : 
    
Riverbed
Published By: Riverbed     Published Date: Jan 25, 2018
This Enterprise Management Associates® (EMA™) research summary, sponsored by Riverbed®, highlights some of the key findings of EMA’s landmark report, “Network Management Megatrends 2016: Managing Networks in the Era of the Internet of Things, Hybrid Cloud, and Advanced Network Analytics.” It examines several major areas of change and evolution affecting network management. These “megatrends” include hybrid cloud networking, the Internet of Things (IoT), advanced network analytics, network management outsourcing, and network management tool consolidation.
Tags : 
    
Riverbed
Published By: SAS     Published Date: Oct 18, 2017
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics, and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. To help users prepare, this TDWI Best Practices Report defines data lake types, then discusses their emerging best practices, enabling technologies, and real-world applications. The report's survey quantifies user trends and readiness for data lakes.
Tags : 
    
SAS
Published By: Pure Storage     Published Date: Oct 09, 2017
Cloud computing and all-flash storage are two of the most important innovations driving next-generation IT initiatives. While it may seem at first that these are parallel trends, in reality they are inextricably intertwined. Without the benefits of all-flash storage — driving new levels of performance, agility, and management simplicity — enterprises would not be able to modernize their infrastructures to deliver cloud services. It is no coincidence that the largest hyperscale cloud providers rely on all-flash storage solutions as their storage foundation. Pure Storage all-flash storage arrays provide enterprise customers with a safe, secure, and smooth path to the all-flash cloud. You can take the journey in stages, starting small with a single application or two, and then adding more applications through consolidation and virtualization. You can also implement multiple stages at once.
Tags : 
data management, data system, business development, software integration, resource planning, enterprise management, data collection
    
Pure Storage