
Cloud Gaming on Azure Using FileZilla

Thesis

APRIL 22, 2019


NAGPUR UNIVERSITY
Amravati Road
A

PROJECT REPORT

ON

"Cloud Gaming on Azure Using FileZilla"


This project report is submitted to Rashtrasant Tukadoji Maharaj Nagpur
University in partial fulfillment of the requirement for the award of the degree of

Post Graduate Diploma


In
Cloud Technology

Submitted By
Roshan Markande
Shubham Tankhiwale
Harish
Kamini Nimje

Under the Guidance of


Prof. Swadeep Gajbhiye

DEPARTMENT OF CLOUD TECHNOLOGY


RASHTRASANT TUKADOJI MAHARAJ NAGPUR UNIVERSITY NAGPUR
Year 2018-19
Rashtrasant Tukadoji Maharaj Nagpur University, Nagpur
Post Graduate Diploma in Cloud Technology

CERTIFICATE
This is to certify that the report entitled

“Cloud Gaming on Azure Using FileZilla”


is a bona fide work and is submitted to the Rashtrasant Tukadoji
Maharaj Nagpur University, Nagpur
By

Roshan Markande

Shubham Tankhiwale

Harish

Kamini Nimje

Mr. Swadeep Gajbhiye


Project Guide External Examiner
ACKNOWLEDGEMENT

Completion of this mini project “Cloud Gaming on Azure Using FileZilla” has given
immense pleasure and knowledge to us.

We have no words to express our sincere thanks for the valuable guidance extended
to us by our project guide in the completion of this project. Without it, this would
have been a very difficult task.

We would like to give our sincere thanks to Prof. Dilip Saini, Mentor of the Post
Graduate Diploma in Cloud Technology Department, for necessary help and for
providing us the required facilities. We would also like to express our gratitude to
Mr. Swadeep Gajbhiye, our Project Guide and a teacher who inspires us a lot.

We would also like to thank all faculty members and the staff who have directly
or indirectly helped us in completing the project.

Projectee:
Roshan Markande
Shubham Tankhiwale
Harish
Kamini Nimje

Second Semester (PGDCT)


Cloud Technology
DECLARATION

We hereby declare that the project entitled “Cloud Gaming on Azure Using FileZilla”,

undertaken at RTMNU’s Oberoi Centre for Excellence, in partial fulfillment of the

Post-Graduation Diploma in Cloud Technology (Semester II) Examination, is our

original work and that the project has not formed the basis for the award of any

degree, associateship, fellowship or any other similar title, either at

Rashtrasant Tukadoji Maharaj Nagpur University or any other university in India.

Submitted By

Roshan Markande
Shubham Tankhiwale
Harish
Kamini Nimje
CONTENTS

Title Page…………………………………………………………………………..i

Certificate………………………………………………………………...........ii

Acknowledgement……………………………………………………………iii

Declaration……...……………………………………………………………….iv

Contents…………………………………………………………………..........vi

List of Figures…………………………………………………………………..vii

Abstract…………………………………………………………………...........ix

Chapter: 01 Synopsis…………………………………………………………01
Chapter: 02 Introduction
2.1 Overview………….……………………………………………….23
2.2 FileZilla (FTP/S) .......................................................24
2.3 Uses of FTP/S …………………………………………………28
2.4 Cloud Computing…………………………………………..31
2.5 Microsoft Azure

Chapter: 03 Problem Definition and Scope


3.1 Problem Statement………………………………………...45
Chapter: 04 System Requirement
4.1 Hardware Requirement…………………………………...46
4.2 Software Requirement………………………………….....46
Chapter: 05 System Design & Implementation
5.1 Design…………………………………………………….47
5.2 Proposed System……………………………………….....54
Chapter: 06 Testing Specifications
6.1 Unit Testing……………………………………………....57
6.2 Integration Testing……………………………………....57
6.3 System Testing…………………………………………58
6.4 Acceptance Testing…………………………………….58
Chapter: 07 Advantages and Disadvantages
7.1 Advantages……………………………………………..59
7.2 Disadvantages………………………………………….59
Chapter: 08 Result and Analysis
8.1 Result…………………………………………………..60
8.2 Analysis / Output………………………………………60
Chapter: 09 Conclusion and Future Scope
9.1 Conclusion……………………………………………..67
9.2 Future Scope…………………………………………...67
Chapter: 10 References……………………………………………..69
Appendix
LIST OF FIGURES

a) Fig 1: Cloud gaming overview…………………………………………3

b) Fig 2.1.2: A generic cloud-based platform for online games…………7

c) Fig 5.1.1: DFD Level 0…………………………………………………20

d) Fig 5.1.2: DFD Level 1 Game…………………………………………21


e) Fig 5.1.3: DFD Level 2 Server Game Engine…………………………22

f) Fig 5.1.4: DFD Level 3 Client Game Engine…………………………23


g) Fig 5.2.1: Game play use case…………………………………………24
h) Fig 5.2.2: Game play use case (AI Player)……………………………25
i) Fig 5.2.3: Login use case………………………………………………25
j) Fig 5.2.4: Menu interface use case……………………………………26
k) Fig 5.2.5: Chat use case…………………………………………………27
l) Fig 6.1: Internal flow of command……………………………………38

m) Fig 8.1: HTTP request monitoring……………………………………41

n) Fig 8.2: HTTP request monitoring as per demand……………………41


ABSTRACT

Everyone wants to play their favorite games on whatever device they have, but in the past this was
not possible for everyone, because the configuration and graphics requirements of a game were often
not supported by the device, which caused trouble. To play a game, users first had to install it
according to their mobile, laptop, or PC configuration. Today, technology like Microsoft Azure helps
to publish and play any game without demanding graphics hardware or a particular device
configuration. Microsoft Azure provides smart resources for gaming over the internet. We study the
problem of optimally adapting an ongoing cloud gaming session to maximize the gamer experience in
dynamic and challenging environments. The considered problem is quite challenging because the
resource allocation involves a large optimization space. Conducting Microsoft Azure cloud services
over the live internet for online gaming on multiple devices is very useful for all users. We enhance
the cloud gaming platform to provide better-quality resources to game users' devices, so they can
play high-quality games with minimum resource requirements and at low cost.

Keywords:

Cloud computing, cloud gaming, video coding, computer graphics, remote gaming,
resource scheduling.
CHAPTER 1
INTRODUCTION

Everyone wants to play their favorite games on whatever device they have. In
the past this was not possible for everyone, because the configuration and
graphics requirements of a game were often not supported by the device, which
caused trouble. To play a game, users first had to install it according to
their mobile, laptop, or PC configuration. Today, technologies like
virtualization and the cloud make it possible to play any game without
demanding graphics hardware or a particular device configuration. Cloud gaming
technology provides smart resources for gaming over the internet.

Cloud Gaming:

Cloud gaming, sometimes called gaming on demand, is a type of online gaming. Currently there
are two main types of cloud gaming: cloud gaming based on video streaming and cloud gaming based on
file streaming. Cloud gaming aims to provide end users frictionless and direct playability of games across
various devices.

1.1 Gaming on Microsoft Azure:

Build your game backend infrastructure

Azure provides you with choice and flexibility to build your game backend in the cloud. For compute, you can
use IaaS offerings like Virtual Machines and VM Scale Sets on Windows and Linux, or leverage PaaS offerings like
Service Fabric and App Service. For data storage, you can use managed database services like Azure SQL
Database and Azure DocumentDB, as well as MongoDB and other options from Azure Marketplace.
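
As an illustration of the IaaS route, the following sketch (our own example, not taken from the Azure documentation; the resource group, VM name, region, image and size are hypothetical placeholders) creates a single backend VM with the Az PowerShell module:

# A minimal sketch, assuming the Az PowerShell module is installed.
# All names below are hypothetical placeholders.
Connect-AzAccount

New-AzResourceGroup -Name "game-backend-rg" -Location "CentralIndia"

$cred = Get-Credential    # local administrator account for the new VM

New-AzVM -ResourceGroupName "game-backend-rg" `
         -Name "game-server-01" `
         -Location "CentralIndia" `
         -Image "Win2019Datacenter" `
         -Size "Standard_D2s_v3" `
         -Credential $cred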
Keep players engaged

Enable multiplayer scenarios and leaderboards with Azure Active Directory. For instance, manage
social identity providers such as Facebook, Google, and Microsoft. And manage player retention
and increase user engagement and monetization across platforms using Azure Notification
Hubs and Azure Media Services.

Crunch big data for deeper game analytics

Build a powerful end-to-end game analytics platform on Azure using tools from the Cortana Intelligence
Suite and big data solutions. Analyze mobile gamers’ behavior using services like Azure Machine
Learning and Azure Mobile Engagement to maximize app usage, user retention, and monetization.

Welcome to Azure Gaming…


1.2 Why is the particular project chosen?

Why we use Microsoft Azure for gaming...

Scale as your game grows


Dynamically scale your game experiences to over 50 global regions—more than any other cloud
provider—on demand. Deliver your game to consoles, PCs, mobile devices and the web. Powering
games like Minecraft, Halo and Sea of Thieves, Azure enables built-in virtual networking and load
balancing. Plus, automatically sync between cloud and on-premises systems.

Unmatched security and reliability


Preserve the latency and integrity of your most demanding gaming workloads with the cloud that
handles billions of game events each day. Protect your game from DDoS attacks with a self-healing
platform which provides backups, rollbacks and failovers—bolstered by a 99.99 percent availability
SLA.
Speed game development and amplify player engagement
Build and operate games with a single platform. Azure FileZilla gives you everything you need—
managed game services, data analytics and LiveOps tools—to launch faster, extend your game’s
lifecycle and reduce your costs.

Microsoft Azure for the Gaming Industry


Cloud computing is increasingly important for today’s global gaming ecosystem, empowering developers of
any size to reach gamers in any part of the world. Azure’s 54 datacenter regions and its robust global
network provide globally available, high-performance services, as well as a platform that is secure, reliable,
and scalable to meet current and emerging infrastructure needs. For example, earlier this month we
announced the availability of the Azure South Africa regions. Azure services enable every phase of the game
development lifecycle, from design and building through testing, publishing, monetization, measurement,
engagement, and growth, providing:

 Compute: Gaming services rely on a robust, reliable, and scalable compute platform. Azure customers can
choose from a range of compute- and memory-optimized Linux and Windows VMs to run their workloads,
services, and servers, including auto-scaling, microservices, and functions for modern, cloud-native games.

 Data: The cloud is changing the way applications are designed, including how data is processed and stored.
Azure provides high availability, global data, and analytics solutions based on both relational databases as
well as big data solutions.

 Networking: Azure operates one of the largest dedicated long-haul network infrastructures worldwide, with
over 70,000 miles of fiber and sub-sea cable, and more than 130 edge sites. Azure offers customizable
networking options to allow for fast, scalable, and secure network connectivity between customer premises
and global Azure regions.

 Scalability: Azure offers nearly unlimited scalability. Given the cyclical usage patterns of many games, using
Azure enables organizations to rapidly increase and/or decrease the number of cores needed, while only
having to pay for the resources that are used (see the sketch after this list).
 Security: Azure offers a wide array of security tools and capabilities, to enable customers to secure their
platform, maintain privacy and controls, meet compliance requirements (including GDPR), and ensure
transparency.

 Global presence: Azure has more regions globally than any other cloud provider, offering the scale needed
to bring games and data closer to users around the world, preserving data residency, and providing
comprehensive compliance and resiliency options for customers. Using Azure’s footprint, the cost, the time,
and the complexity of operating a game at global scale can be reduced.

 Open: With Azure you can use the software you choose, whether it be operating systems, engines, database
solutions, or open source – run it on Azure.
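
To make the scalability point concrete, here is a minimal sketch (our own illustration; the resource group and scale set names are hypothetical) of dialing a VM Scale Set's capacity up and down with the Az PowerShell module:

# Assumes an existing VM Scale Set; names are hypothetical placeholders.
$vmss = Get-AzVmss -ResourceGroupName "game-rg" -VMScaleSetName "game-servers"

# Scale out for peak hours...
$vmss.Sku.Capacity = 20
Update-AzVmss -ResourceGroupName "game-rg" -VMScaleSetName "game-servers" -VirtualMachineScaleSet $vmss

# ...and back in off-peak, so you only pay for the cores you actually use.
$vmss.Sku.Capacity = 4
Update-AzVmss -ResourceGroupName "game-rg" -VMScaleSetName "game-servers" -VirtualMachineScaleSet $vmss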

We’re also excited to bring FileZilla into the Azure family. Together, Azure and FileZilla are a powerful
combination for game developers. Azure brings reliability, global scale, and enterprise-level security, while
FileZilla provides Game Stack with managed game services, real-time analytics, and comprehensive LiveOps
capabilities.

We look forward to meeting many of you at GDC 2019 to learn about your ideas in gaming, discussing where
cloud and cloud-native technologies can enable your vision, and sharing more details on Azure for gaming.
Join us at the conference or contact our gaming industry team at azuregaming@microsoft.com.

Details on all of these are available via links below.

 Learn more about Microsoft Game Stack.

 Talks at GDC:

 Thursday, March 21, 2019 at 11:30 AM: Best Practices for Building Resilient, Scalable, Game Services
in Microsoft Azure

 Thursday, March 21, 2019 at 12:45 PM: Save Time for Creativity: Unlocking the Potential for Your
Game's Data with Microsoft Azure
 Azure Gaming Reference Architectures: Landing Page

 Multiplayer/Game Servers

 Analytics

 Leaderboards

 Cognitive Services
 GDC Booth demos for Azure:
 AI Training with Containers – Use Azure and Kubernetes to power Unity ML Agents

 Game Telemetry – Build better game balance and design

 Build NoSQL Data Platforms – Azure Cosmos DB: a globally distributed, massively scalable NoSQL
database service

 Cross Realms with SQL – Build powerful databases with Azure SQL
Get Started with Azure for Gaming
Modern games require more powerful development tools, global and flexible multiplayer support,
and new revenue models. But you’re here to build worlds, not back ends. Let Azure manage your
platform so you can focus on making games that make headlines.

Some advantages of building your own game services and backend from scratch include:

 Finer control over the backend services and data that are running your game.
 Creation of a custom solution or features to run your game that existing services do not
provide.
 Optimizing costs by paying for only what you use each month.

Getting Started
o Create your Azure free account
o General Guidelines

Reference Architectures
o Multiplayer / Game Servers
o Analytics
o Leaderboards
o Cognitive Services

Reference Architectures
The reference architectures will help start you on the path of building gaming services for your
game. Each architecture is composed of:

 A highly abstract diagram showing the different pieces and how they interact with each
other.
 A deployment template and installation steps, to help you get started quickly with the
implementation of the architecture.
 A list of considerations to give you a sense of the scope of requirements covered in the
architecture.
 In most cases, a sample project, so you can quickly test the deployed infrastructure in your
own Azure account.
 A high-level step by step guide and implementation details to help you understand the
sample.

Multiplayer Backend Reference Architectures


These reference architectures describe a variety of multiplayer backend use cases and
implementations with different alternatives, enabling you to architect a cloud solution that works for
your game.
Use Cases
There are many variables which can be taken into consideration when designing a
multiplayer backend for your game. Here are some examples:

1. Level of management - From putting all the effort on yourself to letting the platform take
care of everything
2. Operating system - Windows or Linux
3. Where does the game session run - Dedicated server or peer-to-peer (P2P)
4. Hardware - What does the game session need to be able to run
5. Processing time - Real-time or non-real time (NRT)
6. Latency - Whether players with lag will be at a disadvantage or not
7. Persistence - Whether the game world continues to exist and develop internally even when there are
no players interacting with it, or each session has its own beginning and end
8. Number of concurrent players - Small, medium or large
9. Reconnection allowed - If a player or players get disconnected, can they go back to the
game, or do they have to start a new game session

Following are some multiplayer backend use cases for you to explore:

 Multiplayer matchmaker
 Synchronous multiplayer
 Asynchronous multiplayer

Basic Game Server Hosting on Azure

This reference architecture details the steps to set up a basic Azure backend that will host a game server on
either Windows or Linux, using a Minecraft server as an example.
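
As a rough sketch of this setup (our own example, not the official deployment template; the names are hypothetical and 25565 is simply the default Minecraft server port), the VM could be provisioned like this with the Az PowerShell module:

# Minimal sketch; all names are hypothetical placeholders.
$cred = Get-Credential    # local administrator for the game server VM

New-AzVM -ResourceGroupName "minecraft-rg" `
         -Name "minecraft-vm" `
         -Location "WestEurope" `
         -Image "Win2019Datacenter" `
         -Credential $cred `
         -OpenPorts 3389, 25565   # RDP plus the default Minecraft port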
Architecture diagram:

Azure regions/datacenters:

Azure has more global regions than any other cloud provider, offering the scale needed to bring
applications closer to users around the world, preserving data residency, and offering
comprehensive compliance and resiliency options for customers.

Azure Global Regions…


Understand Azure global infrastructure

Regions
A region is a set of datacenters deployed within a latency-defined perimeter and connected
through a dedicated regional low-latency network.
With more global regions than any other cloud provider, Azure gives customers the flexibility to
deploy applications where they need to. Azure is generally available in 44 regions around the
world, with plans announced for 10 additional regions.
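
To check which regions your own subscription can actually deploy to, a quick query (our own example) with the Az PowerShell module:

# Lists every Azure region available to the signed-in subscription.
Connect-AzAccount
Get-AzLocation | Select-Object Location, DisplayName | Sort-Object Location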

Geographies
A geography is a discrete market, typically containing two or more regions, that preserves data
residency and compliance boundaries.
Geographies allow customers with specific data-residency and compliance needs to keep their data
and applications close. Geographies are fault-tolerant to withstand complete region failure through
their connection to our dedicated high-capacity networking infrastructure.

Availability Zones
Availability Zones are physically separate locations within an Azure region. Each Availability Zone is
made up of one or more datacenters equipped with independent power, cooling, and networking.
Availability Zones allow customers to run mission-critical applications with high availability and low-
latency replication.

Overview of Cloud Game Infrastructure

Cloud gaming, sometimes called gaming on demand, is a type of online gaming. Currently there are two
main types of cloud gaming: cloud gaming based on video streaming and cloud gaming based on file
streaming. Cloud gaming aims to provide end users frictionless and direct playability of games across
various devices.

This solution provides an overview of common components and design patterns used to host game
infrastructure on cloud platforms.

Video games have evolved over the last several decades into a thriving entertainment business. With
broadband Internet becoming widespread, one of the key factors in the growth of games has been online
play.

Online play comes in several forms, such as session-based multiplayer matches, massively multiplayer virtual
worlds, and intertwined single-player experiences.

In the past, games using a client-server model required the purchase and maintenance of dedicated
on-premises or co-located servers to run the online infrastructure, something only large studios and
publishers could afford. In addition, extensive projections and capacity planning were required to meet
customer demand without overspending on fixed hardware. With today's cloud-based compute resources,
game developers and publishers of any size can request and receive any resources on demand, helping to
avoid costly up-front monetary outlays and the dangers of over- or under-provisioning hardware.

Fig. 1. Cloud Gaming Overview

The design philosophy of GamingAnywhere in the Cloud

Extensibility:
GamingAnywhere adopts a modularized design. For example, platform-dependent components such as audio
and video capturing, and platform-independent components such as codecs and network protocols, can be
easily modified or replaced.

Portability:
Gamers may use devices with heterogeneous architectures to access GamingAnywhere, and thus it is
important to port GamingAnywhere to as many platforms as possible. New platforms can be easily supported
by replacing the platform-dependent components in GamingAnywhere.

Configurability:
System researchers may conduct experiments with diverse system parameters. A large number of built-in
audio and video codecs are supported by GamingAnywhere.
In addition, GamingAnywhere exposes all available configurations to gamers, so that it is possible to try out
the best combinations of parameters by simply editing a text-based configuration file for various usage scenarios.
CHAPTER 2
PROBLEM DEFINITION AND SCOPE

A CLOUD-BASED PLATFORM TO MASSIVIZE ONLINE GAMING:


We discuss the requirements and the design of a generic platform to
massivize online games (OGs).

2.1.1. Requirements for Cloud-Based Gaming Platforms


We identify three main requirements.
1. Virtual-World Management:
The cloud-based game platform should be scalable to serve millions of players
online, elastically match the number of players, always be available, and be
consistent and low-latency.

2. Game-Data Processing:
The massive number of players in the virtual world leads to massive amounts of
data: user interactions, uploaded screenshots and videos, social networking, etc.
Analyzing these data can help the system designers to understand player behavior
and to gain insight into system operation, thus allowing them to build better
games for the players. Cloud-based data processing could enable an elastic, and
thus efficient, platform for time-based and graph analytics.

3. Game-Content Generation:
Game content, from bits such as textures to abstract puzzles and even entire
game designs, is at the core of the entertainment value of games. Until the early
2000s, manual labor ensured that the quality and quantity of game content
matched the demands of the playing community, but this approach is not scalable
due to the exponential growth in the number of users and in production costs. A
cloud-based game platform providing elastic, cost-effective, procedural generation
of player-customized content could lead to a vast improvement over the capabilities
of today’s generation of games.

2.1.2. Cloud-Based Platform Architecture


The cloud infrastructure is closely related to its architecture & comprises many loosely coupled cloud
components.
The broad divisions of cloud architecture are:

 Front-end
 Back-end

It is the responsibility of the back-end to provide the security of data for cloud users along with the traffic
control mechanism. The server also provides the middleware, which helps connected devices
communicate with each other.

Figure - Cloud Computing Architecture

Businesses use cloud infrastructures to work with these applications. Unlike subscription-based pricing
models, the payment structure of the cloud enables the user to subscribe to vendor services, & cloud
infrastructures are paid for on a 'pay-per-use' basis.

The cloud technology architecture also consists of front-end platforms (as read in the early chapters) called
the cloud client, which comprises servers, thin & fat clients, tablets & mobile devices. The interaction is done
through middleware, via web browser, or via virtual sessions. According to Jason Bloomberg of ZapThink, the
cloud-oriented architecture can essentially be the building block of IoT (Internet of Things), in which anything
can be connected to the internet. The cloud architecture is a combination of both service-oriented
architecture & event-driven architecture, so cloud architecture encompasses all elements of the cloud
environment.
Cloud Infrastructure:

The cloud technology also has a specific type of infrastructure that allows it to give so much advantage to its
users. Cloud computing as a whole is a combination of different hardware & software that together make
cloud technology work.

Defining Cloud Infrastructure:

It refers to the software along with the hardware components such as storage drives, servers,
virtualization software, other cloud management software, and other networking devices; all work together to
support the computing requirements of the cloud computing model. Moreover, the cloud technology holds a
software abstraction layer that virtualizes the cloud resources & presents them to users locally.

Cloud Infrastructure Management Interface (CIMI) is an open standard API that is used to manage the cloud
infrastructure. It enables its users to handle all the cloud infrastructure easily by providing a means to
interact with the provider & their consumer or developer.

Figure - Components of Cloud Infrastructure

The hypervisor can be defined as the firmware (a permanent set of instructions or code programmed into the
read-only memory; a low-level program) that acts as a manager for the virtual machine. It is also called a
Virtual Machine Monitor (VMM), which creates & runs the virtual machines. It provides the guest OS with a
virtual operating platform and manages the execution of guest applications. There are two types of
hypervisor: Type 1 (bare-metal) and Type 2 (hosted).
The limitations of cloud technology concerning infrastructure are:

 Scalability
 Intelligent Monitoring
 Security

To address the three main requirements, we propose the architecture of a cloud-based platform depicted in
Figure 1. The architecture is based on three pillars: virtual-world management, game-data processing, and
game-content generation. Responding to Requirement 1, the virtual-world management pillar addresses
game hosting and the management of players in the virtual world. The game-data processing pillar
addresses Requirement 2: it analyzes time-based and graph data corresponding to players and their games.
Focusing on Requirement 3, the game-content generation pillar generates player-customized content at
massive scale. The virtual-world management pillar provides in-game data to the game-data processing
pillar and uses content produced by the game-content generation pillar. We describe the challenges and
opportunities of the virtual-world management pillar in Section 3, and focus on the other two pillars in the
remainder of this section. We focus on systems challenges; others, such as finding new ways to use data and
to generate content, fall outside the scope of this work.

Fig. 2.1 – A generic cloud-based platform for massivizing online games. The three pillars are virtual-world
management, game-data processing, and game-content generation.
2.2.3: CHALLENGES AND OPPORTUNITIES IN MICROSOFT CLOUD
MANAGEMENT
Overcoming the Challenges of Microsoft Azure:

This is an interesting time for organizations that are heavily invested in Microsoft
technologies.

If you haven’t yet moved to the cloud, you’re probably seriously considering
Azure. And if you have already moved to the Azure cloud, you may be grappling
with some very real management and resource challenges.

With help from leading industry analysts, we’ve compiled a wealth of data and
found some interesting insights into Azure. We took that information and built an
easy-to-read infographic, aptly named “Overcoming the Challenges of Microsoft
Azure.” You can check it out at the bottom of this post.

The first finding: you’re in good company. We found that 65 percent of IT decision
makers are seriously considering Azure as their public cloud platform. And most of
those considering Azure share concerns around three key areas: maintenance,
security and the IT skill set required to run Azure.

But the biggest challenges still stem from balancing a high volume of Azure
maintenance needs against finite IT resources. From architecture design to
patching and database administration, there are about 14 different administrative
areas that need frequent attention. That translates into budget dollars that aren’t
going into innovation or new technology investments. What’s more, if you’ve
been searching for qualified Azure specialists, you know they’re increasingly hard
to find.

We hope you’ll check out the infographic below. You may find it especially helpful
if you’re building a business case for outsourcing Azure management, or if you
need to identify the key factors to weigh while building an in-house team (such as
the cost of Azure-related expertise).

This infographic is the first of a series of blog posts we’re developing that will dive
deeper into the four key challenges highlighted in the infographic:

1. The burden of maintaining Azure


2. The cost of Azure-related expertise
3. The critical need to ensure security
4. Microsoft and Rackspace: A perfect match
FileZilla claims there is not enough disk space on our Microsoft Azure webserver
We run a website on a Microsoft Azure webserver, but today we faced a problem when trying to upload files
to that webserver using the FileZilla FTP client. Whenever I try to upload anything, FileZilla responds with "550
There is not enough space on the disk". I then log on to the administrator account for Microsoft Azure and

see that we have only spent 2% of our 50 GB of available disk space. I tried restarting the webserver, but that
did not do the trick. I see some people mention some sort of .temp file, which could possibly take some disk
space; however, I can't seem to locate such a file anywhere in the FTP directory. I am short on ideas, so any
help would be appreciated.

Azure Migration Challenges? Here are 10 ways to overcome them


2.3 PROPOSED SYSTEM
Microsoft Azure is an ever-expanding set of cloud services to help your organization meet your business
challenges. It is the freedom to build, manage and deploy applications on a massive, global network using
your favorite tools and frameworks.

It is the cloud for all


We believe that the success made possible by the cloud must be accessible to every business and every
organization —small and large, old and new.

Why Azure is the right choice

Productive
Reduce time to market by delivering features faster with more than 100 end-to-end services.

Use any tool, language or framework


Quickly turn ideas into solutions to get up and running fast—just bring your code. Build applications
with the language of your choice, including Node.js, Java, .NET and use what you already know and
love.
Work with best-in-class development tools for PC or Mac, such as Visual Studio and Visual Studio
Code, to increase your productivity with features that let you focus on what matters most: writing
great code.
Get mobile apps into the hands of users faster by streamlining the mobile development lifecycle
with Visual Studio App Center, including automated builds and testing for cross-platform, hybrid
and native apps on iOS and Android.

The only consistent and comprehensive hybrid cloud

Go beyond connecting your datacenter to the cloud. Take advantage of the broadest set of hybrid
capabilities and deliver true hybrid consistency in your applications, data, identity, security and management
across on-premises and cloud environments.

Build solutions on the platform that is hybrid by design

Create a truly consistent experience across your hybrid cloud using comprehensive Azure cloud capabilities.
Reduce complexity and risk with the platform, tools and services designed to work together across your on-
premises and cloud environments. Build and deploy your applications consistently, seamlessly manage data,
enable anywhere access with single sign-on and deliver integrated security and management across on-
premises and the cloud.
Develop breakthrough apps with built-in intelligence
Take advantage of a comprehensive set of services, infrastructure and tools to build AI-powered
experiences. Build bots that naturally interact with users, and use built-in advanced analytics tools to make
faster predictions.

Use a rich set of Azure data and AI services such as Azure Databricks, Azure Cosmos DB, Azure Cognitive
Services and Azure Bot Service to enable new experiences in your apps with human-like intelligence.

Rely on managed service capabilities such as built-in monitoring, threat detection, automatic patching and
backups.

Streamline compliance and transform your business

Help ensure you are compliant with industry-specific standards by taking advantage of the certifications
offered by the Microsoft Cloud—the platform with the most comprehensive compliance portfolio of any
cloud provider. Simplify your organization’s compliance with offerings that provide built-in controls,
configuration management tools, and implementation and guidance resources. Use third-party audit reports
to verify that your cloud assets adhere to the strict security controls that industry and government standards
mandate.

“We did look at different platforms, including Azure, Google, Amazon,


and VMware. If we wanted to put the whole city in the cloud, we needed
Azure.”

ASOS, a leading online fashion retailer, transformed its platform from a monolithic, on-premises e-
commerce system to a microservices platform running on Microsoft Azure. In 2016, the new platform
handled more than double the volume of Black Friday orders from the previous year. The rapidly growing
company with 15.4 million customers has also accelerated development of innovative mobile apps and
features to quickly target new markets and stay on top of consumer and technology trends.
Onboarding developers in hours instead of weeks…

Daimler AG, one of the world’s largest manufacturers of premium cars and trucks, is driving hard to be a key
player in software. To speed up software development and thus innovation, Daimler uses Microsoft Azure
DevTest Labs. By developing in Azure, the company can onboard developers in hours versus weeks, get new
ideas underway faster, and attract top talent with a state-of-the-art development environment. Peter
Rothlaender, Manager of Cloud Solutions at Daimler, explains why Daimler is honking its horn about Azure
DevTest Labs.

Architecture of Microsoft Azure environment


This topic describes how Kentico works within the Microsoft Azure environment using Azure Cloud
Services, which features of the service are utilized and how the application stores and manages its
data in the cloud.

Application
If you choose to install Kentico to Microsoft Azure, all files will be grouped into a solution based on Visual
Studio's Microsoft Azure template. The solution contains several projects. One of them is a web application,
which encompasses almost all the functions of Kentico and is designed to run as an Azure ASP.NET Web
role.
The Smart search worker is separated from the web application in another project because it cannot run
together with the application as the Web role. To index the content of websites correctly and effectively, the
Smart search worker runs as an Azure Worker role.
Because the application is divided into these two services, you also need to configure them separately.
See Configuring an Azure project.

Database
Kentico on Microsoft Azure uses an Azure SQL relational database. This database engine is almost identical
to the standard SQL Server engine, with only a few limitations. These limitations are taken into account in
Kentico, and no additional configuration or customization is required. If you're interested in which SQL
Server features are not available in Azure SQL, refer to SQL Server Feature Limitations (Azure SQL
Database) on MSDN.

File storage
Microsoft Azure does not offer a persistent file system similar to the file systems in Windows that you are
used to. Data stored within Azure cannot be hierarchically organized into folders. However, Kentico provides
an abstract layer called CMS.IO, which enables the system to operate on different types of file storages.
See Working with physical files using the API for more information.
The CMS.IO namespace acts as an intermediary between the Kentico business layer and various file
storages, including Azure blob storage. On a standard non-Azure installation, CMS.IO only overrides the
System.IO namespace. On Microsoft Azure, the namespace uses a provider which works with the blob
storage, creating an imitation of the regular Windows file system. The CMS.IO namespace can be extended
to support any other type of storage, e.g. the Amazon cloud drive.
Additionally, you can make use of the Azure storage provider and store files in the cloud even if you're
running a non-Azure installation. You can find more information about this approach in Configuring Azure
storage.
The file storage is shared across multiple web role instances; therefore, no file synchronization is needed.

Multiple web role instances


The Kentico application can run in multiple instances of one web role on Microsoft Azure. Therefore, the data
must be synchronized between these instances. Kentico handles the synchronization by considering each
instance a web farm server. When you set the number of instances, the web farm servers are configured
automatically. Also, web farm tasks are created and executed automatically so no manual configuration is
needed.
When you need to optimize cache performance, you can create a Cache service on the Azure Management
Portal and configure it to synchronize cache. See Storing session state information in Azure Cache Service for
details.

Storing session state


Every complex web application needs to store information about its state, especially user session data. Since
the Azure environment is dynamic and the application does not reside constantly in one place, its state has
to be stored separately. When your application uses only one web role instance, the session state data can
be stored on that instance. If your application uses two or more web role instances, you need to configure
your project to store the session state data elsewhere. We recommend these options for storing the session
state data:

 In Microsoft Azure SQL Database - easy to set up, suitable for small projects or projects with mostly read
access to web pages – see Storing session state information in Azure SQL database.
 In Microsoft Azure Cache - more suitable for larger projects than the Azure SQL database option, but
requires configuration of Azure Cache – see Storing session state information in Azure Cache Service.

Alternative approaches
This section describes the setup and deployment of Kentico to an Azure Cloud Service, as this service is the
best option for most projects. However, there are other alternative environments, in which Kentico is
supported as well.

Microsoft Azure Web Sites or Virtual Machines


Web Sites are quick to deploy and easy to set up and manage. However, they do not offer as many
customization options as cloud services. See Microsoft Azure Web Sites for information on how to set up
Web Sites.
Virtual Machines allow you to configure and maintain a virtual server in the cloud. They offer more control
but require complex setup. You can start using the Virtual Machines by creating one in the Azure
Management Portal.
For a more detailed comparison of all three services provided by Microsoft Azure, see Azure Web Sites,
Cloud Services and Virtual Machines comparison on MSDN.

Microsoft Azure project development lifecycle

When you develop projects on Microsoft Azure, in typical cases you want to begin with a small project
which uses the fewest resources possible. Then, as your project grows, you configure it to
utilize more resources to accommodate the performance and size requirements of the project. This topic
presents the main levels of development on Microsoft Azure and provides links to related configuration tasks,
which you need to perform when ascending to a higher level.

Level 0 - Local development


You can choose to begin developing your Azure project locally in an emulator before deploying it to the
cloud environment. In this case, set up a database and Azure Storage service and configure the web role's
settings. See Developing Azure projects locally.

Level 1 - Development
For the duration of project development, one instance of the CMSApp web role is usually enough. To configure
a project to use one web role instance, perform the basic configuration tasks.

Level 2 - Production with SLA


When you deploy your project and switch to the production environment, you may want to qualify
for the Microsoft SLA. In such a case, your project must use at least two instances of the CMSApp web role. When
you increase the number of used web role instances, you need to adjust your project to synchronize the
data between the instances and to store session state information.
For data synchronization, you can use the default web farm synchronization tasks, in which case you do not
need to configure anything.
For storing session state information, you can use Microsoft Azure SQL Database – see Storing session state
information in Azure SQL Database.

Level 3 - Performance
When the performance of the level 2 environment is not sufficient, you can configure the Azure Cache
Service to store session state information – see Storing session state information in Azure Cache Service.

Level 4 - Scalability
When you need even more power, you can further scale your project using the following approaches:

 Utilize larger cloud services


 Use more web role instances
 Configure auto scaling



Microsoft Azure Web Sites
Microsoft Azure Web Sites are quick to deploy and easy to set up and manage. However, they do not offer
as many customization options as cloud services. For a more detailed comparison, see Azure Web Sites,
Cloud Services and Virtual Machines comparison on MSDN.

Create and Publish a Webpage in Azure.

Prerequisites:

To learn Windows Azure, you need to be familiar with the Windows environment and have a basic
knowledge of cloud computing. To publish a webpage in Azure you need basic knowledge of Visual
Studio and must know how to create a webpage in Visual Studio 2019.

Procedure:

Steps:

1) First of all, open Visual Studio 2019.


2) Create a new project, select ASP.NET Core Web Application, and hit Next.

3) After that, configure your new project by setting the project name and location, and hit Create.
4) Select Web Application (Model-View-Controller) and hit Create.

5) Now your "My First App" project is created.

6) Now, to run your website, get the IIS Express certificate and run IIS Express.
Your webpage is shown like this.
7) Now start to publish.

8) Pick a publish target, create a new App Service, and hit Publish.

9) To create the App Service, log in with your Azure Portal account and hit Create.
10) Now publish your website.

After publishing you get the message “Azure Successfully Configured”.


Now copy your site URL and go to your Azure Portal.
Open the Chrome browser, paste that URL, and browse to it.
You now have your ASP.NET website/webpage in the Azure Portal, like this.

11) Go to your Visual Studio 2019 project, open it, and make some changes now.
For that, go to your “index.cshtml” page and change your code like this.
12) Go to Solution Explorer, right-click your project, and hit Publish.

13) Run the webpage where you made the changes.


After the changes, your webpage will look like this.

14) Go to your Azure Portal and open the MyFirst App Service.

15) Go to your MyFirst Web Service overview.

Copy the URL and browse to it.


16) After browsing, your website will look like this.

Now your website is successfully published in Azure.
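
The same App Service can also be created from the command line instead of through the Visual Studio dialogs. A minimal sketch (our own example; the resource names are hypothetical, and the web app name must be globally unique) with the Az PowerShell module:

# Hypothetical names throughout; adjust before running.
Connect-AzAccount

New-AzResourceGroup -Name "myfirst-rg" -Location "CentralIndia"

New-AzAppServicePlan -ResourceGroupName "myfirst-rg" -Name "myfirst-plan" `
                     -Location "CentralIndia" -Tier "Free"

New-AzWebApp -ResourceGroupName "myfirst-rg" -Name "myfirst-app" `
             -Location "CentralIndia" -AppServicePlan "myfirst-plan"

You can then publish from Visual Studio by selecting the new web app as an existing App Service target.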

Create, monitor, and manage FTP files by using Azure Logic Apps
With Azure Logic Apps and the FTP connector, you can create automated tasks and workflows that create,
monitor, send, and receive files through your account on an FTP server, along with other actions, for
example:

 Monitor when files are added or changed.


 Get, create, copy, update, list, and delete files.
 Get file content and metadata.
 Extract archives to folders.

You can use triggers that get responses from your FTP server and make the output available to other actions.
You can use actions in your logic apps to manage files on your FTP server. You can also have other
actions use the output from FTP actions. For example, if you regularly get files from your FTP server, you can
send email about those files and their content by using the Office 365 Outlook connector or Outlook.com
connector. If you're new to logic apps, review What is Azure Logic Apps?

Limits
 FTP actions support only files that are 50 MB or smaller unless you use message chunking, which lets
you exceed this limit. Currently, FTP triggers don't support chunking.
 The FTP connector supports only explicit FTP over SSL (FTPS) and isn't compatible with implicit FTPS.

Prerequisites
An Azure subscription. If you don't have an Azure subscription, sign up for a free Azure account.

Your FTP host server address and account credentials

The FTP connector requires that your FTP server is accessible from the internet and set up to operate
in passive mode. Your credentials let your logic app create a connection and access your FTP account.

Basic knowledge about how to create logic apps


The logic app where you want to access your FTP account. To start with an FTP trigger, create a blank logic
app. To use an FTP action, start your logic app with another trigger, for example, the Recurrence trigger.

Connect to FTP
Before your logic app can access any service, you must create a connection between your logic app and that
service. If you didn't previously create this connection, you're prompted for connection information when
you add a trigger or action for that service to your logic app. The Logic Apps Designer provides an easy way
for you to create this connection directly from your logic app.

1. Sign in to the Azure portal, and open your logic app in Logic App Designer, if not open already.
2. For blank logic apps, in the search box, enter "ftp" as your filter. Under the triggers list, select the
trigger you want. -or-

For existing logic apps, under the last step where you want to add an action, choose New step, and
then select Add an action. In the search box, enter "ftp" as your filter. Under the actions list, select
the action you want.

To add an action between steps, move your pointer over the arrow between steps. Choose the plus
sign (+) that appears, and then select Add an action.

3. Provide the necessary details for your connection, and then choose Create.
4. Provide the necessary details for your selected trigger or action and continue building your logic app's
workflow.
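
If you prefer scripting to the designer, a logic app can also be deployed from a saved workflow definition. A minimal sketch (our own example; the names and the definition file path are hypothetical placeholders):

# Assumes the Az.LogicApp module; workflow.json holds the FTP trigger and
# actions you would otherwise assemble in the Logic App Designer.
New-AzLogicApp -ResourceGroupName "ftp-logic-rg" `
               -Name "ftp-file-watcher" `
               -Location "CentralIndia" `
               -DefinitionFilePath ".\workflow.json"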

When requesting file content, the trigger doesn't get files larger than 50 MB. To get files larger than 50 MB,
follow this pattern:

 Use a trigger that returns file properties, such as when a file is added or modified (properties only).
 Follow the trigger with an action that reads the complete file, such as Get file content using path, and
have the action use message chunking.

Examples

FTP trigger: When a file is added or modified

This trigger starts a logic app workflow when it detects that a file has been added or changed on an FTP
server. So, for example, you can add a condition that checks the file's content and decides whether to get
that content, based on whether that content meets a specified condition. Finally, you can add an action that
gets the file's content and puts that content in a folder on the FTP server.

Enterprise example: You can use this trigger to monitor an FTP folder for new files that describe customer
orders. You can then use an FTP action such as Get file content, so you can get the order's contents for
further processing and store that order in an orders database.

When requesting file content, triggers can't get files larger than 50 MB. To get files larger than 50 MB, follow
this pattern:

 Use a trigger that returns file properties, such as when a file is added or modified
(properties only).
 Follow the trigger with an action that reads the complete file, such as Get file content using
path, and have the action use message chunking.
A valid and functional logic app requires a trigger and at least one action. So make sure you add an action
after you add a trigger.

Here is an example that shows this trigger: When a file is added or modified

1. Sign in to the Azure portal, and open your logic app in Logic App Designer, if not open already.
2. For blank logic apps, in the search box, enter "ftp" as your filter. Under the triggers list, select this
trigger: When a file is added or modified - FTP
3. Provide the necessary details for your connection, and then choose Create.
4. By default, this connector transfers files in text format. To transfer files in binary format, for example,
where and when encoding is used, select Binary Transport.
5. Next to the Folder box, choose the folder icon so a list appears. To find the folder you want to monitor
for new or edited files, select the right angle arrow (>), browse to that folder, and then select the
folder.

Your selected folder appears in the Folder box.

Now that your logic app has a trigger, add the actions you want to run when your logic app finds a new or
edited file. For this example, you can add an FTP action that gets the new or updated content.

FTP action: Get content


This action gets the content from a file on an FTP server when that file is added or updated. So for example,
you can add the trigger from the previous example and an action that gets the file's content after that file is
added or edited.

When requesting file content, triggers can't get files larger than 50 MB. To get files larger than 50 MB, follow
this pattern:

 Use a trigger that returns file properties, such as When a file is added or modified (properties only).
 Follow the trigger with an action that reads the complete file, such as Get file content using path, and
have the action use message chunking.

Here is an example that shows this action: Get content

1. Under the trigger or any other actions, choose New step.


2. In the search box, enter "ftp" as your filter. Under the actions list, select this action: Get file content -
FTP
3. If you already have a connection to your FTP server and account, go to the next step. Otherwise,
provide the necessary details for that connection, and then choose Create.
4. After the Get file content action opens, click inside the File box so that the dynamic content list
appears. You can now select properties for the outputs from previous steps. From the dynamic
content list, select the File Content property, which has the content for the added or updated file.

The File Content property now appears in the File box.

5. Save your logic app. To test your workflow, add a file to the FTP folder that your logic app now
monitors.
2.4 LITERATURE SURVEY

FileZilla® Pro Announces support for Microsoft Azure Cloud Services


Expanded feature set provides Web developers and designers file access and transfer abilities
across growing number of Cloud protocols.

FileZilla®, the popular open source file transfer product targeted at administrators, developers,
engineers and power users, today announced FileZilla Pro native support for Azure Files and Azure
Blob Storage.

The FileZilla product roadmap is continuously expanding beyond FTP-like protocols to allow Web
designers, editors, and Webmasters to easily access even more cloud services.

Through new protocol support, users can now access and transfer their files in even more scenarios
through FileZilla’s reliable, powerful user interface. The addition of Microsoft Azure Files and Azure
Blob Storage offers FileZilla users new ways to store their data in the cloud:

 Azure Blobs provides client libraries and a REST interface that allows unstructured data to be stored
and accessed at a massive scale in block blobs.
 Azure Files provides an SMB interface, client libraries, and a REST interface that allows access from
anywhere to stored files.

For an in-depth description of the Azure storage protocols and the scenarios for when to use them,
visit https://docs.microsoft.com/en-us/azure/storage/common/storage-decide-blobs-files-disks
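
For comparison, uploading a file to Azure Blob Storage can also be scripted in a few lines with the Az.Storage PowerShell module (a minimal sketch; the account name, key, container and file are hypothetical placeholders):

# Hypothetical storage account, key, container, and file.
$ctx = New-AzStorageContext -StorageAccountName "gamestore01" `
                            -StorageAccountKey "<account-key>"

Set-AzStorageBlobContent -File ".\game-assets.zip" `
                         -Container "uploads" `
                         -Blob "game-assets.zip" `
                         -Context $ctx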

FileZilla Pro allows users to easily share files across all types of remote servers and computing
environments. In addition to Azure, FileZilla-supported formats include Amazon S3, S3-compliant
cloud services, and WebDAV, among others. Future releases will include support for OpenStack, Google
Cloud Storage, and more.

Availability

FileZilla Pro can be downloaded from https://filezillapro.com, with the Mac version available from the Apple App
Store at https://itunes.apple.com/app/filezilla-pro/id1298486723; other versions are available from the FileZilla
website, https://filezilla-project.org/

Creating a Highly Available FTP Server on Microsoft Azure

About a year ago, at the announcement of the general availability of Infrastructure as a Service, I tested a
single-box FTP server using FileZilla Server. The main benefit of FileZilla is its easy installation and
configuration.
Single-box configuration:
The following posts will guide you in creating a single-box server using FileZilla:

http://digitalmindignition.wordpress.com/2012/11/28/azure-vm-role-with-filezilla-as-ftp-server
http://huberyy.wordpress.com/2012/08/03/set-up-a-ftp-server-in-windows-azure-virtual-machine-with-filezilla-no-coding
Azure Virtual Machines need maintenance from time to time, and you should always avoid a single point of
failure; reasons enough for a highly available configuration. This post is not a step-by-step
guide showing you how to create a VNET, VMs, DFS, and so on.
You need some experience with the Azure platform and Windows Server; I will help you put the pieces
together for a highly available FTP server running on the Azure platform.
Highly Available Topology:

 Dedicated Virtual Network.


 Two Virtual Machines, working as AD, DFS and FTP server.

 Azure availability set.


 Traffic will be routed through the Azure Load Balancer.
Topology remarks:
 I tried using a new VM feature, Public Instance IPs for virtual machines. This feature is nice, but
unfortunately the Windows Explorer client can’t handle the IP change behind the scenes. “The IP address
supplied as part of the 227 response must be identical to the address of the FTP site the client initially
connected to.” (http://stackoverflow.com/questions/18478594/ftp-clients-filezilla-and-ftpuse-work-but-windows-explorer-cannot-engage-pasv-m).
PIP was announced as a feature to support passive FTP; strangely enough, Windows Explorer can’t handle it.
 I’m using DFS (Distributed File System) as a highly available network share. I also tried a new Windows Azure
feature currently in preview, Azure File Services; it’s very useful for shared storage between Virtual Machines. IIS
and FileZilla are not able to work with this feature, so it’s not useful for our purposes.
Virtual Network and Virtual Machines:
 Create a new Virtual Network, choose a region, create an affinity group,…
 When your Virtual Network has been provisioned, create two new Virtual Machines and add them both to
the VNET.

 When creating the first VM, create an availability set. When creating the second VM, join the availability set
you just created.

 Use the same cloud service name for the second VM as the one you defined at creation time of the first VM.

 When creating a new VM, the first thing you should do is change the Windows Update and UAC settings.

 Attach an empty data disk to both Virtual Machines and format it. (Will be used for DFS and FTP file storage)
Active Directory:
 Install AD on both virtual machines. (http://azure.microsoft.com/en-us/documentation/articles/active-directory-new-forest-virtual-machine)
 Create an admin user in the domain for future usage. (Domain admin).

 Set the private IP of the first VM as DNS server in the VNET.

 Add the second VM to the domain and promote it to a Domain Controller.

 Add the private IP of the second VM as DNS server in the VNET.
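
As a rough sketch of the first two AD steps above (our own example; the domain name is a hypothetical placeholder), run the following inside the first VM:

# Install the AD DS role and promote this VM to the first domain controller.
Install-WindowsFeature AD-Domain-Services -IncludeManagementTools
Install-ADDSForest -DomainName "corp.local" -InstallDns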

FTP and IIS:


 Install IIS and FTP service on both servers.

 Configure the FTP services (publish FTP services).

 Create a DFS share and set up shared IIS config (you can use a shared config when doing the initial setup;
when you go live you will need to disable it due to the port settings).
You will find all the information to do this on the following sites:
http://windowsazureguide.net/2013/09/29/how-to-create-highly-available-load-balanced-ftp-server-on-
windows-azure (This post is not complete, but contains useful information)
http://www.iis.net/learn/manage/managing-your-configuration-settings/shared-configuration_264
http://blogs.msdn.com/b/wats/archive/2013/12/13/setting-up-a-passive-ftp-server-in-windows-azure-
vm.aspx
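For the installation bullet above, a minimal PowerShell sketch (assuming the standard Windows Server role names) could look like this:

# Install IIS together with the FTP server role service; run this on both servers.
Install-WindowsFeature Web-Server, Web-Ftp-Server -IncludeManagementTools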
FTP users and folders:
Should you have problems remembering how to configure user access in IIS, the following posts will guide you. If you work with a domain instead of, for example, local users, you need to create a folder with the domain name in IIS; don't forget this!
https://community.rackspace.com/products/f/25/t/491
http://technet.microsoft.com/en-us/library/jj134201.aspx
Azure Load Balancer:
Open port 21 and load balance it between the two machines. Don't forget to join the load-balanced port on the second virtual machine!
Now here is where the magic happens to enable passive FTP. I was not able to find any solution for this on the internet, but the following did the trick. (You could use the Public Instance IP (PIP), but then your Windows Explorer clients will not be able to connect.)
You open a specific range of passive FTP ports on the first VM, and another specific range of ports on the second server. This way FTP traffic will always be routed to the right server.
To avoid a lot of manual work you can use PowerShell to open a range of ports:

# Load the classic (ASM) Azure module and select the subscription to work in.
Import-Module Azure
Add-AzureAccount
Select-AzureSubscription "your subscription"

# Get the VM and add one endpoint per passive FTP port (6051-6100 on this VM).
$vm = Get-AzureVM -ServiceName "yourvmservicename" -Name "yourvm"
For ($i = 6051; $i -le 6100; $i++)
{
$name = "FTP-Dynamic-" + $i
Write-Host -Fore Green "Adding: $name"
$vm = Add-AzureEndpoint -VM $vm -Name $name -Protocol "tcp" -PublicPort $i -LocalPort $i
}

# Persist the new endpoints to the running VM.
Write-Host -Fore Green "Updating VM..."
$vm | Update-AzureVM
Write-Host -Fore Green "Done."
Now you can specify the machine-specific range in IIS on each machine; secondly, you need to specify the public IP of your cloud service in IIS. Note that deallocating both Virtual Machines will make you lose your public IP (since the latest Azure announcements it is possible to reserve your IP). Don't forget to allow FTP through your Windows Firewall! A sketch of these last steps follows.
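As a hedged sketch of how these last steps could be scripted rather than clicked through (the site name, port range and public IP are placeholder values consistent with the example above):

# Set the passive data-channel port range for the FTP service (server level).
& "$env:windir\system32\inetsrv\appcmd.exe" set config /section:system.ftpServer/firewallSupport /lowDataChannelPort:6051 /highDataChannelPort:6100

# Advertise the cloud service's public IP in passive-mode replies (site level;
# 'Default FTP Site' and 203.0.113.10 are placeholders).
& "$env:windir\system32\inetsrv\appcmd.exe" set config -section:system.applicationHost/sites "/[name='Default FTP Site'].ftpServer.firewallSupport.externalIp4Address:203.0.113.10" /commit:apphost

# Allow the FTP control port and this machine's passive range through the Windows Firewall.
New-NetFirewallRule -DisplayName "FTP (in)" -Direction Inbound -Protocol TCP -LocalPort @("21", "6051-6100") -Action Allow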

There are many related work in the literature about cloud gaming and the useful in
Many aspects. Not neglecting the important issue of user’s data on the cloud.
2.5 Benefits:

Global Availability

One of the primary advantages of Microsoft Azure is that it is geography-agnostic. In the modern business world, most companies require that their data has global reach, and companies that rely on centralized datacenters can sometimes find global expansion difficult. The features and adaptability of Azure are not contingent on location.
Last year, BP, the world's 12th-largest company by revenue, chose to move its workloads out of central datacenters into Microsoft Azure as part of its corporate modernization efforts. BP will leverage the Azure platform as a service (PaaS) for data visualization, workflows and predictive analytics.

Security
One of the primary trepidations companies have when migrating to the cloud is security. On-premises datacenters allow direct control over security measures that comply with company standards, and security remains one of the primary pillars of Azure. With the Azure Security Center, customers can maintain peace of mind with access to 24/7 unified security management across hybrid cloud networks.
With this Azure feature, customers gain visibility into security across both on-premises and cloud networks. Users can automatically deploy standardized security measures when onboarding new applications. Any security threats can be immediately remediated using either built-in Azure security assessments or proprietary assessments. Within VMs, Azure controls leverage machine learning to protect against malware and viruses, and can detect and respond swiftly to attacks.
Azure positions itself as not only a secure cloud network but an added security layer for all company datacenters.

Scalability
Scaling involves a system adapting to a change in the amount of workload or traffic in a given application; this involves both increasing and decreasing application flows. One of the greatest features of Azure is its flexibility to scale. Scaling cloud networks involves creating or assigning servers to an application. With Azure, scaling is logistically simpler than with traditional hosting, as primary servers do not need to be taken down, and companies gain the benefits of scaling in the cloud because they don't need to add physical infrastructure in order to scale.

Disaster Recovery

With cybersecurity and risk management remaining at the forefront of IT and business, disaster recovery is key for cloud services. Azure prioritizes disaster recovery through unprecedented coverage in regional and global fail-over options, hot and cold standby models and rolling reboot capabilities. This level of disaster recovery can be difficult to achieve in on-premises models but comes standard out of the box with Azure.

Cost Savings and Flexible Expenditure
Cost savings continues to be one of the main drivers behind migrating to cloud services. Azure markets up to 72% savings with pay-as-you-go pricing. Companies can also choose to go the hybrid route and maintain their on-premises datacenters while reaping the benefits of flexibility and backup security in the Azure cloud. Raley's, a leading supermarket company, realized cost savings by migrating data to Azure; Microsoft Azure provided Raley's with many advantages including security, scalability, flexibility, reliability, support, cost savings and a first-class user experience.

Compliance
Compliance can fall by the wayside when companies think about data; they would rather focus on tangible business outcomes. However, compliance must work in lockstep with business and system development to mitigate risk exposure and costly mistakes down the line. Microsoft holds more certifications than any other cloud provider, leading in the areas of security and privacy requirements, including GDPR.

Development Focused Integrated Delivery Pipeline
With Azure, companies can focus on end-to-end solutions for development. Azure focuses more on the
entire package than just storage. With Azure, customers gain access to an integrated delivery pipeline for
sourcing, testing, delivery and go-live. Taking advantage of Azure’s delivery pipeline allows for greater
business continuity and better integration of data and workflows. It also fosters opportunities to create a
true DevOps culture.
Disadvantages of Microsoft Azure Cloud Test Labs
Despite all the benefits of running a test lab on Azure, there are some Hyper-V features that are not supported on Azure, along with disadvantages associated with running test labs in a remote datacenter.

1. Lack of Hyper-V Snapshot Support

For those who have used Hyper-V, VMware or similar virtualization solutions, it's likely that you're used to taking snapshots of VMs and having the ability to roll back to a previous state. Despite being based on Windows Server and Hyper-V technology, Azure doesn't support snapshots.
2. Inability to Upload Custom Images

It's easy to transfer images and files around a local area network (LAN) or use portable disks for the same purpose. Once your lab is in the cloud, while Azure provides the option to upload images, transferring large files is likely to be much slower than the speeds you could expect on your LAN. Moreover, custom images must be prepared to run in VMs on Azure; you can't attach the product .ISO file to a VM and install an OS as you can in Hyper-V.
3. Provisioning Virtual Machines in the Cloud Takes Longer than On-Premises

De-allocating VMs from the Azure fabric overnight or when they're not being used is a good idea to reduce costs. The downside is that provisioning VMs the next time you need them is more time-consuming than starting an on-premises Hyper-V VM, so expect longer wait times to get labs up and running each morning.
4. Lack of Integrated Backup

Azure Backup is intended for backing up and restoring data located on your on-premises servers to the cloud. That's a great feature, but it's not really useful for doing bare-metal restores of servers in a remote datacenter. While there are various ways to back up servers using blob storage, the process is somewhat convoluted and not supported by Microsoft as a backup solution for production servers.
5. Poor Management GUI and Tools

In short, the current iteration of the Microsoft Azure Management Console is frustrating to work with. It is slow to respond and update, and requires far too many clicks to achieve simple tasks, sometimes needing you to click the back button to re-access features.
Microsoft currently has a new Azure portal in preview, but at the time of writing, it doesn't support the creation or management of virtual machines. I've tried several third-party tools, along with Microsoft's own Visual Studio IDE, but all provide just basic functionality and can't replace the web-based management portal.
6. No Access to Windows Client Images

One last peeve is that unless you have an MSDN subscription, there is no access to Windows 7 or 8 images in the gallery. You can upload your own custom images, but they are unsupported and require you to maintain them as you would in your own on-site lab.
Benefits of Test Labs in Microsoft Azure

I've written about CloudShare and Azure in detail before, so here I want to briefly discuss the benefits of testing in the cloud, specifically with Azure as my solution of choice.

Microsoft provides images, which are updated on a regular basis and can be used to provision virtual machines (VMs). If you manage the status of VMs carefully, costs can be kept to a minimum. As long as servers are specified correctly for the task, VMs run significantly faster than anything I could provision in my own office lab.

Naturally, the expandability of the cloud allows organizations to provision test labs quickly, without having to worry about the costs or logistics of providing staff with physical hardware, and the ability to manage and provision VMs using PowerShell is a plus (see the sketch below).
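For instance, a minimal sketch with the classic Azure cmdlets (the cloud service name is a placeholder) to park lab VMs overnight and bring them back in the morning:

# Deallocate every VM in the lab's cloud service to stop compute billing.
Import-Module Azure
Get-AzureVM -ServiceName "yourlabservice" | Stop-AzureVM -Force

# The next morning: start them again (expect the longer provisioning time noted above).
Get-AzureVM -ServiceName "yourlabservice" | Start-AzureVM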
CHAPTER 3
GANTT CHART
Table 3.1: Gantt chart
CHAPTER 4
SOFTWARE REQUIREMENT SPECIFICATION
Secure FTP Server Windows 2016

Cloud Infrastructure Services

Secure FTP server solution using SSL/TLS. Fast deployment with secure access.

Secure FTP Server Solution using SSL/TLS

Secure FTP Server is a full-featured FTP server with support for secure SSL/TLS connections, IP security, anti-FXP options, per-user speed limits, user groups and MODE Z compression. It provides a plain but easy-to-use interface.

Secure FTP Server is a server that supports FTP and FTP over TLS which provides secure encrypted
connections to the server. This FTP VM image is deployed on Windows 2016 and is fully configured for quick
and easy deployment.

Secure FTP supports TLS, the same level of encryption supported by your web browser, to protect your data.
When using TLS your data is encrypted so that prying eyes cannot see it, and your confidential information is
protected. It also supports on-the-fly data compression, which can improve the transfer rates.
Features:
• Supports FTP and FTP over SSL/TLS (FTPS)
• Incorporates FileZilla® source code
• Supports resume and transfer of large files (>4 GB)
• Compression with DEFLATE (MODE Z)
• Encryption with SSL/TLS (for FTPS)
• Per-user permissions on the underlying file system
• Active Directory integration via LDAP
• User and group permissions on the FTP folder directories
• GUI configuration tool
• Speed limits
• Tabbed user interface
• Powerful Site Manager and transfer queue
• Bookmarks
• Drag & drop support
• Configurable transfer speed limits
• Filename filters
• Directory comparison
• Network configuration wizard
• Remote file editing
• Keep-alive
• HTTP/1.1, SOCKS5 and FTP-Proxy support
• Logging to file
• Synchronized directory browsing
• Remote file search
Disclaimer: This FTP server solution is built using a modified version of the FileZilla® server open-source software and is provided under the GPLv2 license. FileZilla® is a registered trademark of its respective owners. No warranty of any kind, expressed or implied, is included with this software. Use at your own risk; responsibility for damages (if any) to anyone resulting from the use of this software rests entirely with the user. The author and trademark owners are not responsible for any damage that its use could cause.
Ports

The following ports are required to be opened if you are using an NSG or firewall appliance:
• 21 (FTP)
• 990 (FTPS)
• 14147 (optional; for the FTP Server admin interface)
• 50000-51000 (passive mode for data transfer)

Please read the configuration steps that you'll need to perform after the install at: How to setup FTP server on Azure
Setup Secure FTP Server on Azure Server 2016

Once your Azure VM has been deployed, there are some post-configuration steps to complete before you can start using this FTP server.

Login
Log in using the credentials that were supplied during the VM creation.

Launch FileZilla Server Instance
Launch the FileZilla Server instance app, found on the desktop. On the launch screen press Connect as shown below (the password is blank).
Passive Mode
You should now be connected. You may see connection errors and NAT errors; this is normal, as we still need to complete some configuration. From the menu select:

> Edit > Settings > Passive Mode Settings

You'll need to set a passive mode port range, usually 50000-51000. These ports are used for data transfers to the server.
Set Public IP Address

For this next part you'll need to make sure the VM has a public IP address to allow external clients to connect, as shown in yellow.

To attach a public IP address to your VM, follow Microsoft's guide.

Once you have a public IP address associated with the NIC on your Azure VM, add the IP address to the passive mode settings as shown below highlighted in yellow, along with the passive port range:
Create Certificate (FTP over TLS)

The next step is to create a new private key and a self-signed certificate, needed by FileZilla Server to accept TLS connections.

Within the FileZilla Server options, click on SSL/TLS settings and check Enable FTP over TLS support (FTPS).

Next click on Generate new certificate and fill in your company information.

IMPORTANT: In the common name (server address) field, make sure to add the public DNS name of your Azure VM. This can be found in the Azure portal, as highlighted in yellow.

Save the key locally on the server and then press Generate certificate. There is no need to add a password.
Setup Users
There are two options:
1. Create local users and assign access.
2. Integrate Active Directory and allow users to use their domain logins to authenticate.

Option 1 (Local Users)

To set up local users and give access to directories locally on your server, navigate to Edit > Users. Here you can add users and generate their passwords.
Then give the users access to the local folders you would like them to have access to. The VM has a pre-configured folder at C:\FTPDirectory that can be used, or you can set up as many folders as you like. Groups can also be set up and permissions applied at a group level.
Option 2 (Active Directory Integrated)
Open Settings > LDAP and select Enable LDAP support (beta).

Add the private IP address of your local domain controller, add port 389, and enter your domain name.

Select Enable TLS/SSL.

Next you need to add the users who need access to your FTP directories.

Select Edit > Users. Here you'll need to add each user's full UPN, the one they use to log on to AD. For example, if their account is jsmith@yourdomain.com (or yourdomain\jsmith), make sure to add it so it matches their login UPN, jsmith@yourdomain.com. We don't need to add their password here, as authentication happens against Active Directory, so make sure the password checkbox is unchecked.

Next, check the boxes LOCAL and LDAP as in the screenshot below.

In the screenshot below I've added a test user from our AD called fuser, and our AD domain is called yourdomain.com.

Next, assign these users to the FTP directories they need access to. Click on Shared Folders within the Users menu, add the local folders and assign the permissions they need.

Now would be a good time to test whether you can connect using an FTP client. If you can't connect, try the next step and configure any NSG/firewall rules.
Configure NSG Rules / Firewall Rules

If you have NSGs or firewall appliances in Azure you will need to open access to the following ports (a scripted sketch follows the list):
• Port 21 (used for FTP)
• Port 990 (used for FTPS)
• Port 14147 (used for FTP server administration)
• Passive port range 50000-51000 (used for data transfer)
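A hedged sketch with the AzureRM PowerShell cmdlets; the NSG name, resource group and rule priorities are placeholder assumptions, not values from this guide:

# Fetch the NSG and append one inbound allow rule per FTP-related port range.
$nsg = Get-AzureRmNetworkSecurityGroup -Name "ftp-nsg" -ResourceGroupName "ftp-rg"
$rules = @{ "Allow-FTP" = "21"; "Allow-FTPS" = "990"; "Allow-FTP-Admin" = "14147"; "Allow-FTP-Passive" = "50000-51000" }
$priority = 100
foreach ($rule in $rules.GetEnumerator())
{
$nsg | Add-AzureRmNetworkSecurityRuleConfig -Name $rule.Key -Access Allow -Direction Inbound -Priority $priority -Protocol Tcp -SourceAddressPrefix * -SourcePortRange * -DestinationAddressPrefix * -DestinationPortRange $rule.Value | Out-Null
$priority += 10
}

# Push the updated rule set back to Azure.
$nsg | Set-AzureRmNetworkSecurityGroup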
Client FTP Software

To allow clients to connect, users can use any FTP client, for example FileZilla's FTP client.
Plan + Pricing

The cost of running this product is a combination of the selected software plan charges plus the Azure infrastructure costs for the virtual machines on which you will be running this software. Your Azure infrastructure price might vary if you have enterprise agreements or other discounts; pricing can be viewed in different currencies, and costs might vary by deployment region.

Software plan details

Secure FTP Server Windows 2016: secure FTP server solution using SSL/TLS, with fast deployment and secure access. Starting at $0.036/hour, priced by virtual machine instance.
Azure Web App: Connect to your site via FTP and upload/download files

I will describe how to connect to an Azure Web App via FTP, using FileZilla as the FTP client. Readers are free to choose their own FTP client.
Download the publish settings file from the Azure Portal:
• Log in to the Azure portal: https://portal.azure.com
• Click on App Services.
• Select the site and then click on Get publish profile.
• Save the file and open it in notepad.exe.
• The file contains two <publishProfile> sections: one for Web Deploy and another for FTP.
• Under the <publishProfile> section for FTP, make a note of the following values:
  • publishUrl (hostname only)
  • userName (this is the information you are looking for)
  • userPWD
Below is a publishsettings file from one of my test sites. Every file has a unique username and password. The user can also reset the password; however, that is beyond the scope of this post and will be discussed in another post altogether.

NOTE: We need only the hostname (waws-prod-db3-011.ftp.azurewebsites.windows.net) from the FTP publishUrl and not the complete path.
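If you prefer not to hunt through the XML by hand, a minimal PowerShell sketch (the file path is a placeholder) can pull out the three values:

# Parse the .PublishSettings XML and select the FTP publish profile.
[xml]$publishData = Get-Content "C:\temp\mysite.PublishSettings"
$ftpProfile = $publishData.publishData.publishProfile | Where-Object { $_.publishMethod -eq "FTP" }

# publishUrl is a full ftp:// URL; FileZilla needs only the hostname part of it.
$ftpHost = ([uri]$ftpProfile.publishUrl).Host
Write-Host "Host: $ftpHost"
Write-Host "User: $($ftpProfile.userName)"
Write-Host "Pass: $($ftpProfile.userPWD)"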
Connect using FileZilla:
• Download and install FileZilla (click here to download FileZilla).
• Launch FileZilla.
• Go to the File menu > Site Manager.
• Under Site Manager, click on the New Site button and give it a descriptive name.
• Under the General tab, set the following values:
  • Host: paste the hostname from the publishUrl obtained from the publishsettings file above.
  • Logon Type: set this to Normal.
  • User: paste the userName obtained from the publishsettings file above.
  • Password: paste the userPWD obtained from the publishsettings file above.
• Click on Connect to connect to the site over FTP.
• You will see two folders under the root: Logfiles and Site.
• The log files folder, as the name indicates, provides storage for the various logging options you see under the CONFIGURE management page on the Azure portal.
• The site folder is where the application resides; to be more specific, the code resides here: /site/wwwroot.
• Thus, Azure Web Sites gives the user the flexibility to create, upload and download files and folders to the corresponding site via FTP.
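The same transfer also works without a GUI client. As a hedged sketch, .NET's WebClient can upload a file over FTP from PowerShell; the hostname, site name and credentials below are the placeholder values from the publish-profile discussion above.

# Upload a local file straight into /site/wwwroot over FTP.
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential('mysite\$mysite', 'userPWD-from-profile')
$client.UploadFile("ftp://waws-prod-db3-011.ftp.azurewebsites.windows.net/site/wwwroot/index.html", "C:\temp\index.html")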
4.1 Hardware Requirements

Recommended system configuration:
• Quad-core CPU
• 4 GB RAM
• 20 GB storage space
• Graphics card with at least 1 GB RAM

Minimum system requirements:
• Dual-core CPU over 2.4 GHz
• 2 GB RAM

4.2 Software Requirements

• FileZilla Server 0.9.59
• Dreamweaver 8
• Windows Server 2012

Deploy your app to Azure App Service using FTP/S

This section shows you how to use FTP or FTPS to deploy your web app, mobile app backend, or API app to Azure App Service.

The FTP/S endpoint for your app is already active; no configuration is necessary to enable FTP/S deployment.
Open FTP dashboard

In the Azure portal, open your app's resource page. To open the FTP dashboard, click Deployment Center > FTP > Dashboard.

Get FTP connection information
In the FTP dashboard, click Copy to copy the FTPS endpoint and app credentials.

It is recommended that you use app credentials to deploy to your app, because they are unique to each app. However, if you click User Credentials, you can set user-level credentials that can be used for FTP/S login to all App Service apps in your subscription.
Deploy files to Azure
1. From your FTP client (for example, Visual Studio, Cyberduck, or WinSCP), use the connection information you gathered to connect to your app.
2. Copy your files and their respective directory structure to the /site/wwwroot directory in Azure (or the /site/wwwroot/App_Data/Jobs/ directory for WebJobs).
3. Browse to your app's URL to verify the app is running properly (a scripted check is sketched below).
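As a quick hedged check from PowerShell (the app URL is a placeholder):

# Request the site's root page; an HTTP 200 status indicates the app is up.
$response = Invoke-WebRequest -Uri "https://ftp-demo-app.azurewebsites.net" -UseBasicParsing
Write-Host "HTTP status: $($response.StatusCode)"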

Enforce FTPS
For enhanced security, you should allow FTP over SSL only. You can also disable both FTP and FTPS if you don't use FTP deployment.

In your app's resource page in the Azure portal, select App settings in the left navigation. To disable unencrypted FTP, select FTPS Only; to disable both FTP and FTPS entirely, select Disable. When finished, click Save. If using FTPS Only, you must enforce TLS 1.2 or higher by navigating to the SSL settings blade of your web app; TLS 1.0 and 1.1 are not supported with FTPS Only.
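The same settings can also be scripted. A hedged sketch with the Azure CLI, reusing the placeholder app and resource group names from elsewhere in this chapter:

# Allow only FTPS for deployment and require TLS 1.2 for the web app.
az webapp config set --name ftp-demo-app --resource-group ftp-demo-rg --ftps-state FtpsOnly
az webapp config set --name ftp-demo-app --resource-group ftp-demo-rg --min-tls-version 1.2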
Automate with scripts

For FTP deployment using the Azure CLI, see Create a web app and deploy files with FTP (Azure CLI).

Create an App Service app and deploy files with FTP using Azure CLI
This sample script creates an app in App Service with its related resources, and then deploys a static HTML page using FTP. For the FTP upload, the script uses cURL as an example; you can use whatever FTP tool you like to upload your files.
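The sample script itself is not reproduced here; the following is a hedged sketch of its flow, run from PowerShell. All names and the region are placeholder assumptions.

# Create the resource group, App Service plan and web app.
az group create --name ftp-demo-rg --location westeurope
az appservice plan create --name ftp-demo-plan --resource-group ftp-demo-rg --sku FREE
az webapp create --name ftp-demo-app --resource-group ftp-demo-rg --plan ftp-demo-plan

# Read the FTP endpoint and credentials from the publishing profile.
$ftp = (az webapp deployment list-publishing-profiles --name ftp-demo-app --resource-group ftp-demo-rg --query "[?publishMethod=='FTP'].{url:publishUrl, user:userName, pwd:userPWD}" --output json | ConvertFrom-Json)[0]

# Upload a static page with cURL (curl.exe, not the PowerShell alias); any FTP tool works.
curl.exe -T index.html -u "$($ftp.user):$($ftp.pwd)" "$($ftp.url)/"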

If you don't have an Azure subscription, create a free account before you begin.

Open Azure Cloud Shell

Azure Cloud Shell is a free, interactive shell that you can use to run the steps in this article. Common Azure tools are preinstalled and configured in Cloud Shell for you to use with your account. Select Copy to copy the code, paste it in Cloud Shell, and then press Enter to run it. There are a few ways to open Cloud Shell.

If you choose to install and use the CLI locally, you need Azure CLI version 2.0 or later. To find the version, run az --version. If you need to install or upgrade, see Install the Azure CLI.
Clean up deployment
After the sample script has been run, the following command can be used to remove the resource
group and all resources associated with it.
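Presumably, matching the placeholder resource group used in the sketch above:

# Delete the resource group and everything in it (--yes suppresses the prompt).
az group delete --name ftp-demo-rg --yes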
CHAPTER 5
SYSTEM DESIGN AND IMPLEMENTATION

5.1 Data Flow Diagrams

5.1.1 Level 0 DFD:

The Level 0 DFD shows the system at the highest level. The whole software is referred to as the 'Game' process, with which the player and developer interact. All the necessary game data is stored in the game data repository. Player interaction involves keyboard/mouse and chat message inputs, and display and chat message outputs. The game continuously queries 3D model information, AI data, and static/dynamic room information, and updates the necessary game data.

Fig 5.1.1: Level 0
5.1.2 Level 1 DFD "Game":

The Level 1 DFD indicates the main functions of the game. The player provides the game with login info, which is authenticated and matched with the corresponding personal info of the player. This information is then passed to the Client Game Engine. The Input Handler is the layer where player inputs are processed and converted into action information that is evaluated in the Client Game Engine and, together with the necessary room data from the repository, combined into scene data. This is combined by the Graphics module with 3D model data to construct the 3D scene to be displayed to the player. The actions performed by the player are transformed into events and sent to the Server Game Engine by the Client Game Engine, which also receives the incoming events from the Server Game Engine. The Server Game Engine is the central part where events from clients are processed and distributed, making heavy use of the Game Data Repository as well as the configuration data from the developer. There is a further process, the Chat Handler, which handles the chat message traffic among the human and AI players.

Fig 5.1.2: Level 1 "Game"
5.1.3 Level 2 DFD "Server Game Engine":

This Level 2 DFD introduces a detailed view of the Server Game Engine, which is the core process of the game on the server side. First of all, the Channel Resolver process is responsible for handling the incoming event request traffic by addressing the requests to the relevant channels, which may be different for each client. The Event Dealer is the part where event information is converted into actual events and sent to the Event Order, where the events are ordered with respect to time and address constraints. Those ordered events, with the game data and room information from the repository, are processed in the Game Mechanics module. The resultant event information is sent back to the Event Dealer, and is eventually sent to the relevant clients. The Game Mechanics process also interacts with the AI Engine, sending and receiving event data, while the AI Engine is capable of sending messages to the Chat Handler. Finally, configuration data from the developer comes directly to Game Mechanics and is processed.

Fig 5.1.3: Level 2 "Server Game Engine"
5.1.4 Level 2 DFD "Client Game Engine":

This Level 2 DFD shows the inner dynamics of the Client Game Engine, which is the core process of the game on the client side. Initially, the authentication decision is sent to Client Game Mechanics. From then on, action info from the Input Handler is validated and sent to Client Game Mechanics. When changed, the current room state is sent to the State Screen Transformer, combined with static room info from the repository and sent to Graphics. Events are handled similarly to those on the server side: event requests are made by Client Game Mechanics to the Server Game Engine, and the incoming event information is received in the Event Dealer. After being converted to events, they are sent to the Event Order and ordered. Game Mechanics receives and processes the ordered events.

Fig 5.1.4: Level 2 "Client Game Engine"
5.2 Use Case Diagrams:

The use case diagrams provided below are in direct correspondence with each of the functional requirements items discussed in the Project Requirements section of the report. The reader is encouraged to revisit that section, if necessary.

5.2.1 Game Play Use Case:

This use case displays the actions that the player can perform in game play without interaction with other players. We grouped these actions into four: Movement, Camera, Items and Menu Entrance. The player can walk, jump or crouch in the game; those movements can be executed forward, backward, left or right. The player can also change the camera view in the game.

Items are an important concept in the game. There are two types of items: Wall Items are non-moveable and can only be used, while Inventory Items are moveable. An inventory item can be picked up, dropped, equipped or unequipped. Only one item can be equipped at a time, and only the equipped item can be used. The last action is the menu entrance; the player can enter the menu whenever he wants.

Fig 5.2.1: Game Play Use Case
5.2.2 AI Player Game Playing Use Case:

This use case displays the actions that an AI player can perform in the game. The AI player in our game is somewhat restricted; it can only perform movement actions, which are walk, jump and crouch. As before, the directions are forward, backward, left and right.

Fig 5.2.2: Game Play Use Case (AI Player)

5.2.3 Login Use Case:

This use case displays the player's login to the game. The authentication server checks the player's user name and password and, according to the check, either allows or disallows the player's entrance to the game.

Fig 5.2.3: Login Use Case


5.2.4 Menu Interface Use Case:

This use case displays the things that the player can do in the Main Access Menu. He can view his profile, edit it and accept the changes. He can also view his elapsed time in the cube. Other functionalities are entering and leaving the game.

Fig 5.2.4: Menu Interface Use Case

5.2.5 Chat Use Case:

This use case displays the chat usage in the game. The player can chat with other players in the game. This chat can be public, which includes all players in a room, or private, which can only be done between two players. The player can also interact with AI players in the game; this interaction is, of course, limited.

Fig 5.2.5: Chat Use Case
Microsoft Azure:

Introduction:
Microsoft Azure is Microsoft's cloud computing platform, providing a wide variety of
services you can use without purchasing and provisioning your own hardware. Azure
enables the rapid development of solutions and provides the resources to
accomplish tasks that may not be feasible in an on-premises environment. Azure's
compute, storage, network, and application services allow you to focus on building
great solutions without the need to worry about how the physical infrastructure is
assembled.

The Azure portal

An online management portal provides the easiest way to manage the resources you deploy into Azure. You can use
this to create virtual networks, set up Web Apps, create VMs, define storage accounts, and so on, as listed in the
previous section. As noted earlier in this chapter, there are currently two versions of the portal. The production
portal is the Azure portal at https://portal.azure.com. Most features have been moved to the Azure portal, with
some exceptions such as Azure AD. The previous portal is called the classic Azure portal
(https://manage.windowsazure.com), and it can still be used to manage Azure AD and to configure and scale classic
resources such as Cloud Services.

Deploy your app/game to Azure App Service using FTP/S

Introduction:
FileZilla is a free and open source FTP client for Windows, Mac and Linux. It is developed and maintained by
Tim Kosse and the FileZilla team. Development started in 2001 and it has evolved to become one of the most
popular FTP clients in use today.

FileZilla's feature list includes:
• Encryption of file transfers with Secure FTP and FTP-SSL.
• Managing files with cut and paste.
• Connecting without a password using SSH keys.
• Storing your passwords and site preferences in the Site Manager.
• Bookmarking of local and remote folders for easy folder navigation.

Of all the FTP client programs out there, FileZilla is far and away the one used by the most ExaVault users to connect to their account. FileZilla supports the following protocols for connecting to your ExaVault account:
• FTP
• SFTP
• FTP-SSL

Downloading and Installing FileZilla

If you don't already have FileZilla, you'll want to get it downloaded and installed. Installing FileZilla is pretty straightforward, too; installers are available for the Windows, Mac, and Linux operating systems.

Installation:
1. Download the desired edition of the FileZilla client. For use with ExaVault, the standard (free!) version of the client will have all the features you need.
2. Double-click the downloaded file.
3. Follow the installation prompts, using the default options for installation.

Go to this link: https://filezilla-project.org/
After that, download the FileZilla Client for your platform and hit Download.
Now run your FileZilla setup and install it. After installation you will get this window on your system.

Host Your Web Apps on Azure via FileZilla

Log in to your Azure portal first. Go to WEB APPS and select the "NEW" button at the bottom of the window.

5.3 MS Azure Website:

A web page (also written as webpage) is a document that is suitable to act as a web resource on the World Wide
Web. When accessed by a web browser it may be displayed as a web page on a monitor or mobile device.

FileZilla is a free-software, cross-platform FTP application, consisting of FileZilla Client and FileZilla Server. Client binaries are available for Windows, Linux, and macOS; server binaries are available for Windows only. Both server and client support FTP and FTPS (FTP over SSL/TLS), while the client can in addition connect to SFTP servers.
Windows Azure supports deploying websites from remote computers using Web Deploy, FTP, GIT or TFS. Many
development tools provide integrated support for publication using one or more of these methods and may only
require that you provide the necessary credentials, site URL and hostname or URL for your chosen deployment
method.
Credentials and deployment URLs for all enabled deployment methods are stored in the website's publish profile, a
file which can be downloaded in the Windows Azure (Preview) Management Portal from the Quick Start page or
the quick glance section of the Dashboard page.
If you prefer to deploy your website with a separate client application, high quality open source GIT and FTP clients
are available for download on the Internet for this purpose.

Prerequisites:
• Microsoft Azure - Management Portal
• Azure - Create Web App
• Azure - Deployment credentials
• Azure - Deployment center
• FileZilla - FileZilla Pro

Procedure:

Step 1: Log in to the Azure management portal.

Step 2: Select App Services and click on the Add button.

Step 3: Select the Web App service to create a web page.

Step 4: To create the web app, click the Create button.
Step 5: In the web app:
• Write the app name
• Subscription = pay-as-you-go
• Resource group = create new
• Operating system = Windows
• Publish = code
Step 6: The main parts of the web app are the App Service plan and the location; to select the App Service plan, click on Create new.

Step 7: The benefits and costs of the App Service plan vary between different tiers of web apps, which makes this even more confusing. So before you start loading up your App Service plans with multiple apps, make sure you are aware of how your chosen plan implements this. At the present time there are three tiers in Azure Web Apps:
• Dev/Test
• Production
• Isolated

Step 8: In the App Service plan, set the plan name, location, and pricing tier. We set location = South India and pricing tier = Free (Dev/Test).

Step 9: Having set the App Service plan and location, click on the Create button to create the web app.
Step 10: After creating the web app, go to our web app and click Quickstart.

Step 11: After that, click on Deployment credentials to set the username and password.

Step 12: In the deployment credentials, set the username and password, then click OK.

Step 13: Then, back in the Azure portal, in our web app service click on Deployment Center.

Step 14: In the Deployment Center, choose FTP to connect the web page to the FileZilla server.

Step 15: In the FTP section, again set the user credentials (username and password) used to connect FileZilla.

Step 16: After that, download the FileZilla app from the browser and install it. To connect to the Azure web app, click on the File menu and select Site Manager.

Step 17: In the Site Manager, click on New Site and select FTP as the protocol. Fill in the host name provided by the Azure web app service, set encryption to "Only use plain FTP (insecure)", and enter the username and password. Then click Connect.

Step 18: Then click OK to connect.

Step 19: In FileZilla, to upload the web page code, go to the local site pane and select the file.

Step 20: Then select the file, right-click on it, and click Upload.

Step 21: After uploading the new file, the old file of the same name is replaced, so the new web page is shown.

Step 22: Then go to our Azure portal, go to our web service and copy the URL provided by the web app.

Step 23: Then go to the browser, paste the URL and press Enter; the web page will be displayed in the browser.

CHAPTER 6
MATHEMATICAL ALGORITHM

Step 1: We create a cloud gaming website using the HTML language.
Step 2: We create a Windows Server 2012 VM and an App Service.
Step 3: We download FileZilla and install it on the system.
Step 4: We deploy the created website using FileZilla.
Step 5: In FileZilla there are several options, such as host name, user name, password and port number.
Step 6: Fill in all the options, which are given in the Azure portal under the App Service.
Step 7: Upload the folder where all the gaming-related HTML pages are stored.
Step 8: Connect to the web server via the given URL and web port.
Step 9: Open the given URL in the web browser.
Step 10: The user logs in to the URL for playing the game; after the login the user chooses a game as per choice.
Step 11: On the MS Azure portal, user requests are monitored and managed using the load balancer.
Fig 6.1: Internal flow of commands

In the above module, when the user gives a command over the internet or cloud, the command interpreter sends the command to the game server, which lies in the cloud. The game server responds to the user's command and displays the games which are available on the server; the user then chooses the game he or she wants to play. Once the user selects a game, the game server sends the screen to the user's device: the video is captured, the video encoder sends the frames to the user's device, and they are decoded in the web portal at the user level.

CHAPTER 7
TEST SPECIFICATIONS

The general aim of testing is to affirm the quality of software systems by systematically exercising the software in carefully controlled circumstances. Code-based testing (also known as white-box testing or structural testing) refers to the use of source code for planning the test cases; mostly, developers perform code-based testing. Specification-based testing (or black-box testing) is a testing method in which the tester does not know anything about the internal structure, design or implementation. Black-box test cases are derived from the design documents, and black-box testing mainly tests the functionality of the software.
Test case ID: 1
Test case name: Login to the Azure portal
Test case process: Run web app
Test steps: Browse the URL; select the game; press the playing option
Status: Game will load
Final result: Press the play option

Test case ID: 2
Test case name: Login to the app service
Test case process: Restart the service
Test steps: Number of requests sent by users; monitoring the incoming requests on the website; HTTP server error occurs
Status: Reduced the error by load balancer
Final result: Successfully handled the incoming request load

Table 7.1: Test specification

CHAPTER 8
RESULT AND ANALYSIS

• The cloud gaming URL opens in the web browser.
• After opening the website, the user chooses a game as per his/her choice.
• After choosing the game, the game loads on the website.
• Now the game is running on the website and the user enjoys the game.
• We analyze the HTTP requests as users access the site.
• On the MS Azure portal we monitor the user requests and manage them using the load balancer.

SNAPSHOTS FOR USER ACCESSING THE GAME:
CHAPTER 9
CONCLUSION AND FUTURE SCOPE

In this article, we grouped the existing gaming models in a new way. We studied the new technique and tried to reduce trouble while playing games. We described the design model of a gaming API for a better gaming experience with a mash-up of cloud resources. We gave a brief history of cloud gaming services, followed by the design decisions made by representative commercial cloud gaming services. Without these optimizations, a service provider cannot consolidate enough cloud gaming users onto each physical machine. This in turn leads to much lower profits, and may drive the service provider out of business. In summary, the advances of technology turn playable cloud gaming services into reality; more optimization techniques gradually make cloud gaming services profitable; hence, we believe that we are on the edge of a new era of a whole new cloud gaming ecosystem, which will eventually lead to the next generation of cloud gaming services.

FUTURE SCOPE
With the growth in cloud technology, the game is now available via a cloud host or app, which has the benefit that the game has crossed international borders without any import or export taxes, tariffs, or shipping fees. Online gaming systems, which mix various multimedia such as image, video, audio, and graphics to enable players to interact with each other over the Internet, are now widely used not just for entertainment, but also for socializing, business, commerce, scientific experimentation, and many other practical purposes. Gaming is now a multi-billion-dollar industry all over the world.

We are trying to work on implementing further modules in cloud gaming, such as streaming Direct3D games, support for playing different GUI games, scheduling handling, and the processing performance of the server.
BetOnSoft
Online Gaming Firm Implements Real-Time Analytics and Scales for Planned Growth
BetOnSoft develops and manages more than 110 online casino games, played every day by thousands of players
worldwide. The company needed to ensure that its games are highly available, because players are online around
the clock. BetOnSoft also wanted to prepare for business growth by scaling its database while maintaining
application responsiveness. In addition, its applications must perform key business-critical analytics in real time. In
November 2011, the company deployed a hybrid application solution that takes advantage of the high-availability
features in Microsoft SQL Server 2012 AlwaysOn and the scalability of SQL Azure. Now, the company's infrastructure
can exceed 10 times its previous peak loads while running intensive real-time data analytics. BetOnSoft has achieved
the availability it needs and can use its hybrid infrastructure to scale to meet unexpected business growth.

"SQL Server 2012 Enterprise with Always On gives us exactly the performance we need. We can exceed 10 times the previous
peak game-play load.... and still run intensive analytics in real time."

Thomas Pullen, Database Administrator, BetOnSoft

Business Needs
BetOnSoft, an international gaming software provider with presence in 11 countries, is a fast-growing developer of
popular online casino games used by players around the world. The company provides gaming software and
hardware infrastructure to independent operators that market and brand the games. Over the past two years, the
company has launched new operators into the marketplace and acquired existing operators from other software
providers through its superior platform and products.

Currently, BetOnSoft offers more than 110 single-player online games, including slot machines, roulette, blackjack,
and craps. These games can be played on computers or mobile devices.

As a growing player in the e-gaming software market, BetOnSoft needs to maintain high availability for its mission-critical gaming applications in order to achieve business success. Their operators market to an international player base, so there are always players online, 24 hours a day, seven days a week.

Availability was sometimes challenging because when BetOnSoft database administrators would run intensive
maintenance operations such as checking the database for corruption, application timeouts would often occur.
Additionally, when the company would deploy new software, administrators sometimes had to take the application
server offline.

To be competitive, BetOnSoft must be able to be agile and innovative in its technology approach, so it can handle rapid growth in the number of users playing its games. In fact, as the number of operators using BetOnSoft services increases, it is likely that aggressive marketing on any given day would create sudden high demand. To handle such scenarios, BetOnSoft needs the ability to rapidly scale up or down. "We have more than doubled the number of operators in the last 12 months," says Thomas Pullen, Database Administrator at BetOnSoft. "And our expectations are that we will continue to grow. We needed to make sure that our database software and servers had the capacity to scale rapidly."

BetOnSoft also sought to out-innovate its competitors by implementing rich functionality for operators and players
alike, much of which depends on complex data analysis to produce results in real-time.

To increase availability, scalability, and performance for its multi-terabyte database, in early 2011 BetOnSoft decided
to implement a new technology solution.

Solution
BetOnSoft began deploying a new solution in July 2011, when it upgraded its database servers and database
software. At that time, the company implemented Microsoft SQL Server 2008 R2 x64 Enterprise data management
software on two Dell PowerEdge R810 server computers with four 8-core processors and 256 gigabytes of RAM.

Each server contains a 640-gigabyte and a 1.2-terabyte memory card made by Fusion-io, a storage-memory company
based in Salt Lake City, Utah. Fusion-io memory cards can improve processing capabilities in a data center by
relocating active data from centralized storage to the server where it is being processed. This can help reduce
latency while also increasing data-center efficiency.

In late 2011, BetOnSoft decided to upgrade further to Microsoft SQL Server 2012 Enterprise. “We had been very
happy with SQL Server 2008 R2 overall,” says Pullen. “But we saw features in SQL Server 2012 that we knew would
help us with availability, scalability, and performance.”

One of those features is SQL Server 2012 AlwaysOn, a new high-availability and disaster-recovery solution through
which customers can query data in replica databases and conduct backup operations from those replicas. AlwaysOn
includes availability groups that support a failover environment for a set of user databases that fail over collectively.
This feature also includes the AlwaysOn availability group listener, which contributes to easier application server
configuration and redundancy.

Additionally, AlwaysOn provides readable database mirror capabilities. The replica databases provide read-only access for use
in reporting and backup, which serves to offload some of the primary server’s workload.
“SQL Server 2012 AlwaysOn was the key driver for us,” Pullen says. “Between the availability groups, the readable mirror for
offloading reporting and database checking, and the listener, we knew we would increase our availability with SQL Server
2012.”

"Typical data architectures for e-commerce applications involve a high-throughput online transaction processing (OLTP) database from which data is fed into a downstream data warehouse," says Devan Govender, Chief Software Architect, BetOnSoft. "Data analysis is then usually run on the warehouse, which can be several seconds or even minutes behind. Even small delays are not tolerable in the market-leading gaming products we are building."

BetOnSoft architected its applications around SQL Server 2012 AlwaysOn, Fusion-io storage, and strategic hardware and network configurations to take advantage of the benefits provided by this platform and achieve its product and performance goals. Govender says, "SQL Server 2012 AlwaysOn is a key part of our solution to achieve real-time results."

Testing the Solution in the Lab

Prior to implementing SQL Server 2012, BetOnSoft held two series of testing engagements in Oxford, UK in April and May 2011.
Then, in September 2011, BetOnSoft worked with Microsoft to test SQL Server 2012 in a production environment at a
laboratory session at Microsoft headquarters in Redmond, Washington.

The goal of the Redmond lab was to validate that the technology could support at least 10 times the current gameplay workload at BetOnSoft, while still being able to perform intensive data analytics in real time. "We wanted to make sure that the technology could
give us an extra level of availability without any performance penalty to the players,” says Pullen. “And, ultimately, we wanted
to make sure that the solution supported future scale-up throughput requirements that fell within our acceptable application
response times.”

During the lab, BetOnSoft installed SQL Server 2012 instances on each server, activated the AlwaysOn features and set up
availability groups and synchronous secondary instances while activating reporting from a readable database mirror. It also
conducted failover testing. “We really wanted to run a stress test on the availability groups,” states Pullen. “We were driving
SQL Server 2012 to exceed 10 times our peak production load.”

BetOnSoft went live with the new solution in November 2011.

Creating a Hybrid Cloud Solution

BetOnSoft is also running several critical services on Microsoft SQL Azure, a cloud-based data-storage environment
that provides high availability by storing multiple database copies and providing fast provisioning.

For example, BetOnSoft maintains its error-reporting service in the Windows Azure cloud with data stored in SQL
Azure databases. This service monitors, by geographic region, the number of players worldwide that are
experiencing problems launching or playing games on their computers. “For some services, such as error reporting, it
makes sense to manage that outside the data center,” says Govender. “For example, there could be issues with the
data center that make it inaccessible for error reporting.”

BetOnSoft also runs certain marketing applications on Windows Azure, where demand can spike as a result of
campaigns run by marketing partners. “It was a no-brainer for us to run services that have unpredictable demand in
the cloud,” says Govender, “We scale up to meet demand and back down when demand subsides.”
Another Windows Azure service is used to collect statistics on the quality of connections to the company’s games.
Statistics are collected for download rates, latency, and number of connection errors.

BetOnSoft also has a Windows Azure monitoring service that collects data on transaction rates, the number of games played,
and other activity metrics in a SQL Azure database. It constantly analyses these metrics to detect and send alerts about any
anomalies that require attention.

Benefits
With the new SQL Server 2012 solution, BetOnSoft can process more than 10 times its previous peak workload while
running real-time data analysis. The solution also increases availability and gives BetOnSoft the capacity to scale for
growth. Additionally, the company has easier IT administration and can provide better service to its operators.

Processes 10 Times Previous Workload While Running Real-Time Analysis

During lab testing, BetOnSoft was able to exceed its target of 10 times the current production workload. "SQL Server 2012
Enterprise with AlwaysOn gives us exactly the performance we need,” says Pullen. “We can sustain more than 10 times the
current peak game-play load and still run intensive analytics in real-time.”

This performance is aided by the Fusion-io memory card, which contributes to low database latency because it does
not rely on SAN storage. “Using local attached storage helps BetOnSoft get the throughput it needs with SQL Server
2012,” says Josh Miner, Director of Product Marketing, Fusion-io. “With reduced latency, the server computers get
data faster and can process that data hundreds of times per millisecond. That contributes to faster and more
consistent response times for BetOnSoft game players.”

Increases Availability for Mission-Critical Applications

SQL Server 2012 AlwaysOn gives BetOnSoft the enterprise-level robustness it needs to ensure high availability for the
company’s mission-critical online gaming applications. “Before we upgraded our servers and implemented SQL Server 2012, I
could not regularly check the database. Whenever I did, it would cause application timeouts,” says Pullen. “Now, with the high
availability we get from SQL Server 2012 AlwaysOn, I can check the database every week, and I can be confident that the
database is corruption free.” BetOnSoft also checked automatic failover time during testing. “We were prepared to accept a
time of two minutes, and it only took 14 seconds, which was a huge win for us,” says Pullen.

Taking advantage of AlwaysOn availability groups, BetOnSoft can also deploy new game features faster than before. “With the
availability group listener, for example, multiple application servers can be configured identically, no matter where the
database is running,” says Pullen. “That further increases availability and helps us avoid downtime when we deploy new
software.”
The company also ensures high availability from its Windows Azure monitoring services. “We’re using SQL Azure for our core
monitoring services, and it helps us ensure the highest availability for our critical services,” says Govender.

SQL Azure also enhances security for those services. “The firewall and security configurations in SQL Azure are great,” says
Govender. “Our cloud-based services are now as secure as our data-center-based services.”

Provides Hybrid IT Structure to Accommodate Scalability

When BetOnSoft tested its new solution prior to going live, it confirmed that SQL Server 2012 could sustain the level of
throughput needed to meet future business growth. “We now have the capacity to add a lot more players and operators while
not losing any application responsiveness,” says Pullen. “If our customer base grows by 10 times, we know we’ll still have great
performance with SQL Server 2012 AlwaysOn.”

And with Windows Azure, BetOnSoft has an added layer that it can use to scale to handle unexpected high demand
for its services. With that capability, BetOnSoft can better compete in the online gaming marketplace. “In our
business, responsiveness and scalability are very important, because we need to retain the same, fast application
performance while more players are playing the games,” Pullen says. “We want to grow our business, and SQL
Server 2012 positions us to do that.”

Simplifies Administration

SQL Server 2012 features like AlwaysOn availability groups and the availability groups listener, which support easier
server configuration and failover management capabilities, will help simplify administration for BetOnSoft database
administrators. “Using the readable mirror in SQL Server 2012, I can check the database frequently and easily, as
well as offload reporting,” says Pullen. “That really reduces time and effort for me, making my job easier from a
management perspective.”

Enhances Service to Operators

In addition, SQL Server 2012 helped BetOnSoft enhance the services it provides to the operators that run the company’s games.
For instance, BetOnSoft is now able to provide fraud detection, VIP identification, and marketing campaign analysis services to
operators in real time. “Having the SQL Server 2012 readable database mirror functionality makes this easier for us to do,
because we can provide access to reporting data without compromising the primary server,” says Pullen.

With its SQL Azure–based services, the company can use a leaner infrastructure overall and is also gaining valuable
metrics that can be used to improve the user experience. “It’s very valuable for us to see that someone in the United
Kingdom is having a great download experience, while a player in India is having a bad one,” says Govender. “It gives
us a complete and detailed view of the global player experience.”
The company also can better detect issues and anomalies. “We use our SQL Azure–based monitoring service to see where the
problems are and if there are certain trends,” says Govender. “We use these metrics to enhance our services to improve the
overall user experience.”

Ultimately, SQL Server 2012 fulfilled all of the company’s requirements during testing, which confirmed that it was the right
technology to align with BetOnSoft business goals. “We would not have gone live with SQL Server 2012 if we hadn’t had that
success in the testing phase,” says Pullen. “Those results showed us that we were implementing the right technology to meet
our business growth and maintain the high availability and strong performance we need to be competitive in online gaming.”
