PROJECT REPORT
ON
“Cloud Gaming on Azure Using FileZilla”
Submitted By
Roshan Markande
Shubham Tankhiwale
Harish
Kamini Nimje
CERTIFICATE
This is to certify that the report entitled “Cloud Gaming on Azure Using FileZilla” has been submitted by:
Roshan Markande
Shubham Tankhiwale
Harish
Kamini Nimje
ACKNOWLEDGEMENT
The completion of this mini project, “Cloud Gaming on Azure Using FileZilla”, has given us immense
pleasure and knowledge.
We have no words to express our sincere thanks for the valuable guidance extended
to us by our project guide in the completion of this project. Without it, the task would
have been very difficult.
We would like to give our sincere thanks to Prof. Dilip Saini, Mentor of the Post
Graduate Diploma in Cloud Technology Department, for necessary help and for
providing us with the required facilities. We would also like to express our gratitude to
Mr. Swadeep Gajbhiye, our Project Guide and a teacher who inspires us a lot.
We would also like to thank all the faculty members and staff who have directly
or indirectly helped us in completing the project.
Project Team:
Roshan Markande
Shubham Tankhiwale
Harish
Kamini Nimje
DECLARATION
We hereby declare that the project entitled “Cloud Gaming on Azure Using FileZilla” is our
original work and that the project has not formed the basis for the award of any degree or diploma.
Submitted By
Roshan Markande
Shubham Tankhiwale
Harish
Kamini Nimje
CONTENTS
Title Page
Certificate
Acknowledgement
Declaration
Contents
List of Figures
Abstract
Chapter 01: Synopsis
Chapter 02: Introduction
2.1 Overview
2.2 FileZilla (FTP/S)
2.3 Uses of FTP/S
2.4 Cloud Computing
2.5 Microsoft Azure
ABSTRACT
Everyone wants to play their favorite games on whatever device they have, but in the past this was not
possible for everyone, because the configuration and graphics requirements of a game were often not
supported by the device. To play a game, users first had to install it on a mobile, laptop, or PC that met
its configuration requirements. Today, technology like Microsoft Azure helps to publish and play games
without demanding particular graphics hardware or device configurations. Microsoft Azure provides
smart resources for gaming over the internet. We study the problem of optimally adapting an ongoing
cloud gaming session to maximize the gamer experience in dynamic and challenging environments. The
problem considered is quite challenging because resource allocation requires significant optimization.
Running Microsoft Azure cloud services over the live internet for online gaming across multiple devices
is very useful for all users. We aim to enhance the cloud gaming platform to provide better-quality
resources so that users' devices can play high-quality games with minimum resource requirements and
low cost.
Keywords:
Cloud computing, cloud gaming, video coding, computer graphics, remote gaming,
resource scheduling.
CHAPTER 1
INTRODUCTION
Everyone wants to play their favorite games on whatever device they have. But in
the past it was not possible for everyone to play games on their own device,
because the configuration and graphics requirements of a game were often not
supported by the device, which caused trouble. To play a game, users first had to
install it on a mobile, laptop, or PC that met its configuration requirements.
Today, technologies like virtualization and the cloud make it possible to play
games without demanding particular graphics hardware or device configurations.
Cloud gaming technology provides smart resources for gaming over the internet.
Cloud Gaming:
Cloud gaming, sometimes called gaming on demand, is a type of online gaming. Currently there
are two main types of cloud gaming: cloud gaming based on video streaming and cloud gaming based on
file streaming. Cloud gaming aims to provide end users frictionless and direct playability of games across
various devices.
Azure provides you with choice and flexibility to build your game backend on the cloud. For compute, you can
use IaaS offerings like Virtual Machines and VM Scale Sets on Windows and Linux, or leverage PaaS offerings like
Service Fabric and App Service. For data storage, you can use managed database services like Azure SQL
Database and Azure DocumentDB, as well as MongoDB and other options from the Azure Marketplace.
Keep players engaged
Enable multiplayer scenarios and leaderboards with Azure Active Directory. For instance, manage
social identity providers such as Facebook, Google, and Microsoft. And manage player retention
and increase user engagement and monetization across platforms using Azure Notification
Hubs and Azure Media Services.
Build a powerful end-to-end game analytics platform on Azure using tools from the Cortana Intelligence
Suite and big data solutions. Analyze mobile gamers’ behavior using services like Azure Machine
Learning and Azure Mobile Engagement to maximize app usage, user retention, and monetization.
Compute: Gaming services rely on a robust, reliable, and scalable compute platform. Azure customers can
choose from a range of compute- and memory-optimized Linux and Windows VMs to run their workloads,
services, and servers, including auto-scaling, microservices, and functions for modern, cloud-native games.
Data: The cloud is changing the way applications are designed, including how data is processed and stored.
Azure provides high availability, global data, and analytics solutions based on both relational databases as
well as big data solutions.
Networking: Azure operates one of the largest dedicated long-haul network infrastructures worldwide, with
over 70,000 miles of fiber and sub-sea cable, and more than 130 edge sites. Azure offers customizable
networking options to allow for fast, scalable, and secure network connectivity between customer premises
and global Azure regions.
Scalability: Azure offers nearly unlimited scalability. Given the cyclical usage patterns of many games, using
Azure enables organizations to rapidly increase and/or decrease the number of cores needed, while only
having to pay for the resources that are used.
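The pay-for-what-you-use point can be made concrete with a rough back-of-the-envelope comparison. The sketch below is purely illustrative: the core counts, the cyclical daily load, and the hourly rate are made-up numbers, not Azure pricing.

```python
# Illustrative only: compare fixed provisioning against autoscaling for a
# cyclical daily game load. All numbers here are invented for the example.

RATE_PER_CORE_HOUR = 0.10  # hypothetical price per core-hour

# Cores needed in each hour of a day: quiet at night, peak in the evening.
hourly_demand = [8] * 8 + [16] * 8 + [64] * 8

# Fixed provisioning must cover the peak for all 24 hours.
fixed_cost = max(hourly_demand) * 24 * RATE_PER_CORE_HOUR

# Autoscaling pays only for the cores actually used in each hour.
autoscaled_cost = sum(hourly_demand) * RATE_PER_CORE_HOUR

print(f"fixed: {fixed_cost:.2f}/day, autoscaled: {autoscaled_cost:.2f}/day")
```

With this hypothetical load, autoscaling costs less than half of peak-sized fixed provisioning, which is exactly the saving the cyclical usage pattern of games makes possible.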
Security: Azure offers a wide array of security tools and capabilities, to enable customers to secure their
platform, maintain privacy and controls, meet compliance requirements (including GDPR), and ensure
transparency.
Global presence: Azure has more regions globally than any other cloud provider, offering the scale needed
to bring games and data closer to users around the world, preserving data residency, and providing
comprehensive compliance and resiliency options for customers. Using Azure’s footprint, the cost, the time,
and the complexity of operating a game at global scale can be reduced.
Open: With Azure, you can use the software you choose, whether it be operating systems, engines, database
solutions, or open source, and run it on Azure.
We’re also excited to bring FileZilla into the Azure family. Together, Azure and FileZilla are a powerful
combination for game developers. Azure brings reliability, global scale, and enterprise-level security, while
FileZilla provides Game Stack with managed game services, real-time analytics, and comprehensive LiveOps
capabilities.
We look forward to meeting many of you at GDC 2019 to learn about your ideas in gaming, discussing where
cloud and cloud-native technologies can enable your vision, and sharing more details on Azure for gaming.
Join us at the conference or contact our gaming industry team at azuregaming@microsoft.com.
Talks at GDC:
Thursday, March 21, 2019 at 11:30 AM: Best Practices for Building Resilient, Scalable, Game Services
in Microsoft Azure
Thursday, March 21, 2019 at 12:45 PM: Save Time for Creativity: Unlocking the Potential for Your
Game's Data with Microsoft Azure
Azure Gaming Reference Architectures: Landing Page
Multiplayer/Game Servers
Analytics
Leaderboards
Cognitive Services
GDC Booth demos for Azure:
AI Training with Containers – Use Azure and Kubernetes to power Unity ML Agents
Build NoSQL Data Platforms – Azure Cosmos DB: a globally distributed, massively scalable NoSQL
database service
Cross Realms with SQL – Build powerful databases with Azure SQL
Get Started with Azure for Gaming
Modern games require more powerful development tools, global and flexible multiplayer support,
and new revenue models. But you’re here to build worlds, not back ends. Let Azure manage your
platform so you can focus on making games that make headlines.
Some advantages of building your own game services and backend from scratch include:
Finer control over the backend services and data that are running your game.
Creation of a custom solution or features to run your game that existing services do not
provide.
Optimizing costs by paying for only what you use each month.
Getting Started
o Create your Azure free account
o General Guidelines
Reference Architectures
o Multiplayer / Game Servers
o Analytics
o Leaderboards
o Cognitive Services
Reference Architectures
The reference architectures will help start you on the path of building gaming services for your
game. Each architecture is composed of:
A highly abstract diagram showing the different pieces and how they interact with each
other.
A deployment template and installation steps, to help you get started quickly with the
implementation of the architecture.
A list of considerations to give you a sense of the scope of requirements covered in the
architecture.
In most cases, a sample project, so you can quickly test the deployed infrastructure in your
own Azure account.
A high-level step by step guide and implementation details to help you understand the
sample.
1. Level of management - From handling everything yourself to letting the platform take
care of everything
2. Operating system - Windows or Linux
3. Where the game session runs - Dedicated server or peer-to-peer (P2P)
4. Hardware - What the game session needs in order to run
5. Processing time - Real-time or non-real-time (NRT)
6. Latency - Whether players with lag will be at a disadvantage
7. Persistence - Whether the game world continues to exist and develop internally even when there are
no players interacting with it, or each session has its own beginning and end
8. Number of concurrent players - Small, medium, or large
9. Reconnection allowed - If a player or players get disconnected, whether they can rejoin the
game or must start a new game session
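The nine questions above can be treated as a checklist that a team fills in before choosing a backend. The sketch below captures them as a simple structure; the class and field names are our own shorthand for illustration, not part of any Azure API.

```python
from dataclasses import dataclass

# Illustrative checklist mirroring the nine multiplayer design questions.
# Names are our own invention, not an Azure API.
@dataclass
class MultiplayerRequirements:
    management: str         # 1: "self-managed" through "fully managed"
    os: str                 # 2: "windows" or "linux"
    session_host: str       # 3: "dedicated" or "p2p"
    hardware: str           # 4: e.g. "general-purpose", "gpu"
    real_time: bool         # 5: real-time vs non-real-time (NRT)
    latency_sensitive: bool # 6: does lag put players at a disadvantage?
    persistent_world: bool  # 7: persistent world vs per-session
    concurrency: str        # 8: "small", "medium", or "large"
    reconnect_allowed: bool # 9: can disconnected players rejoin?

# Filled in for a Minecraft-style dedicated server, as in the example below.
minecraft_like = MultiplayerRequirements(
    management="self-managed", os="linux", session_host="dedicated",
    hardware="general-purpose", real_time=True, latency_sensitive=True,
    persistent_world=True, concurrency="small", reconnect_allowed=True)
print(minecraft_like.session_host)  # prints "dedicated"
```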
Following are some multiplayer backend use cases for you to explore:
Multiplayer matchmaker
Synchronous multiplayer
Asynchronous multiplayer
This reference architecture details the steps to set up a basic Azure backend that will host a game server on
either Windows or Linux, using Minecraft server as an example.
Architecture diagram:
Azure regions/datacenters:
Azure has more global regions than any other cloud provider, offering the scale needed to bring
applications closer to users around the world, preserving data residency, and offering
comprehensive compliance and resiliency options for customers.
Regions
A region is a set of datacenters deployed within a latency-defined perimeter and connected
through a dedicated regional low-latency network.
With more global regions than any other cloud provider, Azure gives customers the flexibility to
deploy applications where they need to. Azure is generally available in 44 regions around the
world, with plans announced for 10 additional regions.
Geographies
A geography is a discrete market, typically containing two or more regions, that preserves data
residency and compliance boundaries.
Geographies allow customers with specific data-residency and compliance needs to keep their data
and applications close. Geographies are fault-tolerant to withstand complete region failure through
their connection to our dedicated high-capacity networking infrastructure.
Availability Zones
Availability Zones are physically separate locations within an Azure region. Each Availability Zone is
made up of one or more datacenters equipped with independent power, cooling, and networking.
Availability Zones allow customers to run mission-critical applications with high availability and low-
latency replication.
This solution provides an overview of common components and design patterns used to host game
infrastructure on cloud platforms.
Video games have evolved over the last several decades into a thriving entertainment business. With the
broadband Internet becoming widespread, one of the key factors in the growth of games has been online
play.
Online play comes in several forms, such as session-based multiplayer matches, massively multiplayer virtual
worlds, and intertwined single-player experiences.
In the past, games using a client-server model required the purchase and maintenance of dedicated
on-premises or co-located servers to run the online infrastructure, something only large studios and
publishers could afford. In addition, extensive projections and capacity planning were required to meet
customer demand without overspending on fixed hardware. With today's cloud-based compute resources,
game developers and publishers of any size can request and receive resources on demand, helping to
avoid costly up-front monetary outlays and the dangers of over- or under-provisioning hardware.
Extensibility:
GamingAnywhere adopts a modularized design. For example, platform-dependent components such as audio
and video capturing, and platform-independent components such as codecs and network protocols, can be
easily modified or replaced.
Portability:
Gamers may use devices with heterogeneous architectures to access GamingAnywhere, and thus it is
important to port GamingAnywhere to as many platforms as possible. New platforms can be easily supported
by replacing the platform-dependent components in GamingAnywhere.
Configurability:
System researchers may conduct experiments with diverse system parameters. A large number of built-in
audio and video codecs are supported by GamingAnywhere.
In addition, GamingAnywhere exposes all available configurations to gamers, so it is possible to try out
the best combinations of parameters for various usage scenarios simply by editing a text-based configuration file.
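To sketch what such a text-based configuration might look like, the snippet below parses a hypothetical file with Python's standard configparser. The section and key names are invented for illustration and do not reflect GamingAnywhere's actual file format.

```python
import configparser

# Hypothetical configuration text; the sections and keys are invented
# for illustration, not GamingAnywhere's real format.
CONFIG_TEXT = """
[video]
codec = h264
bitrate = 3000

[audio]
codec = opus
samplerate = 48000
"""

config = configparser.ConfigParser()
config.read_string(CONFIG_TEXT)

# A gamer could try out parameter combinations simply by editing the file.
print(config["video"]["codec"], config.getint("video", "bitrate"))
```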
CHAPTER 2
PROBLEM DEFINITION AND SCOPE
2. Game-Data Processing:
The massive number of players in the virtual world leads to massive amounts of
data: user interactions, uploaded screenshots and videos, social networking, etc.
Analyzing these data can help the system designers to understand player behavior
and to gain insight into system operation, thus allowing them to build better
games for the players. Cloud-based data processing could enable an elastic, and
thus efficient, platform for time-based and graph analytics.
3. Game-Content Generation:
Game content, from bits such as textures to abstract puzzles and even entire
game designs, is at the core of the entertainment value of games. Until the early
2000s, manual labor ensured that the quality and quantity of game content
matched the demands of the playing community, but this is not scalable due to
exponential growth in the number of users and in production costs. A cloud-based game
platform providing elastic, cost-effective, procedural generation of player-customized
content could lead to a vast improvement over the capabilities of
today's generation of games.
Front-end
Back-end
It is the responsibility of the back-end to provide the security of data for cloud users along with the traffic
control mechanism. The server also provides the middleware which helps to connect devices &
communicate with each other.
Businesses use cloud infrastructures to work with these applications. Unlike subscription-based pricing
models, the payment structure of the cloud enables the user to subscribe to vendor services, and cloud
infrastructures are paid for on a 'pay-per-use' basis.
The cloud technology architecture also consists of front-end platforms (as read in the early chapters) called
the cloud client which comprises servers, thin & fat client, tablets & mobile devices. The interaction is done
through middleware or via web-browser or virtual sessions. According to Jason Bloomberg of Zap Think, the
cloud-oriented architecture can essentially be the building block of IoT (Internet of Things) in which anything
can be connected to the internet. The cloud architecture is a combination of both service-oriented
architecture and event-driven architecture, so cloud architecture encompasses all elements of the cloud
environment.
Cloud Infrastructure:
Cloud technology also has a specific type of infrastructure that allows it to give so many advantages to its
users. Cloud computing as a whole is a combination of different hardware and software components that
together make cloud technology work so well.
It refers to the software along with the hardware components such as storage drive, hardware, servers,
virtual software, other cloud management software, and other networking devices; all work together to
support the computing requirement of the cloud computing model. Moreover, the cloud technology holds a
software abstraction layer that virtualizes the cloud resource & presents them to users locally.
Cloud Infrastructure Management Interface (CIMI) is an open standard API that is used to manage the cloud
infrastructure. It enables its users to handle all the cloud infrastructure easily by providing a means to
interact with the provider & their consumer or developer.
The hypervisor can be defined as firmware (a permanent set of instructions or code programmed into
read-only memory; a low-level program) that acts as a manager for virtual machines. It is also called a
Virtual Machine Monitor (VMM), which creates and runs virtual machines. It provides the guest OS with a
virtual operating platform and manages the execution of other applications. There are two types of
hypervisor: Type 1 (bare-metal) hypervisors, which run directly on the host hardware, and Type 2
(hosted) hypervisors, which run on top of a conventional operating system.
The limitations of cloud technology concerning infrastructure are:
Scalability
Intelligent Monitoring
Security
To address the three main requirements, we propose the architecture of a cloud-based platform depicted in
Figure 1. The architecture is based on three pillars: virtual-world management, game-data processing, and
game-content generation. Responding to Requirement 1, the virtual-world management pillar addresses
game hosting and the management of players in the virtual world. The game-data processing pillar
addresses Requirement 2: it analyzes time-based and graph data corresponding to players and their games.
Focusing on Requirement 3, the game-content generation pillar generates player-customized content at
massive scale. The virtual-world management pillar provides in-game data to the game-data processing
pillar and uses content produced by the game-content generation pillar. We describe the challenges and
opportunities of the virtual-world management pillar in Section 3, and focus on the other two pillars in the
remainder of this section. We focus on systems challenges; others, such as finding new ways to use data
and to generate content, fall outside the scope of this work.
Fig. 2.1 – A generic cloud-based platform for massivizing online games. The three pillars are virtual-world
management, game-data processing, and game-content generation.
2.2.3: CHALLENGES AND OPPORTUNITIES IN MICROSOFT CLOUD
MANAGEMENT
Overcoming the Challenges of Microsoft Azure:
This is an interesting time for organizations that are heavily invested in Microsoft
technologies.
If you haven’t yet moved to the cloud, you’re probably seriously considering
Azure. And if you have already moved to the Azure cloud, you may be grappling
with some very real management and resource challenges.
With help from leading industry analysts, we’ve compiled a wealth of data and
found some interesting insights into Azure. We took that information and built an
easy-to-read infographic, aptly named “Overcoming the Challenges of Microsoft
Azure.” You can check it out at the bottom of this post.
The first finding: you’re in good company. We found that 65 percent of IT decision
makers are seriously considering Azure as their public cloud platform. And most of
those considering Azure share concerns around three key areas: maintenance,
security and the IT skill set required to run Azure.
But the biggest challenges still stem from balancing a high volume of Azure
maintenance needs against finite IT resources. From architecture design to
patching and database administration, there are about 14 different administrative
areas that need frequent attention. That translates into budget dollars that aren’t
going into innovation or new technology investments. What’s more, if you’ve
been searching for qualified Azure specialists, you know they’re increasingly hard
to find.
We hope you’ll check out the infographic below. You may find it especially helpful
if you’re building a business case for outsourcing Azure management, or if you
need to identify the key factors to weigh while building an in-house team (such as
the cost of Azure-related expertise).
This infographic is the first of a series of blog posts we’re developing that will dive
deeper into the four key challenges highlighted in the infographic:
We see that we have only spent 2% of our 50 GB of available disk space. I tried restarting the web server, but that
did not do the trick. I have seen some people mention a .temp file, which could possibly take some disk
space; however, I can't seem to locate such a file anywhere in the FTP directory. I am short on ideas, so any
help would be appreciated.
Productive
Reduce time to market by delivering features faster with more than 100 end-to-end services.
Go beyond connecting your datacenter to the cloud. Take advantage of the broadest set of hybrid
capabilities and deliver true hybrid consistency in your applications, data, identity, security and management
across on-premises and cloud environments.
Create a truly consistent experience across your hybrid cloud using comprehensive Azure cloud capabilities.
Reduce complexity and risk with the platform, tools and services designed to work together across your on-
premises and cloud environments. Build and deploy your applications consistently, seamlessly manage data,
enable anywhere access with single sign-on and deliver integrated security and management across on-
premises and the cloud.
Develop breakthrough apps with built-in intelligence
Take advantage of a comprehensive set of services, infrastructure and tools to build AI-powered
experiences. Build bots that naturally interact with users, and use built-in advanced analytics tools to make
faster predictions.
Use a rich set of Azure data and AI services such as Azure Databricks, Azure Cosmos DB, Azure Cognitive
Services and Azure Bot Service to enable new experiences in your apps for human-like intelligence.
Rely on managed service capabilities such as built-in monitoring, threat detection, automatic patching and
backups.
Help ensure you are compliant with industry-specific standards by taking advantage of the certifications
offered by the Microsoft Cloud—the platform with the most comprehensive compliance portfolio of any
cloud provider. Simplify your organization’s compliance with offerings that provide built-in controls,
configuration management tools and implementation and guidance resources. Use third-party audit reports
to verify that your cloud assets adhere to the strict security controls that industry and government standards
mandate.
ASOS, a leading online fashion retailer, transformed its platform from a monolithic, on-premises
e-commerce system to a microservices platform running on Microsoft Azure. In 2016, the new platform
handled more than double the volume of Black Friday orders from the previous year. The rapidly growing
company with 15.4 million customers has also accelerated development of innovative mobile apps and
features to quickly target new markets and stay on top of consumer and technology trends.
Onboarding developers in hours instead of weeks…
Daimler AG, one of the world’s largest manufacturers of premium cars and trucks, is driving hard to be a key
player in software. To speed up software development and thus innovation, Daimler uses Microsoft Azure
DevTest Labs. By developing in Azure, the company can onboard developers in hours versus weeks, get new
ideas underway faster, and attract top talent with a state-of-the-art development environment. Peter
Rothlaender, Manager of Cloud Solutions at Daimler, explains why Daimler is honking its horn about Azure
DevTest Labs.
Application
If you choose to install Kentico to Microsoft Azure, all files will be grouped into a solution based on Visual
Studio's Microsoft Azure template. The solution contains several projects. One of them is a web application,
which encompasses almost all the functions of Kentico and is designed to run as an Azure ASP.NET Web
role.
The Smart search worker is separated from the web application in another project because it cannot run
together with the application as the Web role. To index the content of websites correctly and effectively, the
Smart search worker runs as an Azure Worker role.
Because the application is divided into these two services, you also need to configure them separately.
See Configuring an Azure project.
On this page
Application
Database
File storage
Multiple web role instances
Storing session state
Alternative approaches
o Microsoft Azure Web Sites or Virtual Machines
Related pages
Requirements and limitations for running Kentico on Microsoft Azure
Microsoft Azure project development lifecycle
Working with physical files using the API
Database
Kentico on Microsoft Azure uses an Azure SQL relational database. This database engine is almost identical
to the standard SQL Server engine, with only a few limitations. These limitations are taken into account in
Kentico, and no additional configuration or customization is required. If you're interested in which SQL
Server features are not available in Azure SQL, refer to SQL Server Feature Limitations (Azure SQL
Database) on MSDN.
File storage
Microsoft Azure does not offer a persistent file system similar to the file systems in Windows that you are
used to. Data stored within Azure cannot be hierarchically organized into folders. However, Kentico provides
an abstract layer called CMS.IO, which enables the system to operate on different types of file storages.
See Working with physical files using the API for more information.
The CMS.IO namespace acts as an intermediary between the Kentico business layer and various file
storages, including Azure blob storage. On a standard non-Azure installation, CMS.IO only overrides the
System.IO namespace. On Microsoft Azure, the namespace uses a provider which works with the blob
storage, creating an imitation of the regular Windows file system. The CMS.IO namespace can be extended
to support any other type of storage, e.g. the Amazon cloud drive.
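The idea of an abstraction layer like CMS.IO can be sketched in a few lines: application code depends on a common file-storage interface, and providers behind that interface are interchangeable. The classes below are our own illustration, not Kentico's actual CMS.IO API, and the "blob" provider is just an in-memory stand-in for a real Azure Blob Storage client.

```python
from abc import ABC, abstractmethod

# Illustrative sketch of a CMS.IO-style abstraction layer. These classes
# are our own invention, not Kentico's API; the blob provider is an
# in-memory stand-in for a real Azure Blob Storage client.
class FileStorage(ABC):
    @abstractmethod
    def write(self, path: str, data: bytes) -> None: ...
    @abstractmethod
    def read(self, path: str) -> bytes: ...

class InMemoryBlobStorage(FileStorage):
    """Flat key/value store: blob storage has no real folder hierarchy,
    so 'folders' exist only as prefixes in the blob name."""
    def __init__(self):
        self._blobs = {}
    def write(self, path: str, data: bytes) -> None:
        self._blobs[path] = data
    def read(self, path: str) -> bytes:
        return self._blobs[path]

# Application code depends only on FileStorage, so the provider can be
# swapped (local disk, Azure blob storage, Amazon cloud drive, ...).
storage: FileStorage = InMemoryBlobStorage()
storage.write("media/images/logo.png", b"\x89PNG")
print(storage.read("media/images/logo.png"))
```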
Additionally, you can make use of the Azure storage provider and store files in the cloud even if you're
running a non-Azure installation. You can find more information about this approach in Configuring Azure
storage.
The file storage is shared across multiple web role instances, therefore, no file synchronization is needed.
Storing session state
In Microsoft Azure SQL Database - easy to set up, suitable for small projects or projects with mostly read
access to web pages – see Storing session state information in Azure SQL database.
In Microsoft Azure Cache - more suitable for larger projects than the Azure SQL database option, but
requires configuration of Azure Cache – see Storing session state information in Azure Cache Service.
Alternative approaches
This section describes the setup and deployment of Kentico to an Azure Cloud Service, as this service is the
best option for most projects. However, there are other alternative environments, in which Kentico is
supported as well.
When you develop projects on Microsoft Azure, in typical cases you want to begin with a small size of the
project, which uses the least resources possible. Then, as your project grows, you configure your project to
utilize more resources to accommodate the performance and size requirements of the project. This topic
presents main levels of development on Microsoft Azure and provides links to related configuration tasks,
which you need to perform when ascending to a higher level.
Level 1 - Development
For the duration of project development, one instance of CMSApp web role is usually enough. To configure
a project to use one web role instance, perform the basic configuration tasks.
Level 3 - Performance
When the performance of the level 2 environment is not sufficient, you can configure the Azure Cache
Service to store session state information – see Storing session state information in Azure Cache Service.
Level 4 - Scalability
When you need even more power, you can further scale your project using the following approaches:
On this page
Level 0 - Local development
Level 1 - Development
Level 2 - Production with SLA
Level 3 - Performance
Level 4 - Scalability
Related pages
Developing Azure projects locally
Prerequisites:
To learn Windows Azure, you need to be familiar with the Windows environment and have a basic
knowledge of cloud computing. To publish a webpage in Azure, you need a basic knowledge of Visual
Studio and must know how to make a webpage in Visual Studio 2019.
Procedure:
Steps:
3) After that configure your new project by setting project name and location and hit create.
4) Select Web Application (Model-View-Controller) and hit create.
6) Now to run your website get IIS Express Certificate and Run IIS Express.
Your Webpage shown like this.
7) Now start to publish.
9) Create App Services for that login with your Azure Portal and hit create.
10) Now Publish your Web Site.
11) Go to your Visual Studio 2019 project open your project and make some changes now.
For that go to your “index.cshtml” page and change your code like this.
12. Go to Solution Explorer, right-click your project, and hit Publish.
Create, monitor, and manage FTP files by using Azure Logic Apps
With Azure Logic Apps and the FTP connector, you can create automated tasks and workflows that create,
monitor, send, and receive files through your account on an FTP server, along with other actions, for
example:
You can use triggers that get responses from your FTP server and make the output available to other actions.
You can use run actions in your logic apps for managing files on your FTP server. You can also have other
actions use the output from FTP actions. For example, if you regularly get files from your FTP server, you can
send email about those files and their content by using the Office 365 Outlook connector or Outlook.com
connector. If you're new to logic apps, review What is Azure Logic Apps?
Limits
FTP actions support only files that are 50 MB or smaller unless you use message chunking, which lets
you exceed this limit. Currently, FTP triggers don't support chunking.
The FTP connector supports only explicit FTP over SSL (FTPS) and isn't compatible with implicit FTPS.
Prerequisites
An Azure subscription. If you don't have an Azure subscription, sign up for a free Azure account.
The FTP connector requires that your FTP server is accessible from the internet and set up to operate
in passive mode. Your credentials let your logic app create a connection and access your FTP account.
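Since the connector accepts only explicit FTPS and a server operating in passive mode, the client-side equivalent of such a connection can be sketched with Python's standard ftplib. This is our own illustration, not part of Logic Apps, and the host name and credentials are placeholders.

```python
from ftplib import FTP_TLS

# Sketch of a connection that matches the connector's requirements:
# explicit FTPS (ftplib's FTP_TLS issues AUTH TLS during login) and
# passive mode. Host and credentials below are placeholders.
def connect_ftps(host: str, user: str, password: str) -> FTP_TLS:
    ftps = FTP_TLS(host)        # plain control connection on port 21
    ftps.login(user, password)  # upgrades to TLS (explicit FTPS), then logs in
    ftps.prot_p()               # encrypt the data channel as well
    ftps.set_pasv(True)         # passive mode, as the connector requires
    return ftps

# Example usage (needs a reachable FTPS server, so it is commented out):
# ftps = connect_ftps("ftp.example.com", "user", "password")
# ftps.retrlines("LIST")
# ftps.quit()
```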
Connect to FTP
Before your logic app can access any service, you must create a connection between your logic app and that
service. If you didn't previously create this connection, you're prompted for connection information when
you add a trigger or action for that service to your logic app. The Logic Apps Designer provides an easy way
for you to create this connection directly from your logic app.
1. Sign in to the Azure portal, and open your logic app in Logic App Designer, if not open already.
2. For blank logic apps, in the search box, enter "ftp" as your filter. Under the triggers list, select the
trigger you want. -or-
For existing logic apps, under the last step where you want to add an action, choose new step, and
then select Add an action. In the search box, enter "ftp" as your filter. Under the actions list, select
the action you want.
To add an action between steps, move your pointer over the arrow between steps. Choose the plus
sign (+) that appears, and then select add an action.
3. Provide the necessary details for your connection, and then choose Create.
4. Provide the necessary details for your selected trigger or action and continue building your logic app's
workflow.
When requesting file content, the trigger doesn't get files larger than 50 MB. To get files larger than 50 MB,
follow this pattern:
Use a trigger that returns file properties, such as when a file is added or modified (properties only).
Follow the trigger with an action that reads the complete file, such as Get file content using path, and
have the action use message chunking.
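The two-step pattern above can be illustrated locally. The sketch below is only an analogy in Python: the real chunking is performed by the Logic Apps connector itself, and the 4 MiB chunk size is an arbitrary choice for illustration, not the connector's actual value.

```python
import io

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB per chunk (illustrative value only)

def read_in_chunks(fileobj, chunk_size=CHUNK_SIZE):
    """Yield successive fixed-size chunks from a file-like object, the way a
    chunked transfer moves a large file piece by piece instead of all at once."""
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            return
        yield chunk

# Demo: a payload larger than one chunk still arrives complete when the
# pieces are reassembled on the receiving side.
payload = b"x" * (CHUNK_SIZE * 2 + 123)
received = b"".join(read_in_chunks(io.BytesIO(payload)))
assert received == payload
```

The same idea is why the trigger (properties only) stays small while the follow-up "Get file content" action does the heavy, chunked transfer.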
Examples
This trigger starts a logic app workflow when it detects that a file has been added or changed on an FTP
server. For example, you can add a condition that checks the file's content and decides whether to get
that content, based on whether that content meets a specified condition. Finally, you can add an action that
gets the file's content and puts that content in a folder on the SFTP server.
Enterprise example: You can use this trigger to monitor an FTP folder for new files that describe customer
orders. You can then use an FTP action such as Get file content, so you can get the order's contents for
further processing and store that order in an orders database.
A valid and functional logic app requires a trigger and at least one action. So make sure you add an action
after you add a trigger.
Here is an example that shows this trigger: When a file is added or modified
1. Sign in to the Azure portal, and open your logic app in Logic App Designer, if not open already.
2. For blank logic apps, in the search box, enter "ftp" as your filter. Under the triggers list, select this
trigger: When a file is added or modified - FTP
3. Provide the necessary details for your connection, and then choose Create.
4. By default, this connector transfers files in text format. To transfer files in binary format, for example,
where and when encoding is used, select Binary Transport.
5. Next to the Folder box, choose the folder icon so a list appears. To find the folder you want to monitor
for new or edited files, select the right angle arrow (>), browse to that folder, and then select the
folder.
Now that your logic app has a trigger, add the actions you want to run when your logic app finds a new or
edited file. For this example, you can add an FTP action that gets the new or updated content.
5. Save your logic app. To test your workflow, add a file to the FTP folder that your logic app now
monitors.
2.4 LITERATURE SURVEY
FileZilla®, the popular open source file transfer product targeted at administrators, developers,
engineers and power users, today announced FileZilla Pro native support for Azure Files and Azure
Blob Storage.
The FileZilla product roadmap is continuously expanding beyond FTP-like protocols to allow Web
designers, editors, and Webmasters to easily access even more cloud services.
Through new protocol support, users can now access and transfer their files in even more scenarios
through FileZilla's reliable, powerful user interface. The addition of Microsoft Azure Files and Azure
Blob Storage offers FileZilla users two ways to store their data in the cloud:
Azure Blobs provides client libraries and a REST interface that allows unstructured data to be stored
and accessed at a massive scale in block blobs.
Azure Files provides an SMB interface, client libraries, and a REST interface that allows access from
anywhere to stored files.
For in-depth description of the Azure storage protocols and the scenarios for when to use them,
visit https://docs.microsoft.com/en-us/azure/storage/common/storage-decide-blobs-files-disks
FileZilla Pro allows users to easily share files across all types of remote servers and computing
environments. In addition to Azure, protocols supported by FileZilla Pro include Amazon S3, S3-compliant
cloud services, and WebDAV, among others. Future releases will include support for OpenStack, Google
Cloud Storage, and more.
Availability
FileZilla Pro can be downloaded from https://filezillapro.com, with Mac version available from the Apple App
Store https://itunes.apple.com/app/filezilla-pro/id1298486723 other versions are available from the FileZilla
website https://filezilla-project.org/
Creating a Highly Available FTP Server on Microsoft Azure
About a year ago, at the announcement of the general availability of Infrastructure as a Service, I tested a
single-box FTP server using FileZilla Server. The main benefit of FileZilla is its easy installation and
configuration.
Single Box configuration:
The following posts will guide you in creating a single-box server using FileZilla:
http://digitalmindignition.wordpress.com/2012/11/28/azure-vm-role-with-filezilla-as-ftp-server
http://huberyy.wordpress.com/2012/08/03/set-up-a-ftp-server-in-windows-azure-virtual-machine-with-filezilla-no-coding
Azure virtual machines need maintenance from time to time, and you should always avoid a single point of
failure; reasons enough for a highly available configuration. This post does not consist of a step-by-step
guide showing you how to create a VNET, VMs, DFS, and so on.
You need some experience with the Azure platform and Windows Server; I will help you put the pieces
together for a highly available FTP server running on the Azure platform.
Highly Available Topology:
When creating the first VM, create an availability set. When creating the second VM, join the availability set
you just created.
Use the same cloud service name for the second VM as the one you defined at creation time of the first VM.
When creating a new VM, the first thing you should do is change the Windows Update and UAC settings.
Attach an empty data disk to both Virtual Machines and format it. (Will be used for DFS and FTP file storage)
Active Directory:
Install AD on both virtual machines. (http://azure.microsoft.com/en-us/documentation/articles/active-
directory-new-forest-virtual-machine)
Create an admin user in the domain for future usage. (Domain admin).
Create a DFS share and set up Shared IIS config (you can use a shared config when doing the initial setup,
when you go live you will need to disable it due to the port settings).
You will find all the information to do this on the following sites:
http://windowsazureguide.net/2013/09/29/how-to-create-highly-available-load-balanced-ftp-server-on-windows-azure (This post is not complete, but contains useful information)
http://www.iis.net/learn/manage/managing-your-configuration-settings/shared-configuration_264
http://blogs.msdn.com/b/wats/archive/2013/12/13/setting-up-a-passive-ftp-server-in-windows-azure-vm.aspx
FTP users and folders:
Should you have problems remembering how to configure user access in IIS, the following posts will guide you.
If you work with a domain instead of, for example, local users, you need to create a folder with the domain
name in IIS; don't forget this!
https://community.rackspace.com/products/f/25/t/491
http://technet.microsoft.com/en-us/library/jj134201.aspx
You open a specific range of passive FTP ports on the first VM, and another specific range of ports on the
second server. This way FTP traffic will always be routed to the right server.
To avoid a lot of manual work you can use PowerShell to open a range of ports:
Import-Module Azure
Add-AzureAccount
Select-AzureSubscription "your subscription"
$vm = Get-AzureVM -ServiceName "yourvmservicename" -Name "yourvm"
For ($i = 6051; $i -le 6100; $i++)
{
    $name = "FTP-Dynamic-" + $i
    Write-Host -Fore Green "Adding: $name"
    $vm = Add-AzureEndpoint -VM $vm -Name $name -Protocol "tcp" -PublicPort $i -LocalPort $i
}
# Update the VM to apply the new endpoints.
Write-Host -Fore Green "Updating VM..."
$vm | Update-AzureVM
Write-Host -Fore Green "Done."
Now you can specify the machine-specific range in IIS on each machine; secondly, you need to specify the
public IP of your cloud service in IIS. Note that deallocating both virtual machines will make you lose your
public IP. (Since the latest Azure announcements, it is possible to reserve your IP.)
Don’t forget to allow FTP through your Windows Firewall!
There is much related work in the literature about cloud gaming and its usefulness in many aspects,
not neglecting the important issue of users' data on the cloud.
2.5 Benefits:
Global Availability
One of the primary advantages of Microsoft Azure is that it is geography agnostic. In the modern business
world, most companies require that their data has global reach. Companies that rely on centralized
datacenters can sometimes find global expansion difficult. Features and adaptability of azure are not
contingent on location.
Last year, BP, the world's 12th largest company by revenue, chose to move its workloads out of central
datacenters into Microsoft Azure as part of its corporate modernization efforts. BP will leverage the Azure
platform-as-a-service (PaaS) offering for data visualization, workflows, and predictive analytics.
Security
One of the primary trepidations companies have when migrating to the cloud is security. On-premise
datacenters allow for direct control over security measures that comply with company standards. Security
remains one of the primary pillars of Azure. With the Azure Security Center, customers can maintain peace
of mind with access to 24/7 unified security management across hybrid cloud networks.
With this Azure feature, customers can gain visibility to security across both on premise and cloud networks.
Users can automatically deploy standardized security measures when onboarding new applications. Any
security threats can be immediately remediated either using built-in Azure security assessments or
proprietary assessments. Through VMs Azure controls leverage machine learning to protect against malware
and viruses and can detect and respond swiftly to attacks.
Azure positions itself as not only a secure cloud network but an added security layer for all company
datacenters.
Scalability
Scaling involves a system adapting to a change in the amount of workload or traffic in a given
application. This involves both increasing and decreasing application flows. One of the greatest
features of Azure is its flexibility to scale. Scaling cloud networks involves creating or assigning
servers to an application. With Azure, scaling is logistically simpler than with traditional hosting as
primary servers do not need to be taken down. Companies gain benefits of scaling in the cloud as
they don’t need to add physical infrastructure in order to scale.
Disaster Recovery
With cybersecurity and risk management remaining at the forefront of IT and business, disaster recovery is
key for cloud services. Azure prioritizes disaster recovery through unprecedented coverage in regional and
global fail-over options, hot and cold standby models and rolling reboot capabilities. This level of disaster
recovery can be difficult to achieve in on premise models but comes standard out of the box with Azure.
Cost Savings
Cost savings continues to be one of the main drivers behind migrating to cloud services. Azure markets up
to 72% savings with pay-as-you-go pricing. Companies can also choose to go the hybrid route and maintain
their on premise data centers while reaping the benefits of flexibility and backup security in the Azure cloud.
Read about our partnership with leading supermarket company, Raley’s, and the cost savings realized by
migrating data to Azure. Microsoft Azure provided Raley’s with many advantages including, security,
scalability, flexibility, reliability, support, cost savings and the ability of Azure to provide a first class user
experience.
Compliance
Compliance can fall by the wayside when companies think about data. They would rather focus on tangible
business outcomes. However, compliance must work in lockstep with business and system development to
mitigate risk exposure and costly mistakes down the line. Microsoft holds more certifications than any other
cloud provider, leading in the areas of security and privacy requirements, including GDPR.
With Azure, companies can focus on end-to-end solutions for development. Azure focuses more on the
entire package than just storage. With Azure, customers gain access to an integrated delivery pipeline for
sourcing, testing, delivery and go-live. Taking advantage of Azure’s delivery pipeline allows for greater
business continuity and better integration of data and workflows. It also fosters opportunities to create a
true DevOps culture.
Disadvantages of Microsoft Azure Cloud Test Labs
Despite all the benefits of running a test lab on Azure, there are some Hyper-V features that are not
supported on Azure, along with disadvantages that are associated with running test labs in a remote
datacenter.
Microsoft provides images, which are updated on a regular basis and can be used to provision
virtual machines (VMs). If you manage the status of VMs carefully, costs can be kept to a minimum.
As long as servers are specified correctly for the task, VMs run significantly faster than anything
that could be provisioned in a typical office lab.
Naturally the expandability of the cloud allows organizations to provision test labs quickly, without
having to worry about costs or logistics of providing staff with physical hardware, and the ability to
manage and provision VMs using PowerShell is a plus.
CHAPTER 3
GANTT CHART
Secure FTP server solution using SSL/TLS. Fast deployment with secure access
Secure FTP Server is a full featured FTP server with support for secure SSL/TLS connections, IP security,
anti-FXP options, per-user speed limits, user groups and MODE-Z compression. It provides a plain, but easy
to use interface.
Secure FTP Server is a server that supports FTP and FTP over TLS which provides secure encrypted
connections to the server. This FTP VM image is deployed on Windows 2016 and is fully configured for quick
and easy deployment.
Secure FTP supports TLS, the same level of encryption supported by your web browser, to protect your data.
When using TLS your data is encrypted so that prying eyes cannot see it, and your confidential information is
protected. It also supports on-the-fly data compression, which can improve the transfer rates.
Features:
Ports
The following ports are required to be opened if you are using an NSG or firewall appliance:
21 (FTP)
990 (FTPS)
14147 (Optional - For FTP Server Admin Interface)
50000 - 51000 (Passive Mode for data transfer)
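As a sanity check, the required ports above can be expressed as data and tested before NSG or firewall rules are written. This is a hedged Python sketch; the port numbers come from the list above, but the helper function itself is illustrative only, not part of any Azure tooling.

```python
# Required ports from the list above, expressed as data (sketch only; the
# actual rules live in your NSG or firewall appliance).
REQUIRED_PORTS = {21, 990, 14147}        # FTP, FTPS, optional admin interface
PASSIVE_RANGE = range(50000, 51001)      # passive-mode data transfers

def is_ftp_port_required(port: int) -> bool:
    """Return True if the port belongs to the FTP server's required set."""
    return port in REQUIRED_PORTS or port in PASSIVE_RANGE

assert is_ftp_port_required(21)       # control channel
assert is_ftp_port_required(50500)    # inside the passive range
assert not is_ftp_port_required(80)   # unrelated traffic stays closed
```

Keeping the port list as data like this makes it easy to cross-check the NSG rules you actually create against the list the server expects.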
Please read the configuration steps that you'll need to perform after the install at: How to set up an FTP
server on Azure.
Once your Azure VM has been deployed, there are some post-configuration steps to complete to
start using this FTP server.
Login
Login using the credentials that were supplied during the VM creation
Passive Mode
You should now be connected. You may see connection errors and NAT errors; this is normal, as we still
need to complete some configuration. From the menu, select:
You'll need to set a passive mode port range, usually 50000-51000. These ports are used for data
transfers to the server.
Once you have a public IP address associated with the NIC on your Azure VM, add the IP address to
the passive mode settings, as shown below highlighted in yellow, along with the passive port range:
Within the FileZilla Server options, click on SSL/TLS settings. Check Enable FTP over TLS
support (FTPS).
Next, click on Generate New Certificate and fill in your company information.
IMPORTANT – In the Common name (Server address) field, make sure to add the public DNS name
of your Azure VM. This can be found in the Azure portal, as highlighted in yellow:
Save the key locally on the server and then press Generate certificate. There is no need to add a password.
Setup Users
There are 2 options:
1. Create local users and assign access
2. Integrate Active Directory and allow users to use their domain logins to authenticate
Option 2 (Active Directory Integrated)
Open Settings > LDAP and select Enable LDAP support (beta).
Add the private IP address of your local domain controller, add port 389, and enter the name of
your domain.
Next you need to add the users who need access to your FTP directories.
Select Edit > Users, and here you'll need to add each user's full UPN that they use to log on to AD. For
example, if their name is jsmith@yourdomain.com or yourdomain\jsmith, we need to make sure we
add it so that it matches their login UPN, jsmith@yourdomain.com. We don't need to add their
password here, as authentication happens against Active Directory, so make sure the password checkbox is
unchecked.
Next check the boxes LOCAL and LDAP as in the screenshot below.
In the screenshot below I’ve added a test user from our AD called fuser and our AD domain is called
yourdomain.com
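The UPN-matching rule described above can be sketched in code. This is an illustrative Python helper, not part of FileZilla Server; the domain yourdomain.com is the placeholder used in the text, and the NetBIOS-to-DNS mapping is an assumption for the sketch.

```python
def to_upn(user: str, default_domain: str = "yourdomain.com") -> str:
    """Normalize 'DOMAIN\\user' or a bare username to UPN form (user@domain).
    The default domain is a placeholder, not a real value."""
    if "@" in user:
        return user.lower()                 # already a UPN
    if "\\" in user:
        domain, name = user.split("\\", 1)
        # NetBIOS names have no dot; map them to the DNS domain we know.
        if "." not in domain:
            domain = default_domain
        return f"{name}@{domain}".lower()
    return f"{user}@{default_domain}".lower()

assert to_upn("jsmith@yourdomain.com") == "jsmith@yourdomain.com"
assert to_upn("yourdomain\\jsmith") == "jsmith@yourdomain.com"
assert to_upn("fuser") == "fuser@yourdomain.com"   # the test user from the AD example
```

A helper like this is useful when bulk-adding users, since FileZilla's LDAP lookup only succeeds when the stored name matches the login UPN exactly.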
Next is to assign these users to your FTP directories they need access to. Click on Shared Folders
within the Users menu and add the local folders and assign permissions they need:
Now would be a good time to test if you can connect using an FTP client. If you can't connect, try
the next step and configure any NSG/firewall rules.
Configure NSG Rules / Firewall Rules
If you have NSG’s or firewall appliances in Azure you will need to open access to the following
ports:
To allow clients to connect, users can use any FTP client, for example FileZilla's FTP client.
Support
For issues regarding setup of this solution, leave a message in the comments below
If you would like to use our managed azure service and let us take care of managing your VMs, get
in contact with us
Disclaimer
This FTP server solution is built using a modified version of FileZilla server open source software.
This solution is provided under the GPLv2 license. The respective trademarks mentioned in the offering
are owned by the respective companies. No warranty of any kind, express or implied, is included
with this software:
– Use at your own risk; responsibility for damages (if any) to anyone resulting from the use of this
software rests entirely with the user.
– The author is not responsible for any damage that its use could cause.
Plan + Pricing
The cost of running this product is a combination of the selected software plan charges plus the
Azure infrastructure costs for the virtual machines on which you will be running this software. Your
Azure infrastructure price might vary if you have enterprise agreements or other discounts.
Starting at
$0.036/hour
The file contains two <publishProfile> sections: one for Web Deploy and another for FTP.
Under the <publishProfile> section for FTP make a note of the following values:
publishUrl (hostname only)
userName (this is the information you are looking for)
userPWD
Below is a publishsettings file from one of my test sites.
Every file has a unique username and password. The user can also reset the password; however, that is
beyond the scope of this post and will be discussed in another post altogether.
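Extracting those three values can also be scripted. The sketch below parses a made-up .publishsettings fragment that has the same shape as the real file; the hostnames and credentials here are placeholders, not values from an actual site.

```python
import xml.etree.ElementTree as ET

# A made-up .publishsettings fragment (placeholder values, not a real site).
SAMPLE = """\
<publishData>
  <publishProfile profileName="mysite - Web Deploy" publishMethod="MSDeploy"
      publishUrl="mysite.scm.azurewebsites.net:443"
      userName="$mysite" userPWD="secret" />
  <publishProfile profileName="mysite - FTP" publishMethod="FTP"
      publishUrl="ftp://waws-prod-xx-001.ftp.azurewebsites.windows.net/site/wwwroot"
      userName="mysite\\$mysite" userPWD="secret" />
</publishData>
"""

def ftp_profile(xml_text):
    """Return (hostname, userName, userPWD) from the FTP publish profile."""
    root = ET.fromstring(xml_text)
    for profile in root.iter("publishProfile"):
        if profile.get("publishMethod") == "FTP":
            # publishUrl carries a path; keep only the hostname part.
            host = profile.get("publishUrl").replace("ftp://", "").split("/")[0]
            return host, profile.get("userName"), profile.get("userPWD")
    raise ValueError("no FTP publishProfile found")

host, user, pwd = ftp_profile(SAMPLE)
```

These are exactly the three fields (Host, User, Password) that Site Manager asks for in the next steps.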
Launch FileZilla.
Under Site Manager click on New Site button and give it a descriptive name.
Under the General tab set the values for the following accordingly
Host: Paste the hostname from publishUrl obtained from the publishsettings file above.
Logon Type: set this to Normal.
User: Paste the userName obtained from the publishsettings file above.
Password: Paste the userPWD obtained from the publishsettings file above.
You will see two folders under the root: LogFiles and site.
The LogFiles folder, as the name indicates, provides storage for the various logging options you see
under the CONFIGURE management page on the Azure portal.
The site folder is where the application resides. To be more specific, the code resides
here: /site/wwwroot.
Thus, Azure Web Sites gives the user the flexibility to create, upload, and download files and folders
on their corresponding site via FTP.
Hardware Requirement:
The FTP/S endpoint for your app is already active. No configuration is necessary to enable FTP/S
deployment.
To open the FTP dashboard, click Deployment Center > FTP > Dashboard.
Get FTP connection information
In the FTP dashboard, click Copy to copy the FTPS endpoint and app credentials.
It's recommended that you use App Credentials to deploy to your app because they are unique to each
app. However, if you click User Credentials, you can set user-level credentials that you can use for
FTP/S login to all App Service apps in your subscription.
Deploy files to Azure
1. From your FTP client (for example, Visual Studio, Cyberduck, or WinSCP), use the connection
information you gathered to connect to your app.
2. Copy your files and their respective directory structure to the /site/wwwroot directory in
Azure (or the /site/wwwroot/App_Data/Jobs/ directory for WebJobs).
3. Browse to your app's URL to verify the app is running properly.
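Steps 1-2 can also be done without a GUI client. The following is a hedged Python sketch using the standard library's ftplib; the host, username, and password are placeholders that would come from the FTP dashboard, and the default remote directory matches the /site/wwwroot path named above.

```python
from ftplib import FTP_TLS

def deploy_file(host, user, password, local_path, remote_dir="/site/wwwroot"):
    """Upload one file to an App Service FTPS endpoint.
    All connection values are placeholders from the FTP dashboard."""
    ftps = FTP_TLS(host)                 # explicit FTPS on the control channel
    ftps.login(user=user, passwd=password)
    ftps.prot_p()                        # encrypt the data channel as well
    ftps.cwd(remote_dir)
    with open(local_path, "rb") as f:
        ftps.storbinary("STOR " + local_path.split("/")[-1], f)
    ftps.quit()

# Usage sketch (placeholder values -- do not run as-is):
# deploy_file("waws-prod-xx-001.ftp.azurewebsites.windows.net",
#             "mysite\\$mysite", "app-password", "index.html")
```

The prot_p() call matters: without it only the login is encrypted, while the file contents would travel in the clear.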
Enforce FTPS
For enhanced security, you should allow FTP over SSL only. You can also disable both FTP and FTPS
if you don't use FTP deployment.
In your app's resource page in Azure portal, select App settings in the left navigation.
To disable unencrypted FTP, select FTPS Only. To disable both FTP and FTPS entirely, select Disable.
When finished, click Save. If using FTPS only you must enforce TLS 1.2 or higher by navigating to
the SSL settings blade of your web app. TLS 1.0 and 1.1 are not supported with FTPS Only.
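On the client side, the TLS 1.2 floor can be enforced explicitly as well. This Python sketch uses the standard ssl module; it illustrates the requirement described above rather than any App Service setting, and the endpoint name in the comment is a placeholder.

```python
import ssl
from ftplib import FTP_TLS

# Build a client-side SSL context that refuses anything below TLS 1.2,
# matching the "FTPS Only" requirement described above (sketch only).
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# An FTP_TLS client created with this context will only negotiate TLS 1.2+:
# ftps = FTP_TLS(host="<your-ftps-endpoint>", context=ctx)
```

With this context in place, a connection to a server that only offers TLS 1.0/1.1 fails during the handshake instead of silently downgrading.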
Create an App Service app and deploy files with FTP using Azure CLI
This sample script creates an app in App Service with its related resources, and then deploys a static
HTML page using FTP. For the FTP upload, the script uses cURL as an example. You can use whatever
FTP tool you prefer to upload your files.
If you don't have an Azure subscription, create a free account before you begin.
If you choose to install and use the CLI locally, you need Azure CLI version 2.0 or later. To find the
version, run az --version. If you need to install or upgrade, see Install the Azure CLI.
Clean up deployment
After the sample script has been run, the following command can be used to remove the resource
group and all resources associated with it.
CHAPTER 5
SYSTEM DESIGN AND IMPLEMENTATION
5.1.2 Level: 1 DFD “Game”:
The Level 1 DFD depicts the main functions of the game. The player provides the game with login info,
which is authenticated and matched with the corresponding personal info of the player.
This information is then passed to the Client Game Engine. The Input Handler is the layer where player
inputs are processed and converted into action information that is evaluated in the Client Game Engine
and, with the necessary room data from the repository, is combined into scene data. This is combined
by the Graphics module with 3D model data to construct the 3D scene to be displayed to the player.
The actions performed by the player are transformed into events and sent to the Server Game Engine by
the Client Game Engine, which also receives the incoming events from the Server Game Engine. The Server
Game Engine is the central part where events from clients are processed and distributed, heavily
using the Game Data Repository as well as the configuration data from the Developer. There is a
further process, the Chat Handler, which handles the chat message traffic among the human and AI
players.
5.1.3 Level: 2 DFD “Server Game Engine”:
This Level 2 DFD presents a detailed view of the Server Game Engine, which is the core process of the
game on the server side. First of all, the Channel Resolver process is responsible for handling the
incoming event request traffic by addressing each request to the relevant channel, which may be
different for each client. The Event Dealer is the part where event information is converted into actual
events and sent to the Event Order, where the events are ordered with respect to time and address
constraints. Those ordered events, with the game data and room information from the repository,
are processed in the Game Mechanics module. The resultant event information is sent back to the Event
Dealer, and is eventually sent to the relevant clients. The Game Mechanics process also interacts with
the AI Engine, sending and receiving event data, while the AI Engine is capable of sending
messages to the Chat Handler. Finally, configuration data from the Developer comes directly to Game
Mechanics and is processed.
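The Event Order process described above can be sketched as a priority queue that releases events ordered by timestamp. This is an illustrative Python sketch; the class and method names are ours, not taken from the actual design, and the address constraints are omitted for brevity.

```python
import heapq

class EventOrder:
    """Sketch of the Event Order process: events may arrive out of order from
    the Event Dealer and are released ordered by timestamp."""

    def __init__(self):
        self._heap = []

    def push(self, time, payload):
        """Accept an event at any time; the heap keeps earliest-first order."""
        heapq.heappush(self._heap, (time, payload))

    def drain(self):
        """Release all pending events in timestamp order (for Game Mechanics)."""
        out = []
        while self._heap:
            out.append(heapq.heappop(self._heap))
        return out

# Events arrive out of order, but are processed in time order.
q = EventOrder()
q.push(3.0, "use item")
q.push(1.0, "walk forward")
q.push(2.0, "jump")
assert [p for _, p in q.drain()] == ["walk forward", "jump", "use item"]
```

A heap keeps each insertion and removal at O(log n), which is why event queues in game servers are commonly built this way rather than re-sorting a list on every arrival.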
5.1.4 Level: 2 DFD "Client Game Engine":
This Level 2 DFD shows the inner dynamics of the Client Game Engine, which is the core process of the
game on the client side. Initially, the authentication decision is sent to Client Game Mechanics. From
then on, action info from the Input Handler is validated and sent to Client Game Mechanics.
When changed, the current room state is sent to the State Screen Transformer, combined with static room
info from the repository, and sent to Graphics. Events are handled similarly to those on the server side:
event requests are made by Client Game Mechanics to the Server Game Engine, and the incoming
event information is received in the Event Dealer. After being converted to events, they are sent to
the Event Order and ordered. Game Mechanics receives and processes the ordered events.
5.2 Use Case Diagrams:
The use case diagrams provided below are in direct correspondence with each of the functional requirements
items discussed in the Project Requirements section of the report. The reader is encouraged to revisit that
section, if necessary.
Items are an important issue in the game. There are two types of items: wall items are non-moveable
and can only be used, while inventory items are moveable. An inventory item can be picked up, dropped,
equipped, or unequipped. Only one item can be equipped at a time, and only the equipped item can be used.
The last action is the menu entrance: the player can enter the menu whenever he wants.
This use case displays the actions that an AI player can perform in the game. The AI player in our game is a
bit restricted: it can only perform movement actions, which are walk, jump, or crouch. Of course, the
directions are forward, backward, left, and right.
This use case displays the player's login to the game. The authentication server checks the player's username
and password and, according to the check, either allows or disallows the player's entrance to the game.
This use case displays the things a player can do in the Main Access Menu. He can view his profile, edit it,
and accept the changes. He can also view his elapsed time in the cube. Other functionalities are entering and
leaving the game.
This use case displays the chat usage in the game. A player can chat with other players in the game.
This chat can be public, which includes all players in a room, or private, which involves only
two players. A player can also interact with AI players in the game; this interaction is, of
course, limited.
Fig 5.2.5 Chat Use case
Microsoft Azure:
Introduction:
Microsoft Azure is Microsoft's cloud computing platform, providing a wide variety of
services you can use without purchasing and provisioning your own hardware. Azure
enables the rapid development of solutions and provides the resources to
accomplish tasks that may not be feasible in an on-premises environment. Azure's
compute, storage, network, and application services allow you to focus on building
great solutions without the need to worry about how the physical infrastructure is
assembled.
An online management portal provides the easiest way to manage the resources you deploy into Azure. You can use
this to create virtual networks, set up Web Apps, create VMs, define storage accounts, and so on, as listed in the
previous section. As noted earlier in this chapter, there are currently two versions of the portal. The production
portal is the Azure portal at https://portal.azure.com. Most features have been moved to the Azure portal, with
some exceptions such as Azure AD. The previous portal is called the classic Azure portal
(https://manage.windowsazure.com), and it can still be used to manage Azure AD and to configure and scale classic
resources such as Cloud Services.
Introduction:
FileZilla is a free and open source FTP client for Windows, Mac and Linux. It is developed and maintained by
Tim Kosse and the FileZilla team. Development started in 2001 and it has evolved to become one of the most
popular FTP clients in use today.
FTP
SFTP
FTP-SSL
Installation:
1. Download the desired edition of the FileZilla client. For use with ExaVault, the standard (free!) version of
the client will have all the features you need.
2. Double-click the downloaded file.
3. Follow the installation prompts. Use the default options for installation.
Hit Download.
Now run your FileZilla setup and install it. After installation, you will see this window on your system.
A web page (also written as webpage) is a document that is suitable to act as a web resource on the World Wide
Web. When accessed by a web browser it may be displayed as a web page on a monitor or mobile device.
FileZilla is a free-software, cross-platform FTP application, consisting of FileZilla Client and FileZilla Server.
Client binaries are available for Windows, Linux, and macOS; server binaries are available for Windows only.
Both server and client support FTP and FTPS (FTP over SSL/TLS), while the client can in addition connect to
SFTP servers.
Windows Azure supports deploying websites from remote computers using Web Deploy, FTP, GIT or TFS. Many
development tools provide integrated support for publication using one or more of these methods and may only
require that you provide the necessary credentials, site URL and hostname or URL for your chosen deployment
method.
Credentials and deployment URLs for all enabled deployment methods are stored in the website's publish profile, a
file which can be downloaded in the Windows Azure (Preview) Management Portal from the Quick Start page or
the quick glance section of the Dashboard page.
If you prefer to deploy your website with a separate client application, high quality open source GIT and FTP clients
are available for download on the Internet for this purpose.
Prerequisites:
Procedure:
Step 6: The main parts of the web app are the App Service plan and the location. To select the App Service
plan, click on Create new.
Step 7: The benefits and costs of the App Service plan vary between different tiers of web apps, which makes
this even more confusing. So before you start loading up your App Service plans with multiple apps, make
sure you are aware of how your chosen plan implements this. At the present time there are three tiers in
Azure Web Apps:
Dev/test
Production
Isolated
Step 8: In the App Service plan, set the plan name, location, and pricing tier.
We set location = South India and pricing tier = Free (in Dev/Test).
Step 9: After setting the App Service plan and location, click the Create button to create the web app.
Step 10: After creating the web app, go to our web app and click Quickstart.
Step 11: After that, click Deployment credentials to set the username and password.
Step 12: In Deployment credentials, set the username and password, then click OK.
Step 13: Then go to the Azure portal and, in our web app service, click Deployment Center.
Step 14: In the Deployment Center, choose FTP so that the web page can be reached from FileZilla.
Step 15: Under FTP, set the user credentials (username and password) that FileZilla will use to connect.
Step 16: After that, download and install the FileZilla client. To connect to the Azure web app, click the File menu and select Site Manager.
Step 17: In the Site Manager, click New Site and select FTP as the protocol. Enter the host name provided by the Azure web app service, set Encryption to "Only use plain FTP (insecure)", and enter the username and password. Then click Connect.
Step 18: Then click OK to connect.
Step 19: In FileZilla, to upload the web page code, go to the local site pane and select the file.
Step 20: Right-click the selected file, then click Upload.
Step 21: After uploading a new file, the old file is replaced, so the new web page is shown.
Step 22: Then go to our Azure portal, open our web service, and copy the URL provided by the web app.
Step 23: Then paste the URL into the browser and press Enter; the web page will be displayed in the browser.
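The FTP upload that FileZilla performs in steps 16-21 can also be sketched with Python's standard ftplib. The host, username, and password below are placeholders (substitute the values shown in your app's Deployment Center or publish profile), and /site/wwwroot is the folder Azure App Service serves content from:

```python
# Sketch of the FileZilla upload (steps 16-21) in code: connect to the
# App Service FTP endpoint and upload a page into /site/wwwroot.
# FTP_HOST, FTP_USER, and FTP_PASS are placeholders, not real values.
from ftplib import FTP
import os

FTP_HOST = "waws-prod-example.ftp.azurewebsites.windows.net"  # hypothetical
FTP_USER = "mygameapp\\$mygameapp"                            # hypothetical
FTP_PASS = "your-deployment-password"                         # placeholder

def remote_path(local_file, remote_dir="/site/wwwroot"):
    """Map a local file name to its destination under the web root."""
    return remote_dir + "/" + os.path.basename(local_file)

def upload(local_file):
    """Upload one file; Azure serves the site from /site/wwwroot."""
    with FTP(FTP_HOST) as ftp:
        ftp.login(FTP_USER, FTP_PASS)
        with open(local_file, "rb") as fh:
            ftp.storbinary("STOR " + remote_path(local_file), fh)

if __name__ == "__main__":
    upload("index.html")  # the game's landing page
```

This uses plain FTP to match the "Only use plain FTP (insecure)" setting chosen in step 17; ftplib's FTP_TLS class would be the equivalent for FTPS.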
CHAPTER 6
MATHEMATICAL ALGORITHM
Step 5: FileZilla presents several options: Host name, Username, Password, and Port
number.
Step 6: Fill in all the options with the values given in the Azure portal under the App Service.
Step 7: Upload the folder where all the gaming-related HTML pages are stored.
Step 8: Connect to the web server through the given URL and port.
Step 9: Open the given URL in the web browser.
Step 10: The user logs in at the URL to play; after logging in, the user chooses a
game as per choice.
Step 11: On the MS Azure portal, user requests are monitored and managed using a load balancer.
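The load balancing in step 11 can be illustrated with a toy round-robin dispatcher. The actual distribution of requests is configured on the Azure side; the server names here are hypothetical and the code only shows the idea of spreading incoming requests across instances:

```python
# Toy round-robin load balancer: incoming user requests are spread
# across a pool of game-server instances in rotation. This is an
# illustration only, not Azure's actual load-balancing mechanism.
from itertools import cycle

SERVERS = ["game-vm-1", "game-vm-2", "game-vm-3"]  # hypothetical instances

def make_balancer(servers):
    """Return a function that assigns each request to the next server."""
    pool = cycle(servers)
    def assign(request_id):
        return (request_id, next(pool))
    return assign

assign = make_balancer(SERVERS)
routes = [assign(i) for i in range(5)]  # requests 3 and 4 wrap around
```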
Fig 6.1: Internal flow of command
In the above module, when the user gives a command over the internet or cloud, the command interpreter sends the command to the game server, which lies in the cloud. The game server responds to the user's command by displaying the games available on the server, and the user then chooses the game they want to play. Once the user selects a game, the game server sends the screen to the user's device: the video is captured on the server side, the video encoder sends the frames to the user's device, and the frames are decoded in the web portal at the user's end.
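The command-to-frame round trip above can be mimicked with a toy model: a stand-in game server "renders" a frame for the user's command, an encoder compresses it for transport, and the client decodes it for display. zlib stands in for a real video codec here; this is not the project's actual streaming pipeline:

```python
# Toy model of the flow in Fig 6.1: command -> game server -> encoder
# -> network payload -> decoder -> displayable frame.
import zlib

def game_server(command):
    """Pretend to render: return a 'frame' describing the game state."""
    return ("frame for command: " + command).encode("utf-8")

def encode(frame):
    """Stand-in for the video encoder (compression for transport)."""
    return zlib.compress(frame)

def decode(payload):
    """Stand-in for the client-side decoder."""
    return zlib.decompress(payload)

def play(command):
    """One round trip: command in, displayable frame out."""
    frame = game_server(command)
    payload = encode(frame)  # what actually crosses the network
    return decode(payload).decode("utf-8")

print(play("select game 3"))  # -> frame for command: select game 3
```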
CHAPTER 7
TEST SPECIFICATIONS
The general aim of testing is to affirm the quality of software systems by systematically
exercising the software in carefully controlled circumstances. Code-based testing (also
known as white-box testing or structural testing) refers to the use of source code for
planning the test cases; it is mostly performed by developers. Specification-based
testing (or black-box testing) is a testing method in which the tester does not know
anything about the internal structure, design, or implementation. Black-box test cases are
derived from the design documents, and black-box testing mainly tests the functionality of
the software.
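To make the black-box idea concrete, here is a minimal sketch. choose_game is a hypothetical helper, not taken from the actual web app, and the test cases are derived only from its stated specification ("return the game matching the player's selection, or None if it is unavailable"), not from its code:

```python
# Black-box testing sketch: the tester sees only the specification of
# choose_game (a hypothetical helper), never its implementation.
def choose_game(selection, available):
    """Spec: return the selected game if available, otherwise None."""
    return selection if selection in available else None

# Spec-derived test cases: valid input, invalid input, and an edge case.
AVAILABLE = ["chess", "racing", "solitaire"]
assert choose_game("racing", AVAILABLE) == "racing"   # valid selection
assert choose_game("poker", AVAILABLE) is None        # unavailable game
assert choose_game("", AVAILABLE) is None             # empty selection
```

A white-box tester, by contrast, would read the `in` check and deliberately exercise both branches of it.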
Test case ID: 1
Test case name: Login to the Azure portal
Test case process: Run web app
Test steps: Browse the URL, select the game, press the playing option
Status: Game will load
Final result: Press play option
Test case ID: 2
Test case name: Login to the App Service
Test case process: Restart the service
Test steps: Number of requests sent by users; monitoring the incoming requests on the website; HTTP server error occurs
Status: Reduced the errors using the load balancer
Final result: Successfully handled the incoming request load
CHAPTER 8
RESULT AND ANALYSIS
In this article, we grouped the existing gaming models in a new way. We studied the new technique and tried to reduce the trouble of playing games. We described the design model of a gaming API for a better gaming experience built on a mash-up of cloud resources. We gave a brief history of cloud gaming services, followed by the design decisions made by representative commercial cloud gaming services. Without these optimizations, a service provider cannot consolidate enough cloud gaming users onto each physical machine. This in turn leads to much lower profits, and may drive the service provider out of business. In summary, advances in technology have turned playable cloud gaming services into reality, and further optimization techniques are gradually making cloud gaming services profitable; hence, we believe that we are on the edge of a new era and a whole new cloud gaming ecosystem, which will eventually lead to the next generation of cloud gaming services.
FUTURE SCOPE
With the growth of cloud technology, games are now available via a cloud host or app, which brings the benefit that a game crosses international borders without any import or export taxes, tariffs, or shipping fees. Online
gaming systems, which mix various multimedia such as image, video, audio, and graphics to enable players to
interact with each other over the Internet, are now widely used not just for entertainment, but also for socializing,
business, commerce, scientific experimentation, and many other practical purposes. Gaming is now a multi-billion
dollar industry all over the world.
We are working on implementing further modules in cloud gaming, such as streaming Direct3D games, support for playing different GUI games, scheduling and handling, and the processing performance of the server.
BetOnSoft
Posted: 06-03-2018
Online Gaming Firm Implements Real-Time Analytics and Scales for Planned Growth
BetOnSoft develops and manages more than 110 online casino games, played every day by thousands of players
worldwide. The company needed to ensure that its games are highly available, because players are online around
the clock. BetOnSoft also wanted to prepare for business growth by scaling its database while maintaining
application responsiveness. In addition, its applications must perform key business-critical analytics in real time. In
November 2011, the company deployed a hybrid application solution that takes advantage of the high-availability
features in Microsoft SQL Server 2012 AlwaysOn and the scalability of SQL Azure. Now, the company’s infrastructure
can exceed 10 times its previous peak loads while running intensive real-time data analytics. BetOnSoft has achieved
the availability it needs and can use its hybrid infrastructure to scale to meet unexpected business growth.
"SQL Server 2012 Enterprise with AlwaysOn gives us exactly the performance we need. We can exceed 10 times the previous
peak game-play load.... and still run intensive analytics in real time."
Business Needs
BetOnSoft, an international gaming software provider with presence in 11 countries, is a fast-growing developer of
popular online casino games used by players around the world. The company provides gaming software and
hardware infrastructure to independent operators that market and brand the games. Over the past two years, the
company has launched new operators into the marketplace and acquired existing operators from other software
providers through its superior platform and products.
Currently, BetOnSoft offers more than 110 single-player online games, including slot machines, roulette, blackjack,
and craps. These games can be played on computers or mobile devices.
As a growing player in the e-gaming software market, BetOnSoft needs to maintain high availability for its mission-
critical gaming applications in order to achieve business success. Their operators market to an international player
base, and so there are always players online, 24 hours a day, seven days a week.
Availability was sometimes challenging because when BetOnSoft database administrators would run intensive
maintenance operations such as checking the database for corruption, application timeouts would often occur.
Additionally, when the company would deploy new software, administrators sometimes had to take the application
server offline.
To be competitive, BetOnSoft must be able to be agile and innovative in its technology approach, so it can handle rapid growth
in the number of users playing its games. In fact, as the number of operators using
BetOnSoft services increases, it is likely that aggressive marketing on any given day would create sudden high demand. To
handle such scenarios, BetOnSoft needs the ability to rapidly scale up or down. “We have more than doubled the number of
operators in the last 12 months,” says Thomas Pullen, Database Administrator at BetOnSoft. “And our expectations are that
we will continue to grow. We needed to make sure that our database software and servers had the capacity to scale rapidly.”
BetOnSoft also sought to out-innovate its competitors by implementing rich functionality for operators and players
alike, much of which depends on complex data analysis to produce results in real-time.
To increase availability, scalability, and performance for its multi-terabyte database, in early 2011 BetOnSoft decided
to implement a new technology solution.
Solution
BetOnSoft began deploying a new solution in July 2011, when it upgraded its database servers and database
software. At that time, the company implemented Microsoft SQL Server 2008 R2 x64 Enterprise data management
software on two Dell PowerEdge R810 server computers with four 8-core processors and 256 gigabytes of RAM.
Each server contains a 640-gigabyte and a 1.2-terabyte memory card made by Fusion-io, a storage-memory company
based in Salt Lake City, Utah. Fusion-io memory cards can improve processing capabilities in a data center by
relocating active data from centralized storage to the server where it is being processed. This can help reduce
latency while also increasing data-center efficiency.
In late 2011, BetOnSoft decided to upgrade further to Microsoft SQL Server 2012 Enterprise. “We had been very
happy with SQL Server 2008 R2 overall,” says Pullen. “But we saw features in SQL Server 2012 that we knew would
help us with availability, scalability, and performance.”
One of those features is SQL Server 2012 AlwaysOn, a new high-availability and disaster-recovery solution through
which customers can query data in replica databases and conduct backup operations from those replicas. AlwaysOn
includes availability groups that support a failover environment for a set of user databases that fail over collectively.
This feature also includes the AlwaysOn availability group listener, which contributes to easier application server
configuration and redundancy.
Additionally, AlwaysOn provides readable database mirror capabilities. The replica databases provide read-only access for use
in reporting and backup, which serves to offload some of the primary server’s workload.
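As a rough illustration of how an application opts into those readable replicas, the snippet below builds an ODBC-style connection string with the ApplicationIntent=ReadOnly keyword, which asks an availability-group listener to route the connection to a readable secondary. The listener and database names are made up; this is not taken from BetOnSoft's actual configuration:

```python
# Sketch: connection strings against an AlwaysOn availability-group
# listener. ApplicationIntent=ReadOnly routes reporting queries to a
# readable secondary replica instead of the primary. Names are fictional.
def connection_string(server, database, read_only=False):
    """Build an ODBC-style SQL Server connection string."""
    parts = [
        "Driver={ODBC Driver 17 for SQL Server}",
        f"Server={server}",
        f"Database={database}",
        "Trusted_Connection=yes",
    ]
    if read_only:
        parts.append("ApplicationIntent=ReadOnly")  # target a readable replica
    return ";".join(parts)

oltp = connection_string("ag-listener", "Gaming")                  # primary
reporting = connection_string("ag-listener", "Gaming", read_only=True)
```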
“SQL Server 2012 AlwaysOn was the key driver for us,” Pullen says. “Between the availability groups, the readable mirror for
offloading reporting and database checking, and the listener, we knew we would increase our availability with SQL Server
2012.”
“Typical data architectures for e-commerce applications involve a high-throughput online transaction processing (OLTP)
database from which data is fed into a downstream data warehouse,” says Devan Govender, Chief Software Architect,
BetOnSoft. “Data analysis is then usually run on the warehouse, which can be several seconds or even minutes behind. Even
small delays are not tolerable in the market-leading gaming products we are building.”
BetOnSoft architected its applications around SQL Server 2012 AlwaysOn, Fusion-io storage, and strategic hardware and
network configurations to take advantage of the benefits provided by this platform and achieve its product and performance
goals. Govender says, “SQL Server 2012 AlwaysOn is a key part of our solution to achieve real-time results.”
Prior to implementing SQL Server 2012, BetOnSoft held two series of testing engagements in Oxford, UK in April and May 2011.
Then, in September 2011, BetOnSoft worked with Microsoft to test SQL Server 2012 in a production environment at a
laboratory session at Microsoft headquarters in Redmond, Washington.
The goal of the Redmond lab was to validate that the technology could support at least 10 times current gameplay workload at
BetOnSoft, while still able to perform intensive data analytics in real-time. “We wanted to make sure that the technology could
give us an extra level of availability without any performance penalty to the players,” says Pullen. “And, ultimately, we wanted
to make sure that the solution supported future scale-up throughput requirements that fell within our acceptable application
response times.”
During the lab, BetOnSoft installed SQL Server 2012 instances on each server, activated the AlwaysOn features and set up
availability groups and synchronous secondary instances while activating reporting from a readable database mirror. It also
conducted failover testing. “We really wanted to run a stress test on the availability groups,” states Pullen. “We were driving
SQL Server 2012 to exceed 10 times our peak production load.”
BetOnSoft is also running several critical services on Microsoft SQL Azure, a cloud-based data-storage environment
that provides high availability by storing multiple database copies and providing fast provisioning.
For example, BetOnSoft maintains its error-reporting service in the Windows Azure cloud with data stored in SQL
Azure databases. This service monitors, by geographic region, the number of players worldwide that are
experiencing problems launching or playing games on their computers. “For some services, such as error reporting, it
makes sense to manage that outside the data center,” says Govender. “For example, there could be issues with the
data center that make it inaccessible for error reporting.”
BetOnSoft also runs certain marketing applications on Windows Azure, where demand can spike as a result of
campaigns run by marketing partners. “It was a no-brainer for us to run services that have unpredictable demand in
the cloud,” says Govender. “We scale up to meet demand and back down when demand subsides.”
Another Windows Azure service is used to collect statistics on the quality of connections to the company’s games.
Statistics are collected for download rates, latency, and number of connection errors.
BetOnSoft also has a Windows Azure monitoring service that collects data on transaction rates, the number of games played,
and other activity metrics in a SQL Azure database. It constantly analyses these metrics to detect and send alerts about any
anomalies that require attention.
Benefits
With the new SQL Server 2012 solution, BetOnSoft can process more than 10 times its previous peak workload while
running real-time data analysis. The solution also increases availability and gives BetOnSoft the capacity to scale for
growth. Additionally, the company has easier IT administration and can provide better service to its operators.
During lab testing, BetOnSoft was able to exceed its target of 10 times its current production workload. “SQL Server 2012
Enterprise with AlwaysOn gives us exactly the performance we need,” says Pullen. “We can sustain more than 10 times the
current peak game-play load and still run intensive analytics in real-time.”
This performance is aided by the Fusion-io memory card, which contributes to low database latency because it does
not rely on SAN storage. “Using local attached storage helps BetOnSoft get the throughput it needs with SQL Server
2012,” says Josh Miner, Director of Product Marketing, Fusion-io. “With reduced latency, the server computers get
data faster and can process that data hundreds of times per millisecond. That contributes to faster and more
consistent response times for BetOnSoft game players.”
SQL Server 2012 AlwaysOn gives BetOnSoft the enterprise-level robustness it needs to ensure high availability for the
company’s mission-critical online gaming applications. “Before we upgraded our servers and implemented SQL Server 2012, I
could not regularly check the database. Whenever I did, it would cause application timeouts,” says Pullen. “Now, with the high
availability we get from SQL Server 2012 AlwaysOn, I can check the database every week, and I can be confident that the
database is corruption free.” BetOnSoft also checked automatic failover time during testing. “We were prepared to accept a
time of two minutes, and it only took 14 seconds, which was a huge win for us,” says Pullen.
Taking advantage of AlwaysOn availability groups, BetOnSoft can also deploy new game features faster than before. “With the
availability group listener, for example, multiple application servers can be configured identically, no matter where the
database is running,” says Pullen. “That further increases availability and helps us avoid downtime when we deploy new
software.”
The company also ensures high availability from its Windows Azure monitoring services. “We’re using SQL Azure for our core
monitoring services, and it helps us ensure the highest availability for our critical services,” says Govender.
SQL Azure also enhances security for those services. “The firewall and security configurations in SQL Azure are great,” says
Govender. “Our cloud-based services are now as secure as our data-center-based services.”
When BetOnSoft tested its new solution prior to going live, it confirmed that SQL Server 2012 could sustain the level of
throughput needed to meet future business growth. “We now have the capacity to add a lot more players and operators while
not losing any application responsiveness,” says Pullen. “If our customer base grows by 10 times, we know we’ll still have great
performance with SQL Server 2012 AlwaysOn.”
And with Windows Azure, BetOnSoft has an added layer that it can use to scale to handle unexpected high demand
for its services. With that capability, BetOnSoft can better compete in the online gaming marketplace. “In our
business, responsiveness and scalability are very important, because we need to retain the same, fast application
performance while more players are playing the games,” Pullen says. “We want to grow our business, and SQL
Server 2012 positions us to do that.”
Simplifies Administration
SQL Server 2012 features like AlwaysOn availability groups and the availability groups listener, which support easier
server configuration and failover management capabilities, will help simplify administration for BetOnSoft database
administrators. “Using the readable mirror in SQL Server 2012, I can check the database frequently and easily, as
well as offload reporting,” says Pullen. “That really reduces time and effort for me, making my job easier from a
management perspective.”
In addition, SQL Server 2012 helped BetOnSoft enhance the services it provides to the operators that run the company’s games.
For instance, BetOnSoft is now able to provide fraud detection, VIP identification, and marketing campaign analysis services to
operators in real time. “Having the SQL Server 2012 readable database mirror functionality makes this easier for us to do,
because we can provide access to reporting data without compromising the primary server,” says Pullen.
With its SQL Azure–based services, the company can use a leaner infrastructure overall and is also gaining valuable
metrics that can be used to improve the user experience. “It’s very valuable for us to see that someone in the United
Kingdom is having a great download experience, while a player in India is having a bad one,” says Govender. “It gives
us a complete and detailed view of the global player experience.”
The company also can better detect issues and anomalies. “We use our SQL Azure–based monitoring service to see where the
problems are and if there are certain trends,” says Govender. “We use these metrics to enhance our services to improve the
overall user experience.”
Ultimately, SQL Server 2012 fulfilled all of the company’s requirements during testing, which confirmed that it was the right
technology to align with BetOnSoft business goals. “We would not have gone live with SQL Server 2012 if we hadn’t had that
success in the testing phase,” says Pullen. “Those results showed us that we were implementing the right technology to meet
our business growth and maintain the high availability and strong performance we need to be competitive in online gaming.”