Blog Entries posted by proximagr

  1. proximagr
    Connect two or more Azure Virtual Networks using one VPN Gateway
    Peering is a feature that allows you to connect two or more virtual networks so that they act as one larger network. In this post we will see how to connect two Azure Virtual Networks using peering and access the whole network through a single VPN Gateway. Virtual Networks can be peered whether or not they are in the same subscription.
    I have created a diagram to help understand the topology.

    We have a Virtual Network, VNET A, with a Site-to-Site VPN to on-premises; it can also have a Point-to-Site connection configured. We have another Virtual Network, VNET B, in the same subscription that we want to connect to it. We can also have a third Virtual Network, VNET C, in a different subscription.

    In short, we need the following peerings with these specific settings:
    At VNETA: peering VNETA to VNETB with “Allow Gateway transit”
    At VNETA: peering VNETA to VNETC
    At VNETB: peering VNETB to VNETA with “Use Remote Gateway”
    At VNETB: peering VNETB to VNETC
    At VNETC: peering VNETC to VNETA with “Use Remote Gateway”
    At VNETC: peering VNETC to VNETB

    In order to be able to connect all those networks and also access them over the VPN connection there are four requirements:
    The account that will be used to create the peerings must have the “Network Contributor” role.
    The address spaces of the virtual networks must be different and must not overlap.
    All Virtual Networks, except the one that has the VPN connection, must NOT have a VPN Gateway deployed.
    Of course, at the local VPN device (router) we need to add the address spaces of all the Virtual Networks that we need to access.
    Let’s lab it:
    HQ 192.168.0.0/16 –> The on-premises network
    VNET A 10.1.0.0/16 –> The Virtual Network that has the VPN Gateway (in my lab it is named “devvn”)
    VNET B 10.229.128.0/24 –> The virtual network in a different subscription from the Gateway (in my lab it is named “Network prtg-rsg-vnet”)
    VNET C 172.16.1.0/24 –> The virtual network in the same subscription as the Gateway network (in my lab it is named “provsevnet”)

    The on-premises network is connected with Site-to-site (IPsec) VPN to the VNETA

    Now we need to connect VNETA and VNETB using VNet peering. In order to have a peering connection we need to create one connection from VNETA to VNETB and one from VNETB to VNETA.
    Open the VNETA Virtual Network, go to the Peerings setting and press +ADD
    Select the VNETB and check the “Allow Gateway transit” to allow the peer virtual network to use your virtual network gateway


    Then go to the VNETB, go to the Peerings setting and click +ADD.
    Select the VNETA Virtual Network and check the “Use Remote Gateway” to use the peer’s virtual network gateway. This way the VNETB will use the VNETA’s Gateway.
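    The same two peerings can also be created with Azure PowerShell instead of the portal. Below is a minimal sketch, assuming the AzureRM.Network module and example resource group names; only the peering directions and switches come from the steps above.

    # Get the two virtual networks (resource group names are examples)
    $vnetA = Get-AzureRmVirtualNetwork -Name "devvn" -ResourceGroupName "vneta-rg"
    $vnetB = Get-AzureRmVirtualNetwork -Name "prtg-rsg-vnet" -ResourceGroupName "vnetb-rg"

    # VNETA -> VNETB, allowing the peer to use VNETA's VPN gateway
    Add-AzureRmVirtualNetworkPeering -Name "VNETAtoVNETB" -VirtualNetwork $vnetA -RemoteVirtualNetworkId $vnetB.Id -AllowGatewayTransit

    # VNETB -> VNETA, using VNETA's gateway to reach on-premises
    Add-AzureRmVirtualNetworkPeering -Name "VNETBtoVNETA" -VirtualNetwork $vnetB -RemoteVirtualNetworkId $vnetA.Id -UseRemoteGateways

    For the cross-subscription peering with VNETC the same cmdlet is used, passing the remote network’s full resource ID from the other subscription.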


    Now we can reach the VNETB network from our on-premises network.
    A multi-ping screenshot:
    From 10.229.128.5 (VNETB) to 192.168.0.4 (on-premises) & the opposite
    From 10.1.2.4 (VNETA) to 10.229.128.5 (VNETB) & to 192.168.0.4 (on-premises)

    The next step is to create a cross-subscription peering between VNETA and VNETC.
    Open the VNETA and create a peering by selecting the VNETC from the other Subscription and check the “allow gateway transit”

    Then go to the VNETC and create a peering with the VNETA and check the “Use Remote Gateway”

    With the two above connections we have connectivity between the on-premises network and the VNETC.
    The final step is to enable connectivity between VNETB & VNETC. To accomplish this, just create one peering from VNETB to VNETC and one from VNETC to VNETB.
    Ping inception:

    In order to have client VPN connectivity to the whole network, create a Point-2-Site VPN at the VNETA. You can follow this guide: Azure Start Point | Point-to-Site VPN
  2. proximagr
    Azure Storage Advanced Threat Protection
    Azure Storage Advanced Threat Protection is a new security feature, currently in Preview. It monitors the Azure Blob Storage accounts. It detects anomalies and uncommon access to the Storage Account and notifies the admins through email.
    All the Azure Storage Advanced Threat Protection monitoring and logs are integrated to the Azure Security Center, including the well known ASC recommendations.
    It’s so easy to enable, just go to the Azure Portal, navigate to your storage account’s Advanced Threat Protection setting and switch it ON!

    After that you can view the alerts at the Security Center, under Threat Protection’s Security Alerts.


    First published at https://www.e-apostolidis.gr/microsoft/azure/azure-storage-advanced-thread-protection/
  3. proximagr
    Azure Backup | Enable backup alert notifications
    Azure Backup generates alerts for all backup events, such as unsuccessful backups. A new option is to configure backup alert notifications, so Azure Backup will send you an email when an alert is generated.
    To enable the backup alert notifications, navigate to the “Backup Alerts” section of the “Recovery Services vault” and click “Configure notifications”.

    There, switch Email notification to On to enable the alerts. Enter one or more recipients separated with semicolons. Choose Per Alert or Hourly Digest. Per Alert will fire an email for every alert instantly, while Hourly Digest means that the notification agent will check for alerts every hour and fire one email with the active alerts.
    Finally, choose the severity of the alerts for which you want to be notified and press Save.


    If you like my content you can follow my blog: e-apostolidis.gr
  4. proximagr
    Azure Portal | Virtual Machines bulk actions
    Azure Portal is a great GUI tool to administer all your Azure resources, and it continues evolving. Here is a very useful tip. Did you know that you can manage Virtual Machines in bulk using the Azure Portal Virtual Machines section? We have virtual machine bulk actions!

    Not only can we assign Tags, Start, Restart, Stop and Delete Virtual Machines in bulk, but we can also configure Change Tracking, Inventory and Update Management!!
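    The same idea works outside the portal too. A minimal Azure PowerShell (AzureRM) sketch, with an example resource group name, that starts every VM in a resource group in one go:

    # Start all VMs in a resource group (resource group name is an example)
    Get-AzureRmVM -ResourceGroupName "my-rg" | ForEach-Object {
        Start-AzureRmVM -ResourceGroupName $_.ResourceGroupName -Name $_.Name
    }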
    Filter out the Virtual Machines needed and just click “Change Tracking” to get a report of all changes that happen inside the VM, like changes to services for Windows, daemons for Linux, applications and files.
    Use the “Inventory” to have a complete inventory of all the installed applications of the VM. Enable consistent control and compliance of these virtual machines.
    Enable the “Update Management” to manage the Updates of the selected Virtual Machines. Create update policies and control the installation of the updates.
  5. proximagr
    Application Security Groups to simplify your Azure VMs network security
    Application Security Groups help manage the security of Azure Virtual Machines by grouping them according to the applications that run on them. It is a feature that allows an application-centric use of Network Security Groups.

    An example is always the best way to understand a feature. So let’s say that in a subnet we have some Web Servers and some Database Servers. The access rules of the subnet’s Network Security Group that allow http, https & database access to those servers will look something like this:

    Using only the Network Security Group functionality, we need to add the servers’ IP addresses to the access lists. There are two major difficulties here:
    For every rule we need to add all the IPs of the servers that it covers.
    If there is an IP address change (e.g. by adding or removing a server), then all the related rules must change.
    Use Application Security Groups
    Now, let’s see how we can bypass this complexity by using Application Security Groups, combined with Network Security Groups.
    Create two Application Security Groups, one for the Web Servers and one for the Database Servers
    At the Azure Portal, search for Application Security Groups

    Provide a name and a Resource Group

    Create one more named Database Servers; the Resource Group will then contain those two Application Security Groups (Web Servers and Database Servers).
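    The same two groups can be created with Azure PowerShell as well. A minimal sketch, assuming example resource group and location values:

    # Create the two Application Security Groups used in this example
    $webAsg = New-AzureRmApplicationSecurityGroup -ResourceGroupName "asg-rg" -Name "WebServers" -Location "westeurope"
    $dbAsg  = New-AzureRmApplicationSecurityGroup -ResourceGroupName "asg-rg" -Name "DatabaseServers" -Location "westeurope"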

    Then go to each Virtual Machine and attach the relevant ASG.
    Click the Virtual Machine and then go to the Networking settings blade, and press the “Configure the application security groups”

    Select the relevant ASG and press save:

    Do the same for all your servers. Finally, open the Network Security Group. Open the https rule; in my example it is the “https2WebServers” rule. Change the Destination to “Application Security Group” and for Destination application security group select the Web Servers.

    In the same way, change the database access rule: for Source add the “Database Servers” ASG and for Destination the “Web Servers” ASG. Now the NSG will look like this:

    From now on, when removing a VM from the Web Servers farm or the Database Servers cluster, there is no need to change anything in the NSG. When adding a new VM, the only thing we need to do is attach it to the relevant Application Security Group.
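    For reference, attaching a NIC to an ASG and pointing an NSG rule at an ASG can also be scripted. A sketch with the same example names, assuming an AzureRM.Network version that supports the ASG parameters on NSG rules:

    # Attach a VM's NIC to the Web Servers ASG (NIC and NSG names are examples)
    $nic = Get-AzureRmNetworkInterface -Name "webvm1-nic" -ResourceGroupName "asg-rg"
    $nic.IpConfigurations[0].ApplicationSecurityGroups = @($webAsg)
    Set-AzureRmNetworkInterface -NetworkInterface $nic

    # Point the https rule at the Web Servers ASG instead of a list of IP addresses
    $nsg = Get-AzureRmNetworkSecurityGroup -Name "subnet-nsg" -ResourceGroupName "asg-rg"
    Set-AzureRmNetworkSecurityRuleConfig -NetworkSecurityGroup $nsg -Name "https2WebServers" `
        -Access Allow -Direction Inbound -Priority 200 -Protocol Tcp `
        -SourceAddressPrefix "*" -SourcePortRange "*" -DestinationPortRange 443 `
        -DestinationApplicationSecurityGroupId $webAsg.Id
    Set-AzureRmNetworkSecurityGroup -NetworkSecurityGroup $nsg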
    A Virtual Machine can be attached to more than one Application Security Group. This helps in cases of multi-application servers.
    There are only two requirements:
    All network interfaces used in an ASG must be within the same VNet
    If ASGs are used in the source and destination, they must be within the same VNet

  6. proximagr
    Secure MySQL and PostgreSQL using Service Endpoints
    In a previous post, Secure your Azure SQL Database inside a VNET using service endpoints, we saw how we can use Azure Virtual Network Service Endpoints to lock down an Azure SQL database so that it is accessible only from an internal network.

    Today, Microsoft Azure announced the general availability of Service Endpoints for MySQL and PostgreSQL. This makes it possible to cut off all public access to MySQL & PostgreSQL and allow access only from our internal network. Of course, a specific Subnet or Subnets can be defined. There is also no extra charge for using Service Endpoints.

    You can read more on the Microsoft Azure Blog: Announcing VNet service endpoints general availability for MySQL and PostgreSQL
  7. proximagr
    MICROSOFT AZURE BLOG: WHAT IS ARTIFICIAL INTELLIGENCE?
    August 10, 2018, Pantelis Apostolidis
     
    This post is reposted from the Microsoft Azure Blog : What is Artificial Intelligence? <azure.microsoft.com/blog/what-is-artificial-intelligence/>
    Aug 9th 2018, 12:00, by Theo van Kraay
    It has been said that Artificial Intelligence will define the next generation of software solutions. If you are even remotely involved with technology, you will almost certainly have heard the term with increasing regularity over the last few years. It is likely that you will also have heard different definitions for Artificial Intelligence offered, such as:
    *“The ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings.”* – Encyclopedia Britannica
    *“Intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans.”* – Wikipedia
    How useful are these definitions? What exactly are “tasks commonly associated with intelligent beings”? For many people, such definitions can seem too broad or nebulous. After all, there are many tasks that we can associate with human beings! What exactly do we mean by “intelligence” in the context of machines, and how is this different from the tasks that many traditional computer systems are able to perform, some of which may already seem to have some level of *intelligence* in their sophistication? What exactly makes the *Artificial Intelligence* systems of today different from sophisticated software systems of the past?

    It could be argued that any attempt to try to define “Artificial Intelligence” is somewhat futile, since we would first have to properly define “intelligence”, a word which conjures a wide variety of connotations. Nonetheless, this article attempts to offer a more accessible definition for what passes as Artificial Intelligence in the current vernacular, as well as some commentary on the nature of today’s AI systems, and why they might be more aptly referred to as “intelligent” than previous incarnations.
    Firstly, it is interesting and important to note that the technical difference between what used to be referred to as Artificial Intelligence over 20 years ago and traditional computer systems, is close to zero. Prior attempts to create intelligent systems known as *expert systems* at the time, involved the complex implementation of exhaustive rules that were intended to approximate* intelligent behavior*. For all intents and purposes, these systems did not differ from traditional computers in any drastic way other than having many thousands more lines of code. The problem with trying to replicate human intelligence in this way was that it requires far too many rules and ignores something very fundamental to the way *intelligent beings* make *decisions*, which is very different from the way traditional computers process information.
    Let me illustrate with a simple example. Suppose I walk into your office and I say the words “Good Weekend?” Your immediate response is likely to be something like “yes” or “fine thanks”. This may seem like very trivial behavior, but in this simple action you will have immediately demonstrated a behavior that a traditional computer system is completely incapable of. In responding to my question, you have effectively dealt with ambiguity by making a prediction about the correct way to respond. It is not certain that by saying “Good Weekend” I actually intended to ask you whether you had a good weekend. Here are just a few possible* intents* behind that utterance:
    – Did you have a good weekend?
    – Weekends are good (generally).
    – I had a good weekend.
    – It was a good football game at the weekend, wasn’t it?
    – Will the coming weekend be a good weekend for you?
    And more.

    The most likely intended meaning may seem obvious, but suppose that when you respond with “yes”, I had responded with “No, I mean it was a good football game at the weekend, wasn’t it?”. It would have been a surprise, but without even thinking, you will absorb that information into a mental model, correlate the fact that there was an important game last weekend with the fact that I said “Good Weekend?” and adjust the probability of the expected response for next time accordingly so that you can respond correctly next time you are asked the same question. Granted, those aren’t the thoughts that will pass through your head! You happen to have a neural network (aka “your brain”) that will absorb this information automatically and *learn* to respond differently next time.
    The key point is that even when you do respond next time, you will still be making a prediction about the correct way in which to respond. As before, you won’t be certain, but if your prediction *fails* again, you will gather new data which leads to my definition of Artificial Intelligence:
    “Artificial Intelligence is the ability of a computer system to deal with ambiguity, by making predictions using previously gathered *data*, and learning from errors in those predictions in order to generate newer, more accurate predictions about how to behave in the future”.
    This is a somewhat appropriate definition of Artificial Intelligence because it is exactly what AI systems today are doing, and more importantly, it reflects an important characteristic of human beings which separates us from traditional computer systems: human beings are prediction machines. We deal with ambiguity all day long, from very trivial scenarios such as the above, to more convoluted scenarios that involve *playing the odds* on a larger scale. This is in one sense the essence of *reasoning*. We very rarely know whether the way we respond to different scenarios is absolutely correct, but we make reasonable predictions based on past experience.
    Just for fun, let’s illustrate the earlier example with some code in R! First, let’s start with some data that represents information in your mind about when a particular person has said “good weekend?” to you.

    In this example, we are saying that *GoodWeekendResponse* is our *score label* (i.e. it denotes the appropriate response that we want to predict). For modelling purposes, there have to be at least two possible values in this case “yes” and “no”. For brevity, the response in most cases is “yes”.
    We can fit the data to a logistic regression model:
    library(VGAM)
    greetings=read.csv('c:/AI/greetings.csv',header=TRUE)
    fit <- vglm(GoodWeekendResponse~., family=multinomial, data=greetings)
    Now what happens if we try to make a prediction on that model, where the expected response is different than we have previously recorded? In this case, I am expecting the response to be “Go England!”. Below, some more code to add the prediction. For illustration we just hardcode the new input data, output is shown in bold:
    response <- data.frame(FootballGamePlayed="Yes", WorldCup="Yes", EnglandPlaying="Yes", GoodWeekendResponse="Go England!!")
    greetings <- rbind(greetings, response)
    fit <- vglm(GoodWeekendResponse~., family=multinomial, data=greetings)
    prediction <- predict(fit, response, type="response")
    prediction
    index <- which.max(prediction)
    df <- colnames(prediction)
    df[index]

                No  Yes  Go England!!
    1 3.901506e-09  0.5  0.5
    > index <- which.max(prediction)
    > df <- colnames(prediction)
    > df[index]
    [1] "Yes"
    The initial prediction “yes” was wrong, but note that in addition to predicting against the new data, we also incorporated the actual response back into our existing model. Also note, that the new response value “Go England!” has been *learnt*, with a probability of 50 percent based on current data. If we run the same piece of code again, the probability that “Go England!” is the right response based on prior data increases, so this time our model *chooses* to respond with “Go England!”, because it has finally learnt that this is most likely the correct response!
                No  Yes        Go England!!
    1 3.478377e-09  0.3333333  0.6666667
    > index <- which.max(prediction)
    > df <- colnames(prediction)
    > df[index]
    [1] "Go England!!"
    Do we have Artificial Intelligence here? Well, clearly there are different *levels* of intelligence, just as there are with human beings. There is, of course, a good deal of nuance that may be missing here, but nonetheless this very simple program will be able to react, with limited accuracy, to data coming in related to one very specific topic, as well as learn from its mistakes and make adjustments based on predictions, without the need to develop exhaustive rules to account for different responses that are expected for different combinations of data. This is the same principle that underpins many AI systems today, which, like human beings, are mostly sophisticated prediction machines. The more sophisticated the machine, the more it is able to make accurate predictions based on a complex array of data used to *train* various models, and the most sophisticated AI systems of all are able to continually learn from faulty assertions in order to improve the accuracy of their predictions, thus exhibiting something approximating human *intelligence*.
    Machine learning
    You may be wondering, based on this definition, what the difference is between *machine learning* and *Artificial intelligence*? After all, isn’t this exactly what machine learning algorithms do, make predictions based on data using statistical models? This very much depends on the definition of *machine learning*, but ultimately most machine learning algorithms are *trained* on static data sets to produce predictive models, so machine learning algorithms only facilitate part of the dynamic in the definition of AI offered above. Additionally, machine learning algorithms, much like the contrived example above, typically focus on specific scenarios, rather than working together to create the ability to deal with *ambiguity* as part of an *intelligent system*. In many ways, machine learning is to AI what neurons are to the brain. A building block of intelligence that can perform a discrete task, but that may need to be part of a composite *system* of predictive models in order to really exhibit the ability to deal with ambiguity across an array of behaviors that might approximate to *intelligent behavior*.
    Practical applications
    There are a number of practical advantages in building AI systems, but as discussed and illustrated above, many of these advantages are pivoted around “time to market”. AI systems enable the embedding of complex decision making without the need to build exhaustive rules, which traditionally can be very time consuming to procure, engineer and maintain. Developing systems that can “learn” and “build their own rules” can significantly accelerate organizational growth.
    Microsoft’s Azure cloud platform offers an array of discreet and granular services in the AI and Machine Learning domain <docs.microsoft.com/en-us/azure/#pivot=products&panel=ai>, that allow AI developers and Data Engineers to avoid re-inventing wheels, and consume re-usable APIs. These APIs allow AI developers to build systems which display the type of *intelligent behavior* discussed above.
    If you want to dive in and learn how to start building intelligence into your solutions with the Microsoft AI platform, including pre-trained AI services like Cognitive Services and the Bot Framework, as well as deep learning tools like Azure Machine Learning, Visual Studio Code Tools for AI, and Cognitive Toolkit, visit AI School <aischool.microsoft.com/learning-paths>.
  8. proximagr
    Monitor & Alert for your Azure VM
    Let’s see how easy it is to monitor and create an alert, in order to be notified when your VMs are restarted, when they start, stop, hit high CPU or memory usage, and much more.
    First navigate to the Azure Portal https://portal.azure.com, and then click the Monitor button.

    You will be navigated to the Monitor blade. At the center of the screen you will see three main buttons, each of which starts a wizard.

    Click the “Create Alert” under the Explore monitoring essentials, the first of the three buttons.

    The create rule wizard will start. First you need to Select target.

    Select the subscription, at the Filter resource type select Virtual machines and select the VM from the Resource list.

    Once you select the target VM you will see a preview of the selection and the available signals.

    After the alert target, select the criteria

    At the Configure signal logic blade, select the signal from the list. I have selected Restart Virtual Machine.

    Once you select the signal you can select the severity level and also you will see the preview of the condition.

    After that give a name and a description for the alert. Also select the resource group where the alert will be saved and if you want the alert to be enabled upon creation.

    The next step is to create an action group. The action group is the list of accounts that get the notifications when the alert is triggered. The notification can be email, SMS, push notification or voice call. You can add many action groups and many actions in each group.
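    Action groups can also be created with Azure PowerShell (the AzureRM.Insights module). A minimal sketch under that assumption; the names, resource group and recipient address below are examples:

    # One email receiver and an action group that uses it (names and address are examples)
    $emailReceiver = New-AzureRmActionGroupReceiver -Name "admin-email" -EmailReceiver -EmailAddress "admin@contoso.com"
    Set-AzureRmActionGroup -ResourceGroupName "monitor-rg" -Name "vm-alerts-ag" -ShortName "vmalerts" -Receiver $emailReceiver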


    Now the alert is ready. Once the alert is triggered you will be notified. In this example I added an email alert, and once the VM restarted I received the following email:

    More Microsoft Azure guides at Apostolidis IT Corner
     
  9. proximagr
    Azure Storage | Static Web Site
    Microsoft Azure announced the ability to host static websites directly on Blob Storage, at the cost of Blob Storage! What does this mean? For 1 GB of space and 100,000 views, the cost is about 0.05 euro per month!
    In static websites, besides static content, we can also have client-side scripting such as JavaScript, but not server-side scripting. We can also define a custom page to be returned instead of a 404.
    You can estimate the cost with the Azure Pricing Calculator at https://azure.microsoft.com/en-us/pricing/calculator/

    What do we need? Just a Storage Account V2.

    Once the Storage Account is created, we first enable the Static website option in the Storage Account’s settings. As soon as we press Save, a virtual directory named $web is created. We click it to enter the blob container and upload our content. We also note the Primary endpoint, because it is also the URL of our site.

    To upload content to the $web blob container we can use Storage Explorer.
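    Alternatively, the upload can be done with Azure PowerShell. A minimal sketch, assuming the Azure.Storage module and example account details; the account name here matches the endpoint used later in this post:

    # Build a storage context and upload index.html to the $web container
    $ctx = New-AzureStorageContext -StorageAccountName "proximagr" -StorageAccountKey "<account-key>"
    Set-AzureStorageBlobContent -File ".\index.html" -Container '$web' -Blob "index.html" `
        -Properties @{ ContentType = "text/html" } -Context $ctx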

    And we are ready. Browse to the Static website URL; in my example it is https://proximagr.z6.web.core.windows.net/

    Of course we can use our own domain. First we create a CNAME record that points to the endpoint and then we go to Custom domain, where we enter our CNAME.

    And the result:

     
  10. proximagr
    Azure Start Point | Point-to-Site VPN
    In this post series we will go through some basic steps on how to start with Microsoft Azure. In this post we will see how to create a Point-to-Site VPN connection with Azure.
    If you don’t have an Azure Subscription, you can easily create a free trial by just going to https://azure.microsoft.com/en-us/free/
    Create a typical Virtual Network

    In order to create a Point-to-Site VPN connection, a Virtual Network Gateway is needed. Go to the Virtual Network’s Subnets and add a Gateway Subnet.

    Finally, we can add the Virtual Network Gateway. From the portal, create a Virtual Network Gateway resource and add it to the previously created Virtual Network.

    The Virtual Network Gateway can take up to 45 minutes to be created.
    Once the Virtual Network Gateway is created we need one more step: configuring Point-to-Site. Open the Virtual Network Gateway and press Configure.

    We will need a root and a client self-signed certificate to complete the setup. Using a Windows 10 or Windows Server 2016 machine we can make use of the New-SelfSignedCertificate cmdlet, which makes the process easy. The whole process is described here: https://docs.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-certificates-point-to-site
    For the root certificate run the below PowerShell using ISE:
    $cert = New-SelfSignedCertificate -Type Custom -KeySpec Signature `
    -Subject "CN=prodevrootcert" -KeyExportPolicy Exportable `
    -HashAlgorithm sha256 -KeyLength 2048 `
    -CertStoreLocation "Cert:\CurrentUser\My" -KeyUsageProperty Sign -KeyUsage CertSign
    For the client certificate, which must be signed by the root certificate created above, run the below PowerShell using ISE in the same session (the subject name prodevclientcert is just an example):
    $clientCert = New-SelfSignedCertificate -Type Custom -DnsName prodevclientcert -KeySpec Signature `
    -Subject "CN=prodevclientcert" -KeyExportPolicy Exportable `
    -HashAlgorithm sha256 -KeyLength 2048 `
    -CertStoreLocation "Cert:\CurrentUser\My" `
    -Signer $cert -TextExtension @("2.5.29.37={text}1.3.6.1.5.5.7.3.2")
    Export the root certificate public key in cer format using MMC, open the Certificates snap-in and select “current user”. Find the root certificate under Personal –> Certificates and right click –> All Tasks export

    Select to “not export the private key” and use Base64 encoded.

    Export the client certificate by selecting “export the private key” , select the “include all certificates in the certification path” and the “enable certificate privacy”. Add a password and export it to pfx file.

    This pfx file must be installed on all the client computers that will use this Point-to-Site connection.
    Now let’s go back to the Point-to-Site configuration page. Add an address pool that the VPN clients will use. This subnet must be different from the Virtual Network address space.

    Then open the root certificate (the .cer file) using Notepad and copy the text between the Begin and End marks.
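    As a shortcut, the same Base64 string can be taken straight from the certificate object in PowerShell, assuming $cert still holds the root certificate created earlier:

    # Base64-encode the root certificate's public data (this is the text pasted into the portal)
    $publicCertData = [System.Convert]::ToBase64String($cert.RawData)
    $publicCertData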

    Paste the certificate text into the “Root certificates” –> “Public certificate data” field and add a name in the “Name” field.

    Press Save and the “Download VPN Client” button will be enabled and we can download the VPN client.
    In order to establish the VPN connection we need to install the VPN Client and the Client “pfx” certificate to the workstation.
  11. proximagr
    Azure Start Point | Your first Web App
    In this post series we will go through some basic steps on how to start with Microsoft Azure. To start, we will create a Web App.
    If you don’t have an Azure Subscription, you can easily create a free trial by just going to https://azure.microsoft.com/en-us/free/
    Let’s create our first Web App. Go to the Azure Portal by navigating to https://portal.azure.com and click “+ Create a resource”

    In the search box type “Web App” and press Enter

    In the search results, click “Web App” and at the next screen just press “Create”

    The “Web App Create” wizard will open. Enter a name for the App. This will be the public name of your App. Azure by default provides the domain *.azurewebsites.net for free.
    So in my example the prowebdev.azurewebsites.net will be the URL of my App
    Select the Azure Subscription that will be used to bill the Web App and a Resource Group. The Resource Group is used to organize the resources and provide role-based access control, among other things.
    OS: Select the Operating System platform that will host your Web App. This can be Windows, Linux or a Docker Container. For the test I will select Windows.
    As you can see, the wizard has selected an App Service Plan by default with a random name and location. The App Service Plan is actually the web server that will host our Web App. Click on the “App Service Plan/Location”
    Add a name for the Web Server, select the Location that is nearest to you (or your clients) and the Pricing Tier.
    By pressing OK you will return to the Web App create wizard; there, press Create. Now you can monitor the creation progress of the App from the “Notifications” option at the top right of the portal; it is the button with the ringing bell image. First you will see the “Deployment in progress…” message and as soon as the App is ready you will see the “Deployment completed” message.
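    The same resources can be created with Azure PowerShell instead of the wizard. A minimal sketch, assuming the AzureRM module and example resource group/plan names (the web app name must be globally unique):

    # Resource group, App Service Plan (the web server) and Web App (the application)
    New-AzureRmResourceGroup -Name "webapp-rg" -Location "West Europe"
    New-AzureRmAppServicePlan -ResourceGroupName "webapp-rg" -Name "prowebdev-plan" -Location "West Europe" -Tier "Standard"
    New-AzureRmWebApp -ResourceGroupName "webapp-rg" -Name "prowebdev" -Location "West Europe" -AppServicePlan "prowebdev-plan"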
    Now if you go to the Resource group you will see two resources. The App Service and the App Service Plan. In high level, the App Service Plan is the web server and the App Service is the Web Application.

    Now click the App Service and at its blade you can see your application’s URL.

    Click the URL and you will see the Demo page

  12. proximagr
    Azure Blob Storage… Recycle Bin!!!!!!!
    Remember all those red alerts when it comes to deleting blobs? Ah, forget them! Microsoft Azure brought the Windows Recycle Bin to Azure and named it soft delete.
    The soft delete feature is basically similar to the Windows Recycle Bin. When you delete a file from Windows Explorer, the Operating System, instead of actually removing the file, moves it to the Recycle Bin. The file stays there and can be undeleted at any time. The soft delete feature in Microsoft Azure does the same thing for blob storage. When data is deleted or overwritten, the data is not actually gone. Instead, the data is soft deleted, thereby making it recoverable if necessary.
    It’s not enabled by default, but it’s very easy to enable. Go to the Storage Account, scroll down to the Blob Service section and select “Soft delete”. Select the retention policy and Save, that’s all!
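    Soft delete can also be turned on with PowerShell. A sketch, assuming a recent Azure.Storage module that includes Enable-AzureStorageDeleteRetentionPolicy, with example account details:

    # Enable blob soft delete with a 7-day retention policy (account name and key are examples)
    $ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"
    Enable-AzureStorageDeleteRetentionPolicy -RetentionDays 7 -Context $ctx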

    Let’s delete and test. Browse a container and click “Show deleted blobs”. The current blob shows as active.

    Deleting the blob changes its status to “deleted”.

    Click the three little dots and you can undelete the blob, in Azure!!!

    Active again!

    Be careful: if you delete the whole container, the storage account or the Azure subscription, there is no return. The soft delete feature works at the blob level inside a container.
    For more details visit the docs: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-soft-delete
  13. proximagr
    Global Azure Bootcamp 2018 – Athens
    This year I am very excited to be part of the organizers’ team of Global Azure Bootcamp 2018, Athens.
    This is a photo at the end of the event with all the organizers, speakers and volunteers:

    The day before the event, the organizers, Kostas Pantos, Paris Polyzos and me, preparing:

    Me and Paris Polyzos, the two Azure MVPs of Greece:

    My presentation’s title was: Azure PaaS: Elasticity & Global Availability
    It is about how to have resilient and globally available apps using Microsoft Azure PaaS that will stay alive even after a full region failure.
    Feel free to download my presentation from here: https://aka.ms/GAB2018Presentation
    And the DEMO:
    Part 1: https://aka.ms/GAB2018DEMOPart2
    Part 2: https://aka.ms/GAB2018DEMOPart01
    More at the azureheads.gr blog: https://www.azureheads.gr/2018/04/global-azure-bootcamp-2018-athens-wrap-up/
  14. proximagr
    Free e-book: Azure Strategy and Implementation Guide
    Microsoft Azure is giving away a free Strategy and Implementation guide for Azure. This e-book provides guidance, architecture and advice on implementing and integrating cloud technologies.
    This guide is aimed at system administrators, cloud architects and project managers. It has four chapters: Governance, Architecture, Application development and operations, and Service management.
    It will help you get started with Azure, or simply with research for any cloud implementation.
    You can download your free copy from this link: https://azure.microsoft.com/en-us/resources/azure-strategy-and-implementation-guide/en-us/
  15. proximagr
    Get early access to large disks support of Azure Backup & more
    Azure Backup’s 1TB limitation is over at last! Now you can back up VMs with disk sizes up to 4TB (4095GB), both managed and unmanaged. There are also improvements in backup and recovery performance, which you can find here.
    Starting today, log in to the Portal, go to your Recovery Services vault and you will see a notification saying “Support for >1TB disk VMs and improvements to backup and restore speed ->”

    Click the notification and the “Upgrade to new VM Backup stack” will open. Here click “Upgrade” to complete the upgrade.

    You can also upgrade all the Recovery Services vaults of a subscription using Azure PowerShell
    1. Select the subscription:
    Get-AzureRmSubscription -SubscriptionName "SubscriptionName" | Select-AzureRmSubscription
    2. Register this subscription for the upgrade:
    Register-AzureRmProviderFeature -FeatureName "InstantBackupandRecovery" -ProviderNamespace Microsoft.RecoveryServices
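    To verify that the registration went through, you can query the feature back; a quick check assuming the same AzureRM modules:

    # RegistrationState shows Registered once the subscription is enabled for the new stack
    Get-AzureRmProviderFeature -FeatureName "InstantBackupandRecovery" -ProviderNamespace Microsoft.RecoveryServices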
  16. proximagr
    Azure App Service, get data from on-premises databases securely
    There are many scenarios where we want to have the Web Application on the Cloud but on the other hand, due to various limitations, the database stays on-premises. Azure has a service, called Azure Hybrid Connections, that allows the Web App to connect to on-premises databases, using internal IP address or the database server host name, without a complex VPN setup.
    The Connection diagram

    I have tested the connection with Microsoft SQL, PostgreSQL, MySQL, MongoDB and Oracle. The database requirement is a static port. So the first step, in the case of a Microsoft SQL instance, is to assign a static port. In my test environment I have a Microsoft SQL 2016 and I assigned the default port 1433, using the SQL Server Configuration Manager / SQL Server Network Configuration / Protocols for INSTANCENAME (MSSQLSERVER)
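    Before wiring up the hybrid connection, it is worth confirming from another on-premises machine that the instance really listens on that static port. A quick check with plain Windows PowerShell (the host name is an example):

    # TcpTestSucceeded should be True if SQL listens on the fixed port 1433
    Test-NetConnection -ComputerName "sqlserver01" -Port 1433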

    All paid service plans support hybrid connections. The limits are on how many hybrid connections can be used per plan, as the table below shows:
    Pricing plan – Hybrid Connections usable in the plan
    Basic – 5
    Standard – 25
    Premium – 200
    Isolated – 200
    To start creating the Hybrid Connections, go to the App Service / Networking / Hybrid Connections and press the “Configure your hybrid connection endpoints”

    At the Hybrid connections blade there are two steps, the first is to “Add hybrid connection” and the second is to “Download the connection manager”.

    First click the “Add hybrid connection” and then press “Create new hybrid connection”

    The “Create new hybrid connection” blade will open. Add a Hybrid connection name; this must be at least 6 characters and it is the display name of the connection. At the Endpoint host add the hostname of the database server and at the Endpoint port the port of the database. In my case I added 1433, as this is the port I assigned to my SQL instance before.
    Finally, you will need to specify a name for a Service Bus namespace (as you can see, the hybrid connection uses Azure Service Bus for the communication) and press OK.

    Once the connection is created it will be shown at the portal as “Not connected”

    Now we need to download and install the hybrid connection manager by clicking “Download connection manager”. For this test I will install the hybrid connection manager on the same server as the SQL database, but for a production environment it is recommended to install it on a different server that has access to the database servers only on the required ports. For the best security, install it on a DMZ server and open only the required ports to the database servers.
    Run the downloaded msi and just click Install.

    Open the “Hybrid connection manager” UI and press “Add a new Hybrid Connection.

    Sign in to your Azure account

    Once logged in, choose your Subscription and the hybrid connection configured previously will appear. Select it and press Save.

    Now the connection manager status will show “Connected”

    The same shows at the Azure Portal, and your hybrid connection is ready.

    Test, test, test and proof of concept. Open the Console from the Web App blade and tcpping the SQL server’s hostname on port 1433

    and also sqlcmd
