
          .NET Developer MVC/Azure/webAPI/Paas
Introduction: For an interesting client in the tech hub of the south, I am looking for a freelance .NET Developer who enjoys working on innovative projects and with innovative technologies. Organization: This organization has been around for many years, and what has always kept it going is its innovative character...
          Microsoft Azure hands-on specialist
Brief description of the work: Our client has developed a cloud roadmap (IaaS/PaaS) and is implementing it on a project basis in 2018/2019. Using an Agile/DevOps approach, topics such as Azure Unmanaged Subscription / Monitoring / Hosting / Infrastructure as Code are being tackled. Within this team we are looking for a hands-on cloud specialist who wants to use his knowledge to take us a big step forward...
          Young Cracovia footballers under special care
The Cracovia Champions Academy is taking a big step forward. Its young footballers will be under the care of Ego Power Lab. They are to be better prepared for senior football and less prone to injuries. The academy's partner is led by Beata Mazurek, a sports dietitian with extensive experience who works with many athletes from various disciplines, including basketball players...
          Comment on Azure DevOps Roadmap update for 2018 Q4 by Dotnetstep
Azure DevOps has great features. I use the wiki a lot, and one thing I would like is the ability to export an entire wiki to PDF.
          Going async with Azure and the PHP SDK for a massive performance boost

Regular readers will know that I make extensive use of Azure Table Storage in both Report URI and Security Headers. As Report URI has grown and we’re now processing billions of reports per month for our users, we’re always interested in performance or efficiency savings wherever possible. We
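The core idea of going async is that instead of paying one network round trip per operation in sequence, you put many independent requests in flight at once and wait for them together. A minimal sketch of that fan-out pattern, in Python with asyncio and aiohttp purely for illustration (the post itself concerns the Azure PHP SDK, and the URLs here are hypothetical placeholders):

import asyncio
import aiohttp

async def fetch(session, url):
    # One request; awaiting lets other requests progress meanwhile.
    async with session.get(url) as resp:
        return await resp.text()

async def main():
    # Hypothetical endpoints standing in for the real storage calls.
    urls = [f"https://example.com/reports/{i}" for i in range(100)]
    async with aiohttp.ClientSession() as session:
        # All 100 requests are in flight concurrently instead of
        # waiting out 100 sequential round trips.
        results = await asyncio.gather(*(fetch(session, u) for u in urls))
    print(f"fetched {len(results)} responses")

asyncio.run(main())

With sequential calls the total time is roughly the sum of all the round trips; with the fan-out it is closer to the single slowest one, which is where the massive performance boost comes from.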


          The Reconstructive Challenges and Approach to Patients With Excoriation Disorder
Abstract: Many mental and emotional disorders have some variations of physical manifestations that are often the first definitive sign of disease. One such disorder is excoriation (skin-picking) disorder, also known as dermatillomania, acne excoriée, neurotic excoriation, or psychogenic excoriation. First identified in the dermatologic literature in 1920, excoriation disorder involves repetitive scratching behavior that sometimes accompanies pruritus and is often associated with depression, anxiety, and obsessive-compulsive disorder.1 In the latest edition of the Diagnostic and Statistical Manual of Mental Disorders, the fifth edition (DSM-5), excoriation (skin-picking) disorder is listed as a stand-alone disorder associated with obsessive-compulsive disorder. In certain patients, the skin lesions are shallow and have adherent crusts that can be mistaken for acne. These lesions, once healed, may appear white and partially atrophic.2 Because these patients often initially present to dermatologists or plastic surgeons for their skin conditions rather than to psychiatric professionals, it is important to recognize the salient diagnostic features and to acknowledge the importance of a multidisciplinary approach to patient care and management. We present a case of a 51-year-old woman with excoriation disorder who required medical management by dermatology, neurosurgery, psychiatry, and plastic surgery for a definitive surgical treatment.
          Evaluation of Platelet-Rich Fibrin in the Treatment of Alveolar Cleft With Iliac Bone Graft by Means of Volumetric Analysis
The purpose of this study was to evaluate the efficiency of platelet-rich fibrin usage in the treatment of alveolar cleft with iliac bone graft by means of volumetric analysis. In this study, 22 patients with alveolar clefts—including 13 unilateral and 9 bilateral—were treated with anterior iliac crest bone grafts. Patients were divided into 2 groups: control (group A) and platelet-rich fibrin (group B). Cone beam computed tomography (CBCT) scans were obtained preoperatively and 6 months postoperatively. Three-dimensional reconstructions of CBCT images were created by using Mimics software. Preoperative alveolar cleft volume and postoperative newly formed bone volume were assessed volumetrically. Preoperative alveolar cleft volumes ranged from 0.51 to 2.04 cm³, with a mean volume of 0.98 ± 0.33 cm³. The percentages of newly formed bone in group B ranged from 50.70% to 80.09%, with a mean percentage of 68.21 ± 10.80%. In group A, the percentages of bone formation ranged from 47.02% to 79.23%, with a mean percentage of 64.62 ± 9.49%. Platelet-rich fibrin can be used in the treatment of alveolar cleft with corticocancellous bone graft harvested from the anterior iliac crest, but in this study, there was no statistically significant difference between the groups for postoperative newly formed bone volume (P > 0.05).
          Surgical Management of the Recent Orbital War Injury
Maxillofacial trauma affects sensitive and essential functions of the human being, such as smell, breathing, talking, and, most importantly, sight. Trauma to the orbit may cause vision loss, especially when the trauma carries high kinetic energy, as encountered during wars. The purpose of this study was to evaluate the surgical outcomes of orbital war trauma, enriching the literature with the authors' experience in this field. A total of 16 patients, injured in the fighting between the Iraqi army and the Islamic State of Iraq and Syria (ISIS) in different areas of Iraq, were evacuated and managed between June 2014 and June 2017. A two-stage protocol of debridement and reconstruction was adopted. There were 14 military patients and 2 civilians. The cause of trauma was either a bullet or shrapnel from an explosion. In the battlefield, delayed evacuation of casualties increased morbidity and mortality. Wearing a protective shield over the eye during the war, along with fast evacuation, greatly improved survival rates.
          Piezosurgery an Asset in Treatment of Pierre Robin Sequence
Abstract: Pierre Robin sequence (formerly a syndrome) is named after the French stomatologist who, in 1923 and 1934, described the problems associated with micrognathia in newborns. It comprises mandibular micrognathia, U-shaped cleft palate, and glossoptosis. The typical symptoms are hypoxaemia, noisy breathing, snoring, stridor, cyanosis, bradycardia, feeding difficulties, and failure to thrive. Distraction osteogenesis has recently been considered as a surgical option for early intervention to lengthen the mandible and relieve respiratory problems. Piezosurgery offers a modality to make precise bone cuts while preserving vital structures, minimizing the invasiveness of the surgical procedure, and offering a bloodless field. We present the case of a malnourished 1-year-old boy with Pierre Robin sequence and a tracheostomy in situ since day 11 of life. The staged treatment plan, described in detail in this article, involved mandibular lengthening in which the mandibular osteotomies were performed with a piezoelectric scalpel, followed by decannulation of the tracheostomy.
          Election Eve Wrangle
Whatever happened to 'vote your hopes and dreams'?


There will be a blue wave, unless there isn't.  The red firewall will break the azure tsunami, unless Trump's hate spew has punched suburban holes in it.  The US House flips (but maybe not), and the Senate stays in Mitch McConnell's terrapin-like appendages, except maybe for a systematic polling error, a la 2016.

No wonder Team Donkey is experiencing some cognitive dissonance.



Who, or what, gets the blame if the Ds can't get it done tomorrow?  Voter suppression, from Georgia to North Dakota to Texas college campuses like Prairie View A&M and Texas State?  Voting machines flipping straight-ticket votes (to Ted Cruz?)  There will still be plenty of finger-pointing at Russian hackers and Green candidates, I feel certain.  Even if some dropped out of their race and endorsed the Democrat.

[The old Catch-22: "Greens should run in state and local races and build up to presidential races" instead of playing spoiler (sic) every four years.  "Greens should drop out and endorse Democrats because this is the most important election of our lifetime".  You know, since the one two years ago.  That was their fault Democrats lost.  Blah blah.]

There may be some less nefarious, more legitimate reasons the election will be won -- or lost; for example, the strength of women voters.  Notable for the demographers, moderate Republican women who live in suburban America turning out to cast their ballots against Trump and the GOP.  No, wait; it's the youth vote.  That's it *snaps fingers*, the children are our future.  Either is better than blaming the Latinxs, after all.  We're all tired of hearing that.

Hold on a minute: this is a midterm election, and Texas Democrats -- who haven't elected one of theirs since, you know, Jim Hightower was Ag Commissioner -- always lose because they can't raise any money for consultants, advisers, pollsters, etc.  Except they did, a shitpot full of dough, in 2018 -- at least those running for Congress; not so much the statewides save Congressman SuperBeto, whose massive Bernie-like ATM machine reversed both the prevailing Texas narrative and the cash flow, doubling the take of Senator Serpent Covered in Vaseline.


The Cult of RFO'R aims for the upset tomorrow evening.  Rumor has it happening.


So as President Shitler is fond of saying: we'll see what happens.  I'm ready for it to be over; how about you?  Here's your roundup of lefty blog posts and news from the final week before E-Day.

==================

One unplumbed premise that the midterms might reveal is whether the strength of the Lone Star grassroots has shifted from one major party to the other, either because of 'outsiders' becoming 'insiders' or because there needs to be a "bad guy" to focus on and motivate the base.

Jim Henson, director of the Texas Politics Project at the University of Texas at Austin, says Democrats nationally — and in some parts of Texas — have unleashed the kind of intensity we used to see from the tea party.

“So the question of whether there is still that ability to motivate Republican voters on the other side is the big question going into this cycle,” Henson said.

Henson believes one reason the tea party’s galvanizing force has slipped in local and congressional races is that conservatives no longer have Barack Obama to target. And Donald Trump has taken over the role of chief agitator of conservatives.

Tea party-backed candidates have also been elected. In Texas, the movement has been changed by that success.

“I think once you have people who are part of institutions, it inevitably looks different, because you aren’t banging from the outside,” Henson said. “Like it or not, you are part of the status quo, and you are part of the establishment.”

Perhaps the only competitive statewide contest down the ballot shows signs of GOP panic, as indicted felon/AG Ken Paxton digs in to the deepest pockets of the friends he has left.

In addition to the TV ads, Paxton’s recent campaign finance filings have indicated that Republicans in high places are tuned in to the race in its home stretch. In recent days, the attorney general has received a $282,000 in-kind donation from Gov. Greg Abbott’s campaign; more than $350,000 in in-kind contributions from Texas for Lawsuit Reform, the political arm of the tort reform group; and $10,000 each from two of the biggest donors in the Republican Party: Sheldon and Miriam Adelson.

Millard Fillmore's Bathtub linked to a picture of Sweaty Beto, which may have been the Halloween costume of the year.


Stirred by Trump's call, armed militia groups head south -- not to welcome the tired, poor, huddled masses yearning to breathe free, but to intercept the invading (sic) migrant caravan.

Asked whether his group planned to deploy with weapons, McGauley laughed. “This is Texas, man,” he said.

Off the Kuff examined a pair of statewide judicial races.

In Harris County, the Texas Observer foresees a day of reckoning for Republican judges who have held fast to the money bail system, rewarding their friends and penalizing the poor.

Isiah Carey of Fox26 was first with the news that Houston mayor Sylvester Turner's first announced challenger next year will be former Democrat*, now (?) not-Trump Republican, non-DWI-convict and megawealthy trial lawyer -- Rick Perry's defense attorney, for those catching up -- Tony Buzbee.  *Lookie here, from Texpatriate:

(D)espite being the one-time Chairman of the Galveston County Democratic Party, a two-time Democrat nominee for the State Legislature and the once rumored Democratic candidate for Lieutenant Governor. However, of late, Buzbee has been appointed to the Board of Regents of his alma mater, Texas A&M University, and become a key financial supporter of both Perry and (Gov. Greg) Abbott.

Durrel Douglas at Houston Justice blogged the 2019 Houston City Council District B early line.  And in an excellent explainer, described how the local activist/consultant game -- getting paid to do politics, that is -- is a lot like having the app on your phone for the jukebox down at the local bar.

Socratic Gadfly, returning from a recent vacation, took a look at a major nature and environment issue that fired up opposition to Trump — the Bears Ears downsizing — and offered his thoughts on the value of the original national monument site versus critics from several angles, and things that could make it even better.

Therese Odell at Foolish Watcher also leavens the politics with some Game of Thrones news.

And Harry Hamid's midnight tale from last week moves ahead to 1 a.m. (with no accounting for Daylight Saving Time and 'falling back' noted).

          How to build a web app using Python’s Flask and Google App Engine

How to build a web app using Python’s Flask and Google App Engine

If you want to build web apps in a very short amount of time using Python, then Flask is a fantastic option.

Flask is a small and powerful web framework (also known as a “microframework”). It is also very easy to learn and simple to code. Based on my personal experience, it was easy to start with as a beginner.

Before this project, my knowledge of Python was mostly limited to Data Science. Yet, I was able to build this app and create this tutorial in just a few hours.

In this tutorial, I’ll show you how to build a simple weather app with some dynamic content using an API. This tutorial is a great starting point for beginners. You will learn how to build dynamic content from APIs and how to deploy the app on Google Cloud.

The end product can be viewed here.



To create a weather app, we will need to request an API key from Open Weather Map. The free version allows up to 60 calls per minute, which is more than enough for this app. The Open Weather Map condition icons are not very pretty, so we will replace them with some of the 200+ weather icons from Erik Flowers instead.



This tutorial will also cover: (1) basic CSS design, (2) basic HTML with Jinja, and (3) deploying a Flask app on Google Cloud.

The steps we’ll take are listed below:

Step 0: Installing Flask (this tutorial doesn’t cover Python and PIP installation)
Step 1: Building the App structure
Step 2: Creating the Main App code with the API request
Step 3: Creating the 2 pages for the App (Main and Result) with Jinja, HTML, and CSS
Step 4: Deploying and testing on your local laptop
Step 5: Deploying on Google Cloud

Step 0 ― Installing Flask and the libraries we will use in a virtual environment

We’ll build this project using a virtual environment. But why do we need one?

With virtual environments, you create a local environment specific to each project. You can choose the libraries you want to use without impacting your laptop environment. As you code more projects on your laptop, each project will need different libraries. With a separate virtual environment for each project, you won’t have conflicts between your system and your projects, or between projects.

Run Command Prompt (cmd.exe) with administrator privileges. Not using admin privileges will prevent you from using pip.
(Optional) Install virtualenv and virtualenvwrapper-win with PIP. If you already have these system libraries, please jump to the next step.

#Optional
pip install virtualenvwrapper-win
pip install virtualenv
Create your folder with the name “WeatherApp” and make a virtual environment with the name “venv” (it can take a bit of time):

#Mandatory
mkdir WeatherApp
cd WeatherApp
virtualenv venv
Activate your virtual environment with “call” on Windows (the equivalent of “source” on Linux). This step switches your shell from the system environment to the project's local environment.

call venv\Scripts\activate.bat
Create a requirements.txt file in your WeatherApp folder that includes Flask and the other libraries we will need, then save the file. The requirements file is also a great way to keep track of the libraries you are using in your project.

Flask==0.12.3
click==6.7
gunicorn==19.7.1
itsdangerous==0.24
Jinja2==2.9.6
MarkupSafe==1.0
pytz==2017.2
requests==2.13.0
Werkzeug==0.12.1
Install the requirements and their dependencies. You are now ready to build your WeatherApp. This is the final step to create your local environment.

pip install -r requirements.txt
Step 1 ― Building the App structure

You have taken care of the local environment. You can now focus on developing your application. This step is to make sure the proper folder and file structure is in place. The next step will take care of the backend code.

Create two Python files (main.py, weather.py) and two folders (static, with a subfolder img, and templates).
Step 2 ― Creating the Main App code with the API request (Backend)

With the structure set up, you can start coding the backend of your application. Flask’s “Hello world” example only uses one Python file. This tutorial uses two files to get you comfortable with importing functions to your main app.

The main.py file is the server that routes the user to the homepage and to the result page. The weather.py file defines a function that calls the weather API to retrieve the weather data for the selected city; its output populates the result page.

Edit main.py with the following code and save:

#!/usr/bin/env python
from pprint import pprint as pp
from flask import Flask, flash, redirect, render_template, request, url_for

from weather import query_api

app = Flask(__name__)

@app.route('/')
def index():
    return render_template(
        'weather.html',
        data=[{'name': 'Toronto'}, {'name': 'Montreal'}, {'name': 'Calgary'},
              {'name': 'Ottawa'}, {'name': 'Edmonton'}, {'name': 'Mississauga'},
              {'name': 'Winnipeg'}, {'name': 'Vancouver'}, {'name': 'Brampton'},
              {'name': 'Quebec'}])

@app.route("/result", methods=['GET', 'POST'])
def result():
    data = []
    error = None
    select = request.form.get('comp_select')
    resp = query_api(select)
    pp(resp)
    if resp:
        data.append(resp)
    if not data:  # no usable response from the weather API
        error = 'Bad Response from Weather API'
    return render_template(
        'result.html',
        data=data,
        error=error)

if __name__ == '__main__':
    app.run(debug=True)

Request a free API key on Open Weather Map.
Edit weather.py with the following code (updating the API_KEY) and save:

from datetime import datetime
import os
import pytz
import requests
import math

API_KEY = 'XXXXXXXXXXXXXXXXXXXXXXXXXXX'
API_URL = ('http://api.openweathermap.org/data/2.5/weather?q={}&mode=json&units=metric&appid={}')

def query_api(city):
    try:
        print(API_URL.format(city, API_KEY))
        data = requests.get(API_URL.format(city, API_KEY)).json()
    except Exception as exc:
        print(exc)
        data = None
    return data

Step 3 ― Creating pages with Jinja, HTML, and CSS (Frontend)

This step is about creating what the user will see.

The weather and result HTML pages are the ones the backend main.py routes to, and they give the app its visual structure. The CSS file brings the final touch. There is no JavaScript in this tutorial (the front end is pure HTML and CSS).

It was my first time using the Jinja2 template library to populate the HTML files. It surprised me how easy it was to bring in dynamic images or use functions (e.g. rounding the weather values). Definitely a fantastic template engine.
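Before building the templates, it helps to see the shape of the JSON that query_api hands them, since the templates reach into it (d['main']['temp'], d['weather'][0]['icon'], and so on). A quick sanity check you can run once weather.py is in place, with a valid API_KEY; the field names follow OpenWeatherMap's documented response format:

# Quick check of query_api's output (run from the WeatherApp folder).
# Field names follow OpenWeatherMap's documented JSON response.
from weather import query_api

resp = query_api('Toronto')
if resp and resp.get('cod') == 200:
    print(resp['name'], resp['sys']['country'])  # e.g. "Toronto CA"
    print(round(resp['main']['temp']), '°C')     # same rounding the template does
    print(resp['weather'][0]['icon'])            # icon code used to pick the SVG
else:
    print('Bad response:', resp)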

Create the first HTML file in the templates folder (weather.html):

<!doctype html>
<link rel="stylesheet" type="text/css" href="{{ url_for('static', filename='style.css') }}">
<div class="center-on-page">
  <h1>Weather in a City</h1>
  <form class="form-inline" method="POST" action="{{ url_for('result') }}">
    <div class="select">
      <select name="comp_select" class="selectpicker form-control">
        {% for o in data %}
        <option value="{{ o.name }}">{{ o.name }}</option>
        {% endfor %}
      </select>
    </div>
    <button type="submit" class="btn">Go</button>
  </form>
</div>

Create the second HTML file in the templates folder (result.html):

<!doctype html>
<link rel="stylesheet" type="text/css" href="{{ url_for('static', filename='style.css') }}">
<div class="center-on-page">
  {% for d in data %}
  {% set my_string = "static/img/" + d['weather'][0]['icon'] + ".svg" %}
  <h1>
    <img src="{{ my_string }}" class="svg" fill="white" height="100" vertical-align="middle" width="100">
  </h1>
  <h1>Weather</h1>
  <h1>{{ d['name'] }}, {{ d['sys']['country'] }}</h1>
  <h1>{{ d['main']['temp']|round|int }} °C</h1>
  {% endfor %}
</div>
How to build a web app using Python’s Flask and Google App Engine
Add a CSS file in the static folder (style.css):

body { color: #161616; font-family: 'Roboto', sans-serif; text-align: center; background-color: currentColor; }
.center-on-page { position: absolute; top: 50%; left: 50%; transform: translate(-50%, -50%); }
h1 { text-align: center; color: #FFFFFF; }
img { vertical-align: middle; }
/* Reset Select */
select { -webkit-appearance: none; -moz-appearance: none; -ms-appearance: none; appearance: none; outline: 0; box-shadow: none; border: 0 !important; background: #2c3e50; background-image: none; }
/* Custom Select */
.select { position: relative; display: block; width: 20em; height: 3em; line-height: 3; background: #2c3e50; overflow: hidden; border-radius: .25em; }
select { width: 100%; height: 100%; margin: 0; padding: 0 0 0 .5em; color: #fff; cursor: pointer; }
select::-ms-expand { display: none; }
/* Arrow */
.select::after { content: '\25BC'; position: absolute; top: 0; right: 0; bottom: 0; padding: 0 1em; background: #34495e; pointer-events: none; }
/* Transition */
.select:hover::after { color: #f39c12; }
.select::after { -webkit-transition: .25s all ease; -o-transition: .25s all ease; transition: .25s all ease; }
button { -webkit-appearance: none; -moz-appearance: none; -ms-appearance: none; appearance: none; outline: 0; box-shadow: none; border: 0 !important; background: #2c3e50; background-image: none; width: 100%; height: 40px; margin: 0; margin-top: 20px; color: #fff; cursor: pointer; border-radius: .25em; }
button:hover { color: #f39c12; }

Download the images into the img subfolder in static.

Link to the images on GitHub:


Step 4 ― Deploying and testing locally

At this stage, you have set up the environment, the structure, the backend, and the frontend. The only thing left is to launch your app and to enjoy it on your localhost.

Just launch main.py with Python:

python main.py

Go to the localhost link shown in cmd with your web browser (Chrome, Mozilla, etc.). You should see your new weather app live on your local laptop :)
Step 5 ― Deploying on Google Cloud

This last step is for sharing your app with the world. It’s important to note that there are plenty of providers for web apps built using Flask. Google Cloud is just one of many. This article does not cover some of the others like AWS, Azure, Heroku…

If the community is interested, I can provide the steps for the other cloud providers in another article, along with some comparisons (pricing, limitations, etc.).

To deploy your app on Google Cloud you will need to 1) Install the SDK, 2) Create a new project, 3) Create 3 local files, 4) Deploy and test online.

Install the SDK following Google’s instructions. Connect to your Google Cloud account (use a $300 coupon if you haven’t already). Create a new project and save the project id (wait a bit until the new project is provisioned).
Create an app.yaml file in your main folder with the following code:

runtime: python27
api_version: 1
threadsafe: true

handlers:
- url: /static
  static_dir: static
- url: /.*
  script: main.app

libraries:
- name: ssl
  version: latest

Create an appengine_config.py file in your main folder with the following code:

from google.appengine.ext import vendor
          Premium blob storage

As a follow-up to my blog Azure Archive Blob Storage, Microsoft has released another storage tier called Azure Premium Blob Storage (announcement).  It is in private preview in the US East 2, US Central, and US West regions.

This is a performance tier in Azure Blob Storage, complementing the existing Hot, Cool, and Archive tiers.  Data in Premium Blob Storage is stored on solid-state drives, which are known for lower latency and higher transactional rates compared to traditional hard drives.

It is ideal for workloads that require very fast access time such as interactive video editing, static web content, and online transactions.  It also works well for workloads that perform many relatively small transactions, such as capturing telemetry data, message passing, and data transformation.

Microsoft internal testing shows that both average and 99th percentile server latency is significantly better than the Hot access tier, providing faster and more consistent response times for both read and write across a range of object sizes.

Premium Blob Storage is available with Locally-Redundant Storage and comes with High-Throughput Block Blobs (HTBB), which provide a) improved write throughput when ingesting larger block blobs, b) instant write throughput, and c) throughput that is unaffected by container and blob names.

You can store block blobs and append blobs in Premium Blob Storage (page blobs are not yet available).  To use Premium Blob Storage you provision a new ‘Block Blob’ storage account in your subscription and start creating containers and blobs using the existing Blob Service REST API and/or any existing tools such as AzCopy or Azure Storage Explorer.

Premium Blob Storage has a higher data storage cost but a lower transaction cost compared to data stored in the regular Hot tier.  This can make it cost-effective, and even less expensive overall, for workloads with very high transaction rates.  Check out the pricing page for more details.

At present, data stored in Premium cannot be tiered to the Hot, Cool, or Archive access tiers.  Microsoft is working on supporting object tiering in the future.  To move data, you can synchronously copy blobs using the new PutBlockFromURL API (sample code) or a version of AzCopy that supports this API.  PutBlockFromURL synchronously copies data server side, which means that the data has finished copying when the call completes, and all data movement happens inside Azure Storage.
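For illustration, here is a minimal sketch of such a server-side copy in Python, assuming the current azure-storage-blob (v12) SDK, where stage_block_from_url is the client surface for PutBlockFromURL; the account, container, and SAS URL below are hypothetical placeholders:

# Server-side copy via PutBlockFromURL (stage_block_from_url in the
# azure-storage-blob v12 Python SDK). Names and URLs are hypothetical.
import base64
from azure.storage.blob import BlobBlock, BlobClient

source_url = "https://srcaccount.blob.core.windows.net/data/report.csv?<sas-token>"
dest = BlobClient.from_connection_string(
    "<destination-connection-string>",
    container_name="data",
    blob_name="report.csv")

block_id = base64.b64encode(b"block-000001").decode()
# The bytes are copied inside Azure Storage; they never transit this client.
dest.stage_block_from_url(block_id=block_id, source_url=source_url)
dest.commit_block_list([BlobBlock(block_id=block_id)])

A single staged block suffices for a small blob; a larger blob would be staged as multiple blocks (each within the service's block size limit) and committed in order.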

Check out the announcement for how to sign up for the private preview.

To summarize the four storage tiers:

  • Premium storage (preview): provides high-performance hardware for data that is accessed frequently
  • Hot storage: optimized for storing data that is accessed frequently
  • Cool storage: optimized for storing data that is infrequently accessed and stored for at least 30 days
  • Archive storage: optimized for storing data that is rarely accessed and stored for at least 180 days, with flexible latency requirements (on the order of hours)
          Telecommute Associate Incident Response Analyst
A technology company needs applicants for an opening for a Telecommute Associate Incident Response Analyst.

Candidates will be responsible for the following:
  • Monitoring, identifying, investigating, and analyzing all response activities
  • Identifying security flaws and vulnerabilities
  • Responding to cybersecurity incidents and conducting threat analysis as directed

Must meet the following requirements for consideration:
  • Familiarity with the challenges of processing security events
  • Familiarity with log formats and host- or network-based intrusion detection systems
  • Knowledge of vulnerability management using tools like Nessus
  • Knowledge of security SIEMs like Microsoft Azure ATP or SecurityCenter
          SSIS Best Online Training (KOLKATA)
SQL School is one of the best training institutes for Microsoft SQL Server Developer Training, SQL DBA Training, MSBI Training, Power BI Training, Azure Training, Data Science Training, Python Training, Hadoop Training, Tableau Training, Machine Learning Training, Oracle PL SQL Training. We have been providing Classroom Training, Live-Online Training, On Demand Video Training and Corporate trainings. All our training sessions are COMPLETELY PRACTICAL. SSIS COURSE DETAILS - FOR ONLINE TRAINING: SQL ...
          Azure File Sync
The IT sector always has certain priority workloads. Security comes first among them, but the closely related topic of backup is perhaps just as important as security. The most fundamental reason is that even if you are using the newest generation of security products and technologies, at the end of the day, starting with zero-day attacks, in one way or another



          BCS Technology's Blockchain Solution Now Available In The Microsoft Azure Marketplace


          Walmart brings “thousands” of internal applications to Microsoft Azure
Not all Microsoft “partners” are alike. Microsoft recently announced a “partnership” with Walmart, but in reality Walmart is their customer.
          Moovit partners with Microsoft to provide public transit data for Azure Maps
Mobility data startup Moovit announced today that it is integrating its people-driven transportation platform into Microsoft's Azure Maps.
          Engineer, R & D (Software) - Venture International Pte Ltd - Ang Mo Kio
I.e. AWS, Azure, or Google Cloud; web APIs such as the Google Maps API, etc. Design and develop mobile and PC applications (may include full stack) for use as companion...
From Venture Corporation Limited - Fri, 13 Jul 2018 11:23:30 GMT - View all Ang Mo Kio jobs
          Software Developer - Jobline Resources Pte Ltd - Paya Lebar
Familiar with cloud environments – AWS, Azure, Google Cloud – and with the principles of, and execution in, DevOps. The development environment consists of AWS Cloud and...
From Jobline Resources Pte Ltd - Fri, 02 Nov 2018 09:24:59 GMT - View all Paya Lebar jobs
          Software Developer – Cloud (PaaS/Full stack) - Genetec - Montréal, QC
Azure, AWS, GCS. Are you a real team player with a great ability to learn in a fast-paced working environment?...
From Genetec - Fri, 02 Nov 2018 18:05:53 GMT - View all Montréal, QC jobs
          Multicenter Validation of the Vasoactive-Ventilation-Renal Score as a Predictor of Prolonged Mechanical Ventilation After Neonatal Cardiac Surgery*
Objectives: We sought to validate the Vasoactive-Ventilation-Renal score, a novel disease severity index, as a predictor of outcome in a multicenter cohort of neonates who underwent cardiac surgery. Design: Retrospective chart review. Setting: Seven tertiary-care referral centers. Patients: Neonates defined as age less than or equal to 30 days at the time of cardiac surgery. Interventions: Ventilation index, Vasoactive-Inotrope Score, serum lactate, and Vasoactive-Ventilation-Renal score were recorded for three postoperative time points: ICU admission, 6 hours, and 12 hours. Peak values, defined as the highest of the three measurements, were also noted. Vasoactive-Ventilation-Renal was calculated as follows: ventilation index + Vasoactive-Inotrope Score + Δ creatinine (change in creatinine from baseline × 10). Primary outcome was prolonged duration of mechanical ventilation, defined as greater than 96 hours. Receiver operating characteristic curves were generated, and abilities of variables to correctly classify prolonged duration of mechanical ventilation were compared using area under the curve values. Multivariable logistic regression modeling was also performed. Measurements and Main Results: We reviewed 275 neonates. Median age at surgery was 7 days (25th–75th percentile, 5–12 d), 86 (31%) had single ventricle anatomy, and 183 (67%) were classified as Society of Thoracic Surgeons-European Association for Cardio-Thoracic Surgery Congenital Heart Surgery Mortality Category 4 or 5. Prolonged duration of mechanical ventilation occurred in 89 patients (32%). At each postoperative time point, the area under the curve for prolonged duration of mechanical ventilation was significantly greater for the Vasoactive-Ventilation-Renal score as compared to the ventilation index, Vasoactive-Inotrope Score, and serum lactate, with an area under the curve for peak Vasoactive-Ventilation-Renal score of 0.82 (95% CI, 0.77–0.88). On multivariable analysis, peak Vasoactive-Ventilation-Renal score was independently associated with prolonged duration of mechanical ventilation, odds ratio (per 1 unit increase): 1.08 (95% CI, 1.04–1.12). Conclusions: In this multicenter cohort of neonates who underwent cardiac surgery, the Vasoactive-Ventilation-Renal score was a reliable predictor of postoperative outcome and outperformed more traditional measures of disease complexity and severity.
          Hemolysis During Pediatric Extracorporeal Membrane Oxygenation: Associations With Circuitry, Complications, and Mortality
Objectives: To describe factors associated with hemolysis during pediatric extracorporeal membrane oxygenation and the relationships between hemolysis, complications, and mortality. Design: Secondary analysis of data collected prospectively by the Collaborative Pediatric Critical Care Research Network between December 2012 and September 2014. Setting: Three Collaborative Pediatric Critical Care Research Network-affiliated hospitals. Patients: Age less than 19 years and treated with extracorporeal membrane oxygenation. Interventions: None. Measurements and Main Results: Hemolysis was defined based on peak plasma free hemoglobin levels during extracorporeal membrane oxygenation and categorized as none (< 0.001 g/L), mild (0.001 to < 0.5 g/L), moderate (0.5 to < 1.0 g/L), or severe (≥ 1.0 g/L). Of 216 patients, four (1.9%) had no hemolysis, 67 (31.0%) had mild, 51 (23.6%) had moderate, and 94 (43.5%) had severe. On multivariable analysis, variables independently associated with higher daily plasma free hemoglobin concentration included the use of in-line hemofiltration or other continuous renal replacement therapy, higher hemoglobin concentration, higher total bilirubin concentration, lower mean heparin infusion dose, lower body weight, and lower platelet count. Using multivariable Cox modeling, daily plasma free hemoglobin was independently associated with development of renal failure during extracorporeal membrane oxygenation (defined as creatinine > 2 mg/dL [> 176.8 μmol/L] or use of in-line hemofiltration or continuous renal replacement therapy) (hazard ratio, 1.04; 95% CI, 1.02–1.06; p < 0.001), but not mortality (hazard ratio, 1.01; 95% CI, 0.99–1.04; p = 0.389). Conclusions: Hemolysis is common during pediatric extracorporeal membrane oxygenation. Hemolysis may contribute to the development of renal failure, and therapies used to manage renal failure such as in-line hemofiltration and other forms of continuous renal replacement therapy may contribute to hemolysis. Hemolysis was not associated with mortality after controlling for other factors. Monitoring for hemolysis should be a routine part of extracorporeal membrane oxygenation practice, and efforts to reduce hemolysis may improve patient care.
          Pathobiology of Acute Respiratory Distress Syndrome
The unique characteristics of pulmonary circulation and alveolar-epithelial capillary-endothelial barrier allow for maintenance of the air-filled, fluid-free status of the alveoli essential for facilitating gas exchange, maintaining alveolar stability, and defending the lung against inhaled pathogens. The hallmark of pathophysiology in acute respiratory distress syndrome is the loss of the alveolar capillary permeability barrier and the presence of protein-rich edema fluid in the alveoli. This alteration in permeability and accumulation of fluid in the alveoli accompanies damage to the lung epithelium and vascular endothelium along with dysregulated inflammation and inappropriate activity of leukocytes and platelets. In addition, there is uncontrolled activation of coagulation along with suppression of fibrinolysis and loss of surfactant. These pathophysiological changes result in the clinical manifestations of acute respiratory distress syndrome, which include hypoxemia, radiographic opacities, decreased functional residual capacity, increased physiologic deadspace, and decreased lung compliance. Resolution of acute respiratory distress syndrome involves the migration of cells to the site of injury and re-establishment of the epithelium and endothelium with or without the development of fibrosis. Most of the data related to acute respiratory distress syndrome, however, originate from studies in adults or in mature animals with very few studies performed in children or juvenile animals. The lack of studies in children is particularly problematic because the lungs and immune system are still developing during childhood and consequently the pathophysiology of pediatric acute respiratory distress syndrome may differ in significant ways from that seen in acute respiratory distress syndrome in adults. This article describes what is known of the pathophysiologic processes of pediatric acute respiratory distress syndrome as we know it today while also presenting the much greater body of evidence on these processes as elucidated by adult and animal studies. It is also our expressed intent to generate enthusiasm for larger and more in-depth investigations of the mechanisms of disease and repair specific to children in the years to come.
          Implementation of a Risk-Stratified Opioid and Benzodiazepine Weaning Protocol in a Pediatric Cardiac ICU
Objectives: Opioids and benzodiazepines are commonly used to provide analgesia and sedation for critically ill children with cardiac disease. These medications have been associated with adverse effects including delirium, dependence, withdrawal, bowel dysfunction, and potential neurodevelopmental abnormalities. Our objective was to implement a risk-stratified opioid and benzodiazepine weaning protocol to reduce the exposure to opioids and benzodiazepines in pediatric patients with cardiac disease. Design: A prospective pre- and postinterventional study. Patients: Critically ill patients less than or equal to 21 years old with acquired or congenital cardiac disease exposed to greater than or equal to 7 days of scheduled opioids ± scheduled benzodiazepines between January 2013 and February 2015. Setting: A 24-bed pediatric cardiac ICU and 21-bed cardiovascular acute ward of an urban stand-alone children’s hospital. Intervention: We implemented an evidence-based opioid and benzodiazepine weaning protocol using educational and quality improvement methodology. Measurements and Main Results: One-hundred nineteen critically ill children met the inclusion criteria (64 post intervention, 55 pre intervention). Demographics and risk factors did not differ between groups. Patients in the postintervention period had shorter duration of opioids (19.0 vs 30.0 d; p < 0.01) and duration of benzodiazepines (5.3 vs 22.7 d; p < 0.01). Despite the shorter duration of wean, there was a decrease in withdrawal occurrence (% Withdrawal Assessment Tool score ≥ 4, 4.9% vs 14.1%; p < 0.01). There was an 8-day reduction in hospital length of stay (34 vs 42 d; p < 0.01). There was a decrease in clonidine use (14% vs 32%; p = 0.02) and no change in dexmedetomidine exposure (59% vs 75%; p = 0.08) in the postintervention period. Conclusions: We implemented a risk-stratified opioid and benzodiazepine weaning protocol for critically ill cardiac children that resulted in reduction in opioid and benzodiazepine duration and dose exposure, a decrease in symptoms of withdrawal, and a reduction in hospital length of stay.
          The Association Between the Functional Status Scale and the Pediatric Functional Independence Measure in Children Who Survive Traumatic Brain Injury*
Objectives: To determine the association between the Functional Status Scale and Pediatric Functional Independence Measure scores during the rehabilitation stay in children who survive traumatic brain injury. Design: Secondary analysis of a prospective observational cohort study. Setting: Tertiary care children’s hospital with a level 1 trauma center and inpatient rehabilitation service. Patients: Sixty-five children less than 18 years old admitted to an ICU with acute traumatic brain injury and subsequently transferred to the inpatient rehabilitation service. Interventions: Not applicable. Measurements and Main Results: Functional Status Scale and Pediatric Functional Independence Measure at transfer to rehabilitation and Pediatric Functional Independence Measure at discharge from rehabilitation. The median age of the cohort was 7.1 years (interquartile range, 0.8–12.3 yr), and 29% were female. Nearly all of the children were healthy prior to the traumatic brain injury: six patients (9.2%) had a baseline Functional Status Scale score greater than 6. At the time of transfer to inpatient rehabilitation, total Functional Status Scale and Pediatric Functional Independence Measure scores had the expected negative correlation due to increasing disability resulting in lower scores in Pediatric Functional Independence Measure and higher scores in Functional Status Scale (r = –0.49; 95% CI, –0.62 to –0.35). Among subjects with less disability as measured by lower total Functional Status Scale scores, we found substantial variability in the total Pediatric Functional Independence Measure scores. In contrast, Pediatric Functional Independence Measure scores were consistently low among subjects with a wide range of higher total Functional Status Scale scores (more disability). Conclusions: Although proprietary and more time-intensive, the Pediatric Functional Independence Measure has advantages relative to the Functional Status Scale for less severely injured patients and task-specific measurements. The Functional Status Scale may have advantages relative to the Pediatric Functional Independence Measure for more severely injured patients. Further investigations are needed to characterize changes in the Functional Status Scale during the rehabilitation stay and after discharge.
          Abstract PCCLB-23: CLUSTER OF ACUTE FLACCID MYELITIS ASSOCIATED WITH ENTEROVIRUS D68 (EV-D68) IN FIVE CHILDREN IN SOUTH EAST SCOTLAND, SEPTEMBER-OCTOBER 2016
No abstract available
          Microsoft Azure Engineer / Architect - NTT DATA Services - Montpelier, VT
About NTT DATA Services: NTT DATA Services, headquartered in Plano, Texas, is a division of NTT DATA Corporation, a top 10 global business and IT...
From NTT Data - Fri, 28 Sep 2018 20:14:23 GMT - View all Montpelier, VT jobs
          Solution Designer/ Architect (.NET/Azure) - EY - Alpharetta, GA
Experience with UI automation tools and testing (including multi-browser and multi-device). Join our Core Business Services (CBS) team and you will help support...
From EY - Thu, 20 Sep 2018 00:34:04 GMT - View all Alpharetta, GA jobs
          BEAUTIFUL RED WEDDING DRESS, NEW

BEAUTIFUL RED WEDDING DRESS, NEW. WHEN PAYING AT THE COUNTER, PLEASE ADD 1.50 FOR FEES, THANK YOU!! ALSO CHECK OUT MY OTHER...
CHF 3.90


          Nights of Azure 2 on Nintendo Switch (third-party seller)
€20.59 - Darty
Very good price for this game at €20.59, released last year; at Darty online only (also available at Micromania in physical stores for 29.90 instead of the usual 64.90, temporarily, on PS4 and Switch). See photo (Darty on top, Micromania below).

  • Game description

- Nights of Azure 2: Bride of the New Moon takes place in a fictional European city infested with demons, towards the end of the 19th century. You play as a warrior named Alushe and her two childhood friends: Liliana, a kind-hearted priestess, and Ruhenheid, a holy knight of the Order of Lourdes.

While watching over Liliana, Alushe is ambushed and killed, only to awaken as an artificial half-demon at the hands of New Curia, a religious organization with a dark past.

Accompanied by unlikely allies, each with their own story and their own designs, Alushe is driven by the desire to save Liliana. She finds the strength to push back the darkness enveloping the world, and the motivation to lift the veil on the mysterious Queen of the Moon.



Features:

- Two new types of Servan: the Striker transforms into a weapon for a short time to inflict more damage with your attacks, while the Tricker triggers special actions, such as the ability to glide over gaps.
- During her journey, Alushe will meet companions called “Lilies”, whose role is decisive in combat thanks to incredible combined attacks. As their bonds strengthen, new abilities and events appear.

- All-new boss fights will put the abilities of Alushe and her Lilies to the test! Study your enemies' weak points and attack when they are weakened to unleash devastating attacks.
          Do unit and post-deployment social support influence the association between deployment sexual trauma and suicidal ideation? - Monteith LL, Hoffmire CA, Holliday R, Park CL, Mazure CM, Hoff RA.
Deployment sexual trauma is associated with post-deployment suicidal ideation. No studies have examined the role of social support in this association. The present study examined whether perceived unit support and post-deployment support influenced the associat...
          IoT for Smart Cities: New partnerships for Azure Maps and Azure Digital Twins
Over recent years, one of the most dynamic landscapes undergoing digital transformation is the modern city. Amid increasing urbanization, cities must grapple with questions around how to strengthen their local economies, manage environmental resources, mitigate pollution and create safer, more accessible societies.
          Mission critical performance with Ultra SSD for SQL Server on Azure VM
We recently published Storage Configuration Guidelines for SQL Server on Azure VM summarizing the test findings from running TPC-E profile test workloads on premium storage configuration options. We continued this testing by including Ultra SSD.
          Microsoft Azure portal November 2018 update
In October 2018, we started a monthly blog series to help you find everything that is new in the Microsoft Azure portal and the Azure mobile app in one place. We are constantly working to make it easier for you to manage your Azure environment, and we want you to be able to stay up to speed with everything that’s new.
          Run your LOB applications with PostgreSQL powered by the plv8 extension
We are extremely excited to announce that the plv8 extension for PostgreSQL is now enabled in all generally available regions of the Azure Database for PostgreSQL service. The plv8 extension was one of the highly requested UserVoice asks from our growing customer base and the PostgreSQL community.
          Azure SQL Database and Azure Database for MySQL at PASS SUMMIT!
Get inspired by our industry-leading keynote speakers on the Microsoft data platform at PASS Summit 2018! There has never been a more exciting time for data professionals and developers as more organizations turn to data-driven insights to stay ahead and prepare for the future.
          Best practices for alerting on metrics with Azure Database for PostgreSQL monitoring
Whether you are a developer, database administrator, site reliability engineer, or a DevOps professional at your company, monitoring databases is an important part of maintaining the reliability, availability, and performance of your PostgreSQL server.
          Best practices for alerting on metrics with Azure Database for MySQL monitoring
Whether you are a developer, database administrator, site reliability engineer, or a DevOps professional at your company, monitoring databases is an important part of maintaining the reliability, availability, and performance of your MySQL server.
          Azure.Source - Volume 56
Deliver the right events to the right places with Event Domains, Avere vFXT for Azure for HPC workloads now generally available, Design patterns – IoT and aggregation, and more. Catch up on Azure in one post.
          AWS Apps Developer
TX-Dallas. We’re hiring an AWS Apps Dev to build serverless architecture platforms. Azure experience can be a substitute for AWS depending on your projects.

Required Experience/Qualifications:
  • Bachelor’s or Graduate Degree in Computer Science, Engineering, or similar
  • Agile IT environment experience
  • 3+ years of designing AWS cloud-based applications (Azure experience may be considered)
  • Serverless architecture
          This Company Wants to Make the Internet Load Faster

The internet went down on February 28, 2017. Or at least that's how it seemed to some users as sites and apps like Slack and Medium went offline or malfunctioned for four hours. What actually happened is that Amazon's enormously popular S3 cloud storage service experienced an outage, affecting everything that depended on it.

It was a reminder of the risks when too much of the internet relies on a single service. Amazon gives customers the option of storing their data in different "availability regions" around the world, and within those regions it has multiple data centers in case something goes wrong. But last year's outage knocked out S3 in the entire North Virginia region. Customers could of course use other regions, or other clouds, as backups, but that involves extra work, including possibly managing accounts with multiple cloud providers.

A San Francisco-based startup called Netlify wants to make it easier to avoid these sorts of outages by automatically distributing its customers’ content to multiple cloud computing providers. Users don't need accounts with Amazon, Microsoft Azure, Rackspace, or any other cloud company―Netlify maintains relationships with those services. You just sign up for Netlify, and it handles the rest.

You can think of the company's core service as a cross between traditional web hosting providers and content delivery networks, like Akamai, that cache content on servers around the world to speed up websites and apps. Netlify has already attracted some big tech names as customers, often to host websites related to open source projects. For example, Google uses Netlify for the website for its infrastructure management tool Kubernetes, and Facebook uses the service for its programming framework React. But Netlify founders Christian Bach and Mathias Biilmann don't want to just be middlemen for cloud hosting. They want to fundamentally change how web applications are built, and put Netlify at the center.

Traditionally, web applications have run mostly on servers. The applications run their code in the cloud, or in a company's own data center, assemble a web page based on the results, and send the result to your browser. But as browsers have grown more sophisticated, web developers have begun shifting computing workloads to the browser. Today, browser-based apps like Google Docs or Facebook feel like desktop applications. Netlify aims to make it easier to build, publish, and maintain these types of sites.

Back to the Static Future

Markus Seyfferth, the COO of Smashing Media, was converted to Netlify's vision when he saw Biilmann speak at a conference in 2016. Smashing Media, which publishes the web design and development publication Smashing Magazine and organizes the Smashing Conference, was looking to change the way it managed its roughly 3,200-page website.

Since its inception in 2006, Smashing Magazine had been powered by WordPress, the content management system that runs about 32 percent of the web according to technology survey outfit W3Techs, along with e-commerce tools to handle sales of books and conference tickets and a third application for managing its job listing site. Using three different systems was unwieldy, and the company's servers struggled to handle the site’s traffic, so Seyfferth was looking for a new approach.

When you write or edit a blog post in WordPress or similar applications, the software stores your content in a database. When someone visits your site, the server runs WordPress to pull the latest version from the database, along with any comments that have been posted, and assembles it into a page that it sends to the browser.

Building pages on the fly like this ensures that users always see the most recent version of a page, but it's slower than serving prebuilt "static" pages that have been generated in advance. And when lots of people are trying to visit a site at the same time, servers can bog down trying to build pages on the fly for each visitor, which can lead to outages. That leads companies to buy more servers than they typically need; what’s more, servers can still be overloaded at times.

"When we had a new product on the shop, it needed only a couple hundred orders in one hour and the shop would go down," Seyfferth says.

WordPress and similar applications try to make things faster and more efficient by "caching" content to reduce how often the software has to query the database, but it's still not as fast as serving static content.

Static content is also more secure. Using WordPress or similar content managers exposes at least two "attack surfaces" for hackers: the server itself, and the content management software. By removing the content management layer, and simply serving static content, the overall "attack surface" shrinks, meaning hackers have fewer ways to exploit software.

The security and performance advantages of static websites have made them increasingly popular with software developers in recent years, first for personal blogs and now for the websites for popular open source projects.

In a way, these static sites are a throwback to the early days of the web, when practically all content was static. Web developers updated pages manually and uploaded pre-built pages to the web. But the rise of blogs and other interactive websites in the early 2000s popularized server-side applications that made it possible for non-technical users to add or edit content, without special software. The same software also allowed readers to add comments or contribute content directly to a site.

At Smashing Media, Seyfferth didn't initially think static was an option. The company needed interactive features, to accept comments, process credit cards, and allow users to post job listings. So Netlify built several new features into its platform to make a primarily static approach more viable for Smashing Media.

The Glue in the Cloud

Biilmann, a native of Denmark, spotted the trend back to static sites while running a content management startup in San Francisco, and started a predecessor to Netlify called Bit Balloon in 2013. He invited Bach, his childhood best friend who was then working as an executive at a creative services agency in Denmark, to join him in 2015 and Netlify was born.

Initially, Netlify focused on hosting static sites. The company quickly attracted high-profile open source users, but Biilmann and Bach wanted Netlify to be more than just another web-hosting company; they sought to make static sites viable for interactive websites.

Open source programming frameworks have made it easier to build sophisticated applications in the browser. And there's a growing ecosystem of services like Stripe for payments, Auth0 for user authentication, and Amazon Lambda for running small chunks of custom code that make it possible to outsource many interactive features to the cloud. But these types of services can be hard to use with static sites because some sort of server-side application is often needed to act as a middleman between the cloud and the browser.

Biilmann and Bach want Netlify to be that middleman, or as they put it, the "glue" between disparate cloud computing services. For example, they built an e-commerce feature for Smashing Media, now available to all Netlify customers, that integrates with Stripe. It also offers tools for managing code that runs on Lambda.

Smashing Media switched to Netlify about a year ago, and Seyfferth says it's been a success. It's much cheaper and more stable than traditional web application hosting. "Now the site pretty much always stays up no matter how many users," he says. "We'd never want to look back to what we were using before."

There are still some downsides. WordPress makes it easy for non-technical users to add, edit, and manage content. Static site software tends to be less sophisticated and harder to use. Netlify is trying to address that with its own open source static content management interface called Netlify CMS. But it's still rough.

Seyfferth says for many publications, it makes more sense to stick with WordPress for now because Netlify can still be challenging for non-technical users.

And while Netlify is a developer darling today, it's possible that major cloud providers could replicate some of its features. Google already offers a service called Firebase Hosting that offers some similar functionality.

For now, though, Bach and Biilmann say they're just focused on making their serverless vision practical for more companies. The more people who come around to this new approach, the more opportunities there are not just for Netlify, but for the entire new ecosystem.

          [TA Deals] Save big on the MCSA SQL server certification training bundle! (96% off)      Cache   Translate Page      
Getting certified in Microsoft SQL server management is going to be a big plus on your resume, and Talk Android Deals is offering a bundle to help get you started. The bundle includes two comprehensive courses dealing with SQL Server 2016 and Microsoft Azure, totaling around 200 lessons and over 50 hours' worth of content. […]




          Engineer, R & D (Software) - Venture International Pte Ltd - Ang Mo Kio      Cache   Translate Page      
I.e. AWS, Azure or Google Cloud. Web APIs such as google map API etc. Design and develop mobile, PC applications (may include full stack) for use as companion...
From Venture Corporation Limited - Fri, 13 Jul 2018 11:23:30 GMT - View all Ang Mo Kio jobs
          Software Developer - Jobline Resources Pte Ltd - Paya Lebar      Cache   Translate Page      
Familiar with cloud environment – AWS, Azure, Google Cloud, principals of, and execution in, DevOps. The development environment consists of AWS Cloud and...
From Jobline Resources Pte Ltd - Fri, 02 Nov 2018 09:24:59 GMT - View all Paya Lebar jobs
          Running Java on Azure      Cache   Translate Page      
Azure is Microsoft's cloud platform. It is the home of Service Apps, Logic Apps, cloud storage, Kubernetes Service and provides the foundation for VSTS (now Azure DevOps), Office 365 and loads of other services and tools. But not only for .NET-based services and applications. Today's Microsoft provides options for Linux developers, OSX teams, Docker containers, Python code, Node.js and
          Building Statewide Infrastructure for Effective Educational Services for Students With TBI: Promising Practices and Recommendations      Cache   Translate Page      
Objective: To identify promising practices in educational service delivery. Methods: Consensus-building process with a multidisciplinary group of researchers, policy makers, and state Department of Education personnel. Results: This white paper presents the group's consensus on the essential components of a statewide educational infrastructure to support students with traumatic brain injury across the spectrum of injury severity: (a) identification, screening, and assessment practices; (b) systematic communication between medical and educational systems; (c) tracking of child's progress over time; and (d) professional development for school personnel. The white paper also presents key outcomes for measuring success and provides recommendations both for policy change and for furthering research in childhood brain injury.
          Kqlmagic 0.1.75      Cache   Translate Page      
Kqlmagic: Microsoft Azure Monitor magic extension to Jupyter notebook
          Kqlmagic 0.1.74      Cache   Translate Page      
Kqlmagic: Microsoft Azure Monitor magic extension to Jupyter notebook
          Ge 2018 Latest AZ-100 Dumps - Exam Questions PDF      Cache   Translate Page      
Added: Nov 06, 2018
By: taragill
Views: 3
If you are looking for Microsoft Azure Infrastructure and Deployment exam study material then you are in the right place. Braindumpspdf's compiled AZ-100 dumps PDF has been studied and approved by industry professionals and people who have taken and passed these exams. All AZ-100 exam questions are verified and come with accurate answers. For students' convenience the AZ-100 dumps are in PDF format, which is handy and easy to print. Download the AZ-100 braindumps PDF now and save money as well as your time. https://www.braindumpspdf.com/exam/Az-100.html

          Walmart is moving thousands of its business applications to the Azure cloud infrastructure      Cache   Translate Page      

Walmart, the American company that operates the world's largest wholesale and retail network, is moving several thousand of its business applications to the Azure cloud infrastructure. The company is undergoing a digital transformation and has partnered with Microsoft to carry it out. Under the terms of the agreement, Azure has become Walmart's preferred and strategic cloud provider. The American chain […]

The post Walmart is moving thousands of its business applications to the Azure cloud infrastructure appeared first on InfoCity.


          Hardware Engineer II - Microsoft - Redmond, WA      Cache   Translate Page      
Azure storage already runs at Exascale (storing Exabytes of data) and we will be scaling our designs over the next decade to support Zettascale (storing...
From Microsoft - Thu, 25 Oct 2018 05:51:24 GMT - View all Redmond, WA jobs
          Microsoft Azure October 2018 Update      Cache   Translate Page      
Hello. This is the Azure update information for October 2018. For the latest update information, please see the site below.   ■Azure Update https://a ...



          Time To Reboot      Cache   Translate Page      

Originally posted on: http://bobgoedkoop.nl/archive/2018/04/05/time-to-reboot.aspx


It has been nearly six months since I last blogged.  Thankfully this means I have been busy working on client projects.  It is a new year and I am just back from spring break, so I think it is time to start digging into technical topics again.  This post will actually help to do some testing for an Azure Logic App POC that I am working on.  Watch for a future post on this and other Azure topics. Until then …


          Azure Functions Visual Studio 2017 Development      Cache   Translate Page      

Originally posted on: http://bobgoedkoop.nl/archive/2017/08/10/azure-functions-visual-studio-2017-development.aspx


The development tools and processes for Azure Functions are ever changing.  We started out only being able to create a function through the portal, which I did a series on.  We then got a template in VS2015, but it really didn’t work very well.  They have since been able to create functions as Web Application libraries and now we are close to the release of a VS2017 template.

This post will walk through the basics of using the VS2017 Preview with the Visual Studio Tools For Azure Functions, which you can download here.

Create New Project

To create the initial solution open up the New Project dialog and find the Azure Function project type, name your project and click OK.


Create New Function

To add a function to your project, right-click the project and select New Item.  In the New Item dialog select Azure Function and provide a name for the class and click Add. 


The next dialog which will appear is the New Azure Function dialog.  Here you will select the function trigger type and its parameters.  In the example below a timer trigger has been selected, and a Cron schedule is automatically defined to execute every 5 minutes.

Also in this dialog you can set the name of the function.  When you compile, a folder will be created with that name in your bin directory which will be used later for deployment.

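As a rough sketch (the class name and schedule are my own assumptions), the code generated for a timer triggered function looks something like this:

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class TimerPoc
{
    // The attribute value becomes the folder name created in the bin directory
    [FunctionName("TimerPoc")]
    public static void Run([TimerTrigger("0 */5 * * * *")]TimerInfo myTimer, TraceWriter log) // Cron: every 5 minutes
    {
        log.Info($"Timer trigger function executed at: {DateTime.Now}");
    }
}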

Add Bindings

With each generation of Azure Function development the way you initially define bindings changes (even if they stay the same behind the scenes).  Initially you had to use the portal Integrate page.  This had its advantages.  It would visually prompt you for the type of binding and the parameters for that binding.

With the Visual Studio template you have to add attributes to the Run method of your function class.  This requires that you know what the attribute names are and what parameters are available and their proper values.  You can find a list of the main binding attributes here.

At compile time the attributes will be used to generate a function.json file with your trigger and bindings definition.
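To illustrate, here is a hedged sketch of what those attributes look like on a Run method (the queue, container and parameter names are assumptions, not part of the template):

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class QueueToBlobPoc
{
    [FunctionName("QueueToBlobPoc")]
    public static void Run(
        [QueueTrigger("incoming-items")] string queueItem,                           // trigger binding
        [Blob("processed/{rand-guid}.txt", FileAccess.Write)] out string outputBlob, // output binding
        TraceWriter log)
    {
        log.Info($"Copying queue message to blob storage: {queueItem}");
        outputBlob = queueItem;
    }
}

Both attributes are compiled into the trigger and bindings sections of the generated function.json.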

Add NuGet Packages

If you are building functions in the portal you have to create a project.json file that defines the packages you want to include.  This requires that you know the format of the file.  Thankfully with the Visual Studio template you can use the normal NuGet Package Manager.

Deploying

There are a couple of ways to deploy your solution.  In the end a Function App is a specialized App Service.  This means you have the same deployment options of Visual Studio, PowerShell or VSTS continuous deployment.  The main difference is that you don’t have a web.config file and have to manage your app settings and connection strings through the portal.  These can be reached by following the Application Settings link under the Configured Features section of the Function App Overview page.


Summary

While creating Azure Functions still isn’t a WYSIWYG turn-key process, the latest incarnation gives us an ALM-capable solution.  I believe this is the development approach that will stabilize for the foreseeable future and anyone who is creating Functions should invest in learning it.


          Query Application Insights REST API To Create Custom Notifications      Cache   Translate Page      

Originally posted on: http://bobgoedkoop.nl/archive/2017/08/04/query-application-insights-rest-api-to-create-custom-notifications.aspx


Application Insights is one of those tools that has been around for a number of years now, but is finally getting understood as more companies move to Azure as a cloud solution.  It has become an amazing tool for monitoring the performance of your application, but it can also work as a general logging platform as I have posted before.

Now that you are capturing all this information how can you leverage it?  Going to the Azure portal whenever you want an answer is time-consuming.  It would be great if you could automate this process.  Of course there are a number of metrics that you can create alerts for directly via the portal, but what if you want a non-standard metric or want to do something besides just send an alert?

Fortunately Microsoft has a REST API in beta for Application Insights.  It allows you to check standard metrics as well as run custom queries as you do in the Analytics portal.  Let’s explore how to use this API.

In this post I will show how to create a demo that implements an Azure Function which calls the Application Insights REST API and then sends the results out using SendGrid.  I created it with the VS2017 Preview and the new Azure Functions templates.

Generate Custom Events

First we need some data to work with.  The simplest way is to leverage the TrackEvent and TrackException methods of the Application Insights API.  In order to do this you first need to set up a TelemetryClient.  The code below I keep as part of the class-level variables.

        // Pull the instrumentation key from app settings and wire it into the client
        private static string appInsightsKey = System.Environment.GetEnvironmentVariable("AppInsightKey", EnvironmentVariableTarget.Process);
        private static TelemetryClient telemetry = new TelemetryClient();
        private static string key = TelemetryConfiguration.Active.InstrumentationKey = appInsightsKey;

After that it is simple to call the TrackEvent method on the TelemetryClient object to log an activity in your code (be aware it may take 5 minutes for an event to show up in Application Insights).

            telemetry.TrackEvent($"This is a POC event");

Create a VS2017 Function Application

I will have another post on the details in the future, but if you have Visual Studio 2017 Preview 15.3.0 installed you will be able to create an Azure Functions project.


Right-click the project, select the New Item context menu option, and then select Azure Function.


On the New Azure Function dialog select TimerTrigger and leave the remaining options as default.


Call Application Insights REST API

Once there are events in the customEvents collection we can write a query and execute it against the Application Insights REST API.  To accomplish this the example uses a simple HttpClient call.  The API page for Application Insights can be found here and contains the URLs and formats for each call type.  We will be using the Query API scenario, which will be set up with a couple of variables.

        private const string URL = "https://api.applicationinsights.io/beta/apps/{0}/query?query={1}";
        private const string query = "customEvents | where timestamp >= ago(20m) and name contains \"This is a POC event\" | count";

The call to the service is a common pattern using the HttpClient as shown below.  Add this to the Run method of your new function.

            HttpClient client = new HttpClient();
            client.DefaultRequestHeaders.Accept.Add(
                new MediaTypeWithQualityHeaderValue("application/json"));
            // x-api-key holds the API key generated for the Application Insights app
            client.DefaultRequestHeaders.Add("x-api-key", appInsightsApiKey);
            // Build the request URL from the app ID and the query defined above
            var req = string.Format(URL, appInsightsId, query);
            HttpResponseMessage response = client.GetAsync(req).Result;

Process Results

After we have a result we can deserialize the JSON using JSON.NET and send it to our support team via SendGrid.  You will have to add the NuGet package Microsoft.Azure.WebJobs.Extensions.SendGrid.

Modify the signature of your function’s Run method to match the code sample shown here.  In this example “message” is defined as an output variable for the Azure Function and is bound to SendGrid by using the SendGrid attribute. 

        public static void Run([TimerTrigger("0 */15 * * * *")]TimerInfo myTimer, TraceWriter log, [SendGrid(ApiKey = "SendGridApiKey")]out Mail message)

We will also need a structure to deserialize the returned JSON message into. If you look at the message itself it can appear rather daunting, but it breaks down into the following class structure.  Create a new class file and replace the default class with this code.

    public class Column
    {
        public string ColumnName { get; set; }
        public string DataType { get; set; }
        public string ColumnType { get; set; }
    }

    public class Table
    {
        public string TableName { get; set; }
        public List<Column> Columns { get; set; }
        public List<List<object>> Rows { get; set; }
    }

    public class RootObject
    {
        public List<Table> Tables { get; set; }
    }

The last code example below performs the deserialization and creates the SendGrid email message.  Insert this into the Run method after the HttpClient call we previously added.

                // Read the raw JSON response and log it for troubleshooting
                string result = response.Content.ReadAsStringAsync().Result;
                log.Info(result);

                // Deserialize into the Table/Column/Row structure defined above
                RootObject aiResult = JsonConvert.DeserializeObject<RootObject>(result);

                // A count query returns a single table with a single row and column
                string countString = aiResult.Tables[0].Rows[0][0].ToString();

                string recipientEmail = System.Environment.GetEnvironmentVariable("recipient", EnvironmentVariableTarget.Process);
                string senderEmail = System.Environment.GetEnvironmentVariable("sender", EnvironmentVariableTarget.Process);

                var messageContent = new Content("text/html", $"There were {countString} POC records found");

                message = new Mail(new Email(senderEmail), "App Insights POC", new Email(recipientEmail), messageContent);

Publish your solution to an Azure Function App by downloading the Function App’s profile and using the VS2017 project’s publish options.  You will also need to define the application settings referred to in the code so that they are appropriate for your environment.  At that point you will be able to observe the results of your efforts.

Summary

This post demonstrates how a small amount of code can give you the ability to leverage Application Insights for more than just out-of-the-box statistics alerts.  This approach is flexible enough to be used for reporting on types of errors and monitoring whether subsystems remain available.  Combining the features within Azure’s cloud offerings gives you capabilities that would cost much more in development time and resources if they were done on premises. 

My only real problem with this approach is that I would prefer to access values in the result by name rather than by index, because indexes make the code less readable and more brittle to changes.

Try these examples out and see what other scenarios they apply to in your business.


          Logging To Application Insights In Azure Functions      Cache   Translate Page      

Originally posted on: http://bobgoedkoop.nl/archive/2017/02/16/logging-to-application-insights-in-azure-functions.aspx

In my last post I covered logging in Azure Functions using TraceWriter and log4net.  Both of these work, but Application Insights rolls all your monitoring into one solution, from metrics to tracking messages.  I have also heard a rumor that in the near future this will be an integrated part of Azure Functions.  Given these factors it seems wise to give it a closer look.

So how do you take advantage of it right now?  If you go to GitHub there is a sample written by Christopher Anderson, but let me boil this down.  First we need to create an Application Insights instance and grab the instrumentation key.

When I created my Application Insights instance I chose the General application type and the same resource group as my function app.


Once the instance has been allocated you will need to go into the properties blade.  There you will find a GUID for the Instrumentation Key.  Save this off so that we can use it later.

You then need to add the Microsoft.ApplicationInsights NuGet package by creating a project.json file in your function.  Insert the following code in the new file and save it.  If you have your log window open you will see the package being loaded.

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.ApplicationInsights": "2.1.0"
      }
    }
  }
}

In the sample code readme it says that you need to add a specific app setting, but as long as your code reads from the appropriate setting that is the most important part.  Take the Instrumentation Key that you saved earlier and place it in the app settings.  In my case I used one called InsightKey.

Next set up your TelemetryClient object like the code here by creating global static variables that can be used throughout your application.  After that we are ready to start tracking our function. 

 private static TelemetryClient telemetry = new TelemetryClient();   
 private static string key = TelemetryConfiguration.Active.InstrumentationKey = System.Environment.GetEnvironmentVariable("InsightKey", EnvironmentVariableTarget.Process);  

To track an event or an exception simply call the appropriate method.  I prefer to encapsulate them in their own methods where I can standardize the usage.  I have added the function name, method name, and context ID from the function execution to make it easier to search and associate entries.

 private static void TrackEvent(string desc, string methodName)
 {
   telemetry.TrackEvent($"{FunctionName} - {methodName} - {contextId}: {desc}");
 }

 private static void TrackException(Exception ex, string desc, string methodName)
 {
   var properties = new Dictionary<string, string>
   {
     { "Function", FunctionName },
     { "Method", methodName },
     { "Description", desc },
     { "ContextId", contextId }
   };
   telemetry.TrackException(ex, properties);
 }

Analytics

This isn’t an instant-answer type of event store.  At the very least there is a few-minute delay between your application logging an event or exception and when it becomes visible in the Analytics board.

Once you are logging and sending metrics to Application Insights you need to read the results.  From your Application Insights main blade click on the Analytics button at the top of the overview.  It will open a new Analytics page.


Click the new tab button at the top next to the Home Page tab.  This will open a query window. The query language has a similar structure to SQL, but that is about as far as it goes.

The table objects are listed in the left navigation with the fields listed as you expand out each table.  Fortunately IntelliSense works pretty well in this tool.  You have what would normally be considered aggregate functions that make life easier.  For example, you can use the contains syntax, which acts similar to a SQL LIKE comparison.  There are also date range functions such as ago, shown in the sketch below.  I found that these two features can find most things you are looking for.

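The screenshots are not reproduced here, but as a sketch (the table and event names are assumed), a query combining both features looks like this:

customEvents
| where timestamp >= ago(1d)
| where name contains "POC"
| count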

Summary

This post didn’t cover a lot of the native functionality in Application Insights, but hopefully it gives you a starting point to instrument your Azure Functions.  The flexibility of this tool, along with the probability of it being built into Functions in the future, makes it a very attractive option.  Spend some time experimenting with it and I think you will find it pays dividends.


          Implementing Logging In Azure Functions      Cache   Translate Page      

Originally posted on: http://bobgoedkoop.nl/archive/2017/02/13/implementing-logging-in-azure-functions.aspx


Logging is essential to the support of any piece of code.  In this post I will cover two approaches to logging in Azure Functions: TraceWriter and log4net.

TraceWriter

The TraceWriter that is available out of the box with Azure Functions is a good starting point.  Unfortunately it is short-lived: at most 1000 messages are kept, and they are held in file form for no more than two days.  That being said, I would not skip using the TraceWriter.

Your function will have a TraceWriter object passed to it in the parameters of the Run method.  You can use the Verbose, Info, Warning and Error methods to write different types of messages to the log, as shown below.

log.Info($"Queue item received: {myQueueItem}");

Once it is in the log you need to be able to find the messages.  The easiest way to find the log files is through Kudu.  You have to drill down from LogFiles –> Application –> Functions –> Function –> <your_function_name>.  At this location you will find a series of .log files if your function has been triggered recently.


The other way to look at your logs is through Table Storage via the Microsoft Azure Storage Explorer.  After attaching to your account open the storage account associated with your Function App.  Depending on how you organized your resource groups you can find the storage account by looking at the list of resources in the group that the function belongs to.

Once you drill down to that account look for the tables named AzureWebJobHostLogsyyyymm.


Opening these tables will allow you to see the different types of log entries saved by the TraceWriter.  If you filter to the partition key “I” you will see the entries your functions posted.  You can further filter by name and date range to identify specific log entries.


log4net

If the default TraceWriter isn’t robust enough you can implement logging via a framework like log4net.  Unfortunately because of the architecture of Azure Functions this isn’t as easy as it would be with a normal desktop or web application.  The main stumbling block is the lack of ability to create custom configuration sections which these libraries rely on.  In this section I’ll outline a process for getting log4net to work inside your function.

The first thing that we need is the log4net library.  Add the log4net NuGet package by placing the following code in the project.json file.

{
  "frameworks": {
    "net46":{
      "dependencies": {
        "log4net": "2.0.5"
      }
    }
   }
}

To get around the lack of custom configuration sections we will bind a blob file containing your log4net configuration.  Simply take the log4net section of your configuration and save it to a text file.  Upload that to a storage container and bind it to your function using the full storage path.


Add the references to the log4net library and configure the logger.  Once you have that simply call the appropriate method on the logger and off you go.  A basic sample of the code for configuring and using the logger is listed below.  In this case I am actually using a SQL Server appender.

using System;
using System.Xml;
using log4net;

public static void Run(string input, TraceWriter log, string inputBlob)
{
    log.Info($"Log4NetPoc manually triggered function called with input: {input}");
    log.Info($"{inputBlob}");

    // Parse the log4net configuration that was bound in from blob storage
    XmlDocument doc = new XmlDocument();
    doc.LoadXml(inputBlob);
    XmlElement element = doc.DocumentElement;

    // Configure log4net from the XML element instead of a config section
    log4net.Config.XmlConfigurator.Configure(element);

    ILog logger = LogManager.GetLogger("AzureLogger");

    logger.Debug("Test log message from Azure Function", new Exception("This is a dummy exception"));
}

Summary

By no means does this post cover every aspect of these two logging approaches or all possible logging approaches for Azure Functions.  In future posts I will also cover Application Insights.  In any case it is always important to have logging for your application.  Find the tool that works for your team and implement it.


          Building Azure Functions: Part 3 – Coding Concerns      Cache   Translate Page      

Originally posted on: http://bobgoedkoop.nl/archive/2017/02/02/building-azure-functions-part-3-ndash-coding-concerns.aspx


In this third part of my series on Azure Function development I will cover a number of development concepts and concerns.  These are just some of the basics.  You can look for more posts coming in the future that will cover specific topics in more detail.

General Development

One of the first things you will have to get used to is developing in a very stateless manner.  Any other .NET application type has a class at its base.  Functions, on the other hand, are just what they say: a method that runs within its own context.  Because of this you don’t have anything resembling a global or class-level variable.  This means that if you need something like a logger in every method you have to pass it in.

[Update 2017-02-13] The above information is not completely correct.  You can implement function-global variables by defining them as private static.

You may find that it makes sense to create classes within your function, either as DTOs or to make the code more manageable.  Start by adding a .csx file in the files view pane of your function.  The same coding techniques and standards apply as in your Run.csx file; otherwise develop the class as you would any other .NET class, as sketched below.

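As a minimal sketch (the file and member names are my own), a DTO in its own .csx file can be pulled into Run.csx with the #load directive, and the private static trick from the update above provides function-global state:

// Order.csx - a DTO kept in its own file within the function folder
public class Order
{
    public string Id { get; set; }
    public decimal Total { get; set; }
}

// Run.csx
#load "Order.csx"
#r "Newtonsoft.Json"

using System;
using Newtonsoft.Json;

// private static fields survive across executions on the same host
private static int executionCount = 0;

public static void Run(string myQueueItem, TraceWriter log)
{
    executionCount++;
    Order order = JsonConvert.DeserializeObject<Order>(myQueueItem);
    log.Info($"Execution #{executionCount}: order {order.Id} for {order.Total}");
}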

In the previous post I showed how to create App Settings.  If you took the time to create them you are going to want to be able to retrieve them.  The GetEnvironmentVariable method of the Environment class gives you the same capability as using AppSettings from ConfigurationManager in traditional .NET applications.

System.Environment.GetEnvironmentVariable("YourSettingKey")

A critical coding practice for functions that use perishable resources such as queues is to make sure that if you catch and log an exception, you rethrow it so that your function fails.  This will cause the queue message to remain on the queue instead of being dequeued.
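
A sketch of that pattern for a queue triggered function (ProcessItem is a hypothetical worker method):

using System;

public static void Run(string myQueueItem, TraceWriter log)
{
    try
    {
        ProcessItem(myQueueItem); // hypothetical processing logic
    }
    catch (Exception ex)
    {
        log.Error($"Failed processing: {myQueueItem}", ex);
        throw; // rethrow so the message is not treated as successfully dequeued
    }
}

private static void ProcessItem(string item)
{
    // real work would happen here
}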

Debugging


It can be hard to read the log when the function is running full speed, since instances run in parallel but report to the same log.  I would suggest adding the process ID to your TraceWriter logging messages so that you can correlate them.
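
One way to do that (a sketch; the message format is just a suggestion):

using System.Diagnostics;

public static void Run(string myQueueItem, TraceWriter log)
{
    // Prefix entries with the process ID so parallel executions can be correlated
    log.Info($"[{Process.GetCurrentProcess().Id}] Queue item received: {myQueueItem}");
}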

Even more powerful is the ability to remote debug functions from Visual Studio.  To do this open your Server Explorer and connect to your Azure subscription.  From there you can drill down to the Function App in App Services and then to the run.csx file in the individual function.  Once you have opened the code file and placed your breakpoints, right-click the function and select Attach Debugger.  From there it acts like any other Visual Studio debugging session.


Race Conditions

I wanted to place special attention on this subject.  As with any highly parallel/asynchronous processing environment, you will have to make sure that you take into account any race conditions that may occur.  If at all possible keep the functionality that you create to non-related pieces of data.  If it is critical that items in a queue, blob container or table storage are processed in order, then Azure Functions are probably not the right tool for your solution.

Summary

Azure Functions are one of the most powerful units of code available.  Hopefully this series gives you a starting point for your adventure into serverless applications and you can discover how they can benefit your business.


          Building Azure Functions: Part 2–Settings And References      Cache   Translate Page      

Originally posted on: http://bobgoedkoop.nl/archive/2017/02/01/building-azure-functions-part-2ndashsettings-and-references.aspx

Image result for azure functions logo

This is the second post in a series on building Azure Functions.  In this post I’ll continue by describing how to add settings to your function and reference different assemblies to give you more capabilities.

Settings


Functions do not have configuration files, so you must add app settings and connection strings through the settings page.  The settings are maintained at the Function App level, not per individual function.  While this allows you to share common configuration values, it means that if your custom assemblies need different configuration values per function, each function will have to live in a separate Function App.

To get to them go to the Function App Settings link at the lower left of your Function App’s main page and then click the Configure App Settings button, which will bring you to the settings blade.  At that point it works the same as any .NET configuration file.


At some point I would like to see the capability of importing and exporting settings, since maintaining them individually by hand leads to human error and less reliable application lifecycle management.

Another drawback to the Azure Functions development environment is that at the time of this post you don’t have the ability to leverage custom configuration sections.  The main place I have found this to cause heartburn is using logging libraries such as log4net, where the most common scenario is to use a custom configuration section to define appenders and loggers.

Referencing Assemblies And Nuget

No .NET application is very useful if you can’t reference all of the .NET Framework as well as third-party and your own custom assemblies.  There is no add-references menu for Azure Functions, and there are multiple ways to add references.  Let’s take a look at each.

There are a number of .NET assemblies that are automatically referenced for your Function application.  There is a second group of assemblies that are available but need to be specifically referenced.  For a partial list consult the Azure Functions documentation here.  You can also load your own custom assemblies or bring in NuGet packages. 

In order to load Nuget packages you need to create a project.json file.  Do this by clicking the View Files link in the upper right corner of the editor blade and then the Add link below the file list pane. 

project.json files require the same information that is contained in a packages.config file, but formatted as JSON, as shown in the example below.  Once you save this file and reference the assembly in your Run.csx file, Azure will load the designated packages.

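That screenshot is lost, but a minimal project.json sketch in the expected format (the package and version are assumptions) looks like this:

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Newtonsoft.Json": "10.0.3"
      }
    }
  }
}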

If you have custom libraries that you want to leverage you will need to add a bin folder to your function.  The easiest way I have found to do this is to open the App Service Editor from the Function App Settings page.  This will open up what is essentially Visual Studio Code in a browser.  Navigate the file tree to your function under wwwroot.  Right-click your function name and select New Folder.  The folder must be named “bin”.  You can then right-click the bin folder and upload your custom assemblies.

Once you have an assembly available you need to reference it using the “#r” directive as shown below.  You will notice that native assemblies and NuGet-loaded libraries do not need the dll extension specified, but it must be added for custom assemblies.

#r "System.Xml"
#r "System.Xml.Linq"
#r "System.Data.Entity"
#r "My.Custom.Data.dll"
#r "My.Custom.Domain.dll"
#r "Newtonsoft.Json"
#r "Microsoft.Azure.Documents.Client"
#r "Microsoft.WindowsAzure.Storage"

Now we are ready to declare our normal using statements and get down to the real business of functions.

Summary

After this post we have our trigger, bindings, settings and dependent assemblies.  This still isn’t enough for a useful function.  In the next post I will cover coding and debugging concerns to complete the story.


          Building Azure Functions: Part 1–Creating and Binding      Cache   Translate Page      

Originally posted on: http://bobgoedkoop.nl/archive/2017/01/31/building-azure-functions-part-1ndashcreating-and-binding.aspx


The latest buzzword is serverless applications.  Azure Functions are Microsoft’s offering in this space.  As with most products that are new on the cloud, Azure Functions are still evolving and therefore can be challenging to develop.  Documentation is still being worked on at the time I am writing this, so here are some things that I have learned while implementing them.

There is a lot to cover here so I am going to break this topic into a few posts:

  1. Creating and Binding
  2. Settings and References
  3. Coding Concerns

Creating A New Function

The first thing you are going to need to do is create a Function App.  This is an App Services product that serves as a container for your individual functions.  The easiest way I’ve found to start is to go to the main add (+) button on the Azure Portal and then do a search for Function App.


Click on Function App and then the Create button when the Function App blade comes up.  Fill in your app name, remembering that this is a container and not your actual function.  As with other Azure features you need to supply a subscription, resource group and location.  Additionally for a Function App you need to supply a hosting plan and storage account.  If you want to take full benefit of Function App scaling and pricing leave the default Consumption Plan.  This way you only pay for what you use.  If you choose App Service Plan you will pay for it whether your function is actually processing or not.


Once you click Create the Function App will start to deploy.  At this point you will start to create your first function in the Function App.  Once you find your Function App in the list of App Services it will open its main blade.  It offers a quick start page, but I quickly found that didn’t give me the options I needed beyond a simple “Hello World” function.  Instead press the New Function link at the left.  You will be offered a list of trigger-based templates, which I will cover in the next section.


Triggers


Triggers define the event source that will cause your function to be executed.  While there are many different triggers and there are more being added every day, the most common ones are included under the core scenarios.  In my experience the most useful are timer, queue, and blob triggered functions.

Queues and blobs require a connection to a storage account be defined.  Fortunately this is created with a couple of clicks and can be shared between triggers and bindings as well as between functions.  Once you have that you simply enter the name of the queue or blob container and you are off to the races.

When it comes to timer-dependent functions, the main topic you will have to become familiar with is cron scheduling definitions.  If you come from a Unix background or have been working with more recent timer-based WebJobs this won’t be anything new.  Otherwise the simplest way to remember is that each time increment is defined by a division statement, as sketched below.

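The expression has six fields - {second} {minute} {hour} {day} {month} {day-of-week} - each of which can take a division statement.  A couple of hedged examples of schedule values (these exact schedules are assumptions):

0 */5 * * * *     -> every five minutes
0 30 9 * * 1-5    -> 9:30 AM, Monday through Friday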

In the case of queue triggers the parameter that is automatically added to the Run method signature will be the contents of the queue message as a string.  Similarly most trigger types have a parameter that passes values from the triggering event.

Input and Output Bindings


Some of the function templates include an output binding.  If none of these fit your needs or you just prefer to have full control you can add a binding via the Integration tab.  The input and output binding definitions end up in the same function.json file as the trigger bindings. 

The one gripe I have with these bindings is that they connect to a specific entity at the beginning of your function.  I would find it preferable to bind to the parent container of whatever source you are binding to and have a set of standard commands available for normal CRUD operations.

Let’s say that you want to load an external configuration file from blob storage when your function starts.  The binding path specifies the container and the blob name.  The default format shows a variable “name” as the blob name.  This needs to be a variable that is available and populated when the function starts or an exception will be thrown.  As for your storage account, specify it by clicking the “new” link next to the dropdown and picking the storage account from those that you have available.  If you specified a storage account while defining your trigger and it is the same as your binding it can be reused.


The convenient thing about blob bindings is that they are bound as strings and so for most scenarios you don’t have to do anything else to leverage them in your function.  You will have to add a string parameter to the function’s Run method that matches the name in the blob parameter name text box.
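
A minimal sketch (the parameter names are assumptions): if the binding’s blob parameter name is inputBlob, the Run method simply declares a matching string parameter alongside the trigger’s:

public static void Run(string myQueueItem, string inputBlob, TraceWriter log)
{
    // inputBlob arrives pre-populated with the full text of the bound blob
    log.Info($"Loaded external configuration: {inputBlob}");
}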

Summary

That should give you a starting point for getting the shell of your Azure Function created.  In the next two posts I will add settings, assembly references and some tips for coding your function.


          Cloud Battles: Azure vs AWS–The Video      Cache   Translate Page      

Originally posted on: http://bobgoedkoop.nl/archive/2016/06/29/cloud-battles-azure-vs-awsndashthe-video.aspx

Earlier this month Norm Murrin and I gave a talk at the Chicago Coder Conference.  We learned a lot about how the offerings of each company compare during our preparation.  In the end we came to the conclusion that there is no clear winner except those of us who are leveraging the resources.  Check out this video posted by the conference to get the blow-by-blow details.


          Application Integration: Azure Functions Vs WebJobs      Cache   Translate Page      

Originally posted on: http://bobgoedkoop.nl/archive/2016/06/02/application-integration-azure-functions-vs-webjobs.aspx


[Updated]

UI development gets all the attention, but application integration is where the real work is done.  When it comes to application integration in the Azure ecosystem you had better learn how Functions and WebJobs are developed and under what conditions you should use each.  In this post I will try to answer those questions.

For me it is important that a solution is reasonably maintainable, deployable through environments and can be easily managed under source control.

Both products are built on the same code base and share the same base API.  From that perspective they are closely matched.  Functions do have the advantage of handling web hooks as opposed to simply timer and storage events with WebJobs.

There is another difference that I haven’t been able to prove yet, but I’ve seen mentioned in a couple of places.  It seems like Functions may take time to warm up since they aren’t always instantiated.  Since WebJobs are always running they would not incur this startup cost.  If immediate processing is important then WebJobs may be the more appropriate option for you.

When it comes to actual development I prefer to have the resources of Visual Studio to write and manage source code as well as package my deliverables for deployment.  As of this writing I have not been able to find a Visual Studio project type.  This means you edit the code through a web browser.  This in-portal editor does allow you to integrate with Git or VSTS for source control.  I would expect at some point in the future we will get a Functions project type.

Both WebJobs and Functions can be written using C#/VB.NET and Node.js.  From the language availability perspective they are even.

Summary

So what is the real separating line between using one or the other?  From what I have experienced so far, if you need the web hooks then Functions are the right choice.  If you don’t need the web hooks and maintainability is your priority then WebJobs are the way to go.  I’m sure there are more reasons, but these are the most obvious in the early days of Functions.  As the products evolve I’ll post updates.

[Update]

Christopher Anderson (@crandycodes) from the Azure team replied via Twitter with the following:

You hit on some key points like lack of tooling/VS integration. We plan on addressing those before GA.
I think the major point missing is the dynamic scale functionality, pay per use. Functions scale automatically and don't cost a VM.
Also, if you run Functions in dedicated with always on, there is no cold start issues, but you pay per VM at that point.
WebJobs vs Functions is really: "Do I want to manage my own custom service?" Yes: WebJobs, No: Functions. Otherwise, similar power.


          Hadoop Admin - Syntel Inc - Madison, WI      Cache   Translate Page      
Azure blobs storage, Azure ML studio, Azcopy, OMS and Databricks Networking- VMs, Vnets, Subnets, Azure SQL Datawarehouse. We are looking for *Hadoop**Admin*....
From Indeed - Wed, 10 Oct 2018 20:42:08 GMT - View all Madison, WI jobs
          Principal Software Engineering Manager - Microsoft - Redmond, WA      Cache   Translate Page      
Experience in Azure based analytics, storage, and reporting – Azure data lake, Azure Datawarehouse, HD Insight, Azure Data Factory....
From Microsoft - Sat, 27 Oct 2018 03:43:14 GMT - View all Redmond, WA jobs
          Principal Software Development Engineer - Microsoft - Redmond, WA      Cache   Translate Page      
The Azure SQL Datawarehouse Service team is looking for a highly qualified, experienced individual to help us shape and implement our next generation data...
From Microsoft - Sat, 13 Oct 2018 03:12:19 GMT - View all Redmond, WA jobs
          Sr. Software Engineer - Microsoft - Redmond, WA      Cache   Translate Page      
Experience in Azure based analytics, storage, and reporting – Azure data lake, Azure Datawarehouse, HD Insight, Azure Data Factory....
From Microsoft - Tue, 09 Oct 2018 15:27:08 GMT - View all Redmond, WA jobs
          Senior SWE Manager - Microsoft - Issaquah, WA      Cache   Translate Page      
Experience in Azure based analytics, storage, and reporting – Azure data lake, Azure Datawarehouse, HD Insight, Azure Data Factory....
From Microsoft - Fri, 19 Oct 2018 05:03:40 GMT - View all Issaquah, WA jobs
          acs-engine (0.25.0)      Cache   Translate Page      
Azure Container Service Engine - a place for community to collaborate and build the best open Docker container infrastructure for Azure.

          Disease Mechanisms-Parkinson's Disease: Activity in Genomic Dark Matter Offers Clues to Nonmotor Symptoms in Parkinson's and Neuropsychiatric Disease      Cache   Translate Page      
No abstract available
          IoT for Smart Cities: New partnerships for Azure Maps and Azure Digital Twins      Cache   Translate Page      
Over recent years, one of the most dynamic landscapes undergoing digital transformation is the modern city. Amid increasing urbanization, cities must grapple with questions around how to strengthen their local economies, manage environmental resources, mitigate pollution and create safer, more accessible societies.
          Mission critical performance with Ultra SSD for SQL Server on Azure VM      Cache   Translate Page      
We recently published Storage Configuration Guidelines for SQL Server on Azure VM summarizing the test findings from running TPC-E profile test workloads on premium storage configuration options. We continued this testing by including Ultra SSD.
          Microsoft Azure portal November 2018 update      Cache   Translate Page      
In October 2018, we started a monthly blog series to help you find everything that is new in the Microsoft Azure portal and the Azure mobile app in one place. We are constantly working to make it easier for you to manage your Azure environment, and we want you to be able to stay up to speed with everything that’s new.
          Run your LOB applications with PostgreSQL powered by the plv8 extension      Cache   Translate Page      
We are extremely excited to announce that the plv8 extension for PostgreSQL is now enabled in all generally available regions of the Azure Database for PostgreSQL service. The plv8 extension was one of the highly requested UserVoice asks from our growing customer base and the PostgreSQL community.
          Azure SQL Database and Azure Database for MySQL at PASS SUMMIT!      Cache   Translate Page      
Get inspired by our industry-leading keynote speakers on the Microsoft data platform at PASS Summit 2018! There has never been a more exciting time for data professionals and developers as more organizations turn to data-driven insights to stay ahead and prepare for the future.
          Best practices for alerting on metrics with Azure Database for PostgreSQL monitoring      Cache   Translate Page      
Whether you are a developer, database administrator, site reliability engineer, or a DevOps professional at your company, monitoring databases is an important part of maintaining the reliability, availability, and performance of your PostgreSQL server.
          Best practices for alerting on metrics with Azure Database for MySQL monitoring      Cache   Translate Page      
Whether you are a developer, database administrator, site reliability engineer, or a DevOps professional at your company, monitoring databases is an important part of maintaining the reliability, availability, and performance of your MySQL server.
          Azure.Source - Volume 56      Cache   Translate Page      
Deliver the right events to the right places with Event Domains, Avere vFXT for Azure for HPC workloads now generally available, Design patterns – IoT and aggregation, and more. Catch up on Azure in one post.
          Moovit to provide public transit data for Microsoft Azure Maps      Cache   Translate Page      

The post Moovit to provide public transit data for Microsoft Azure Maps appeared first on Stories.


          Sr Principal Infra Engg - Mphasis - Bengaluru, Karnataka      Cache   Translate Page      
Primary Comp & : Cloud Computing - Azure Architect Skill Percentage - 100 Skill Version - 1 Proficiency - Advanced(>5 & <=9yrs) EDUCATION : ITO NICHE SKILLS -...
From Mphasis - Tue, 06 Nov 2018 12:28:56 GMT - View all Bengaluru, Karnataka jobs
           Comment on Use Pester to Test Azure Resource Manager Templates for Best Practices by Daniel Scott-Raynsford (MVP)       Cache   Translate Page      
I've run into this as well. Some Azure Resource Providers don't appear to be supported (yet) by this module so they get skipped. You can probably check over here for more info: https://github.com/azsk/DevOpsKit
           Comment on Use Pester to Test Azure Resource Manager Templates for Best Practices by bilal       Cache   Translate Page      
I tried the Azure DevOps AzSK extension, but somehow it can't scan every template. Not all resources are being scanned through this extension.
          BCS Technology's Blockchain Solution Now Available in the Microsoft Azure Marketplace      Cache   Translate Page      
...to start. This workshop gives executives the opportunity to get strapped with the know-how to improve and transform their business model as a team. Change is driven from top leaders, thus understanding the basis of the technology is essential to ...

          LXer: We love Kubernetes, but it's playing catch-up with our Service Fabric, says Microsoft Azure exec      Cache   Translate Page      
Published at LXer: Jason Zander on cloud native, Red Hat, and figuring out open source. Interview: A curious feature of Microsoft's cloud platform is that it has two fundamentally different platforms...
          RRM1 gene expression evaluated in the liquid biopsy (blood cfRNA) as a non-invasive, predictive factor for radiotherapy-induced oral mucositis and potential prognostic biomarker in head and neck cancer patients.      Cache   Translate Page      

RRM1 gene expression evaluated in the liquid biopsy (blood cfRNA) as a non-invasive, predictive factor for radiotherapy-induced oral mucositis and potential prognostic biomarker in head and neck cancer patients.

Cancer Biomark. 2018;22(4):657-667

Authors: Mlak R, Powrózek T, Brzozowska A, Homa-Mlak I, Mazurek M, Małecka-Massalska T

Abstract
BACKGROUND: Intensified treatment of head and neck cancers (HNC) by radiotherapy (RTH), commonly combined with cytotoxic drugs, is associated with oral mucositis (OM). Changes in the functioning of the nucleotide synthesis pathway (RNR1, coded by the RRM1 gene) can modulate the efficiency of cellular DNA repair mechanisms and influence the risk of occurrence and severity of OM in HNC patients after RTH.
OBJECTIVE: The objective of this study was to evaluate the correlation between expression of RRM1 gene measured in free circulating RNA (cfRNA) and the risk of more severe OM and disease-free survival (DFS) and overall survival (OS) in patients undergoing RTH for HNC.
METHODS: The study included 60 patients treated with RTH for HNC. RRM1 gene expression was examined in circulating RNA isolated from peripheral blood plasma (before treatment).
RESULTS: High RRM1 gene expression was significantly associated with higher risk of grade 3 OM after 5 (OR = 4.97), 6 (OR = 4.33) and 7 (OR = 3.50) weeks of RTH. Expression of RRM1 gene was not significantly related with risk of DFS and OS shortening (however well separated Kaplan-Meier curves might suggest its potential prognostic impact).
CONCLUSIONS: The evaluation of RRM1 gene expression in cfRNA allows for estimation of the risk of severe OM in patients subjected to RTH.

PMID: 29865035 [PubMed - indexed for MEDLINE]


          South Africa Gross $Gold & Forex Reserve fell from previous $50.394B to $50.166B in October      Cache   Translate Page      

          South Africa Net $Gold & Forex Reserve dipped from previous $42.227B to $42.194B in October      Cache   Translate Page      

          GBP/JPY Technical Analysis: Minor pullback likely      Cache   Translate Page      
  • The GBP/JPY is currently trading at 148.65, having clocked a high of 148.90 earlier today.
  • The hourly chart is showing a bearish divergence of the relative strength index (RSI) and stochastic. Further, the MACD turned bearish a few minutes before press time. The 5-period and 10-period SMA on the hourly are beginning to roll over in favor of the bears.
  • As a result, the JPY cross could revisit 148.00 (psychological hurdle) in the next few hours.
  • A strong bounce from the ascending 50-hour exponential moving average (EMA) could reinforce a bullish view put forward by the rising 5-day and 10-day EMAs.

Hourly Chart

Trend: Intraday bearish

GBP/JPY

Overview:
    Last Price: 148.68
    Daily change: 7.0 pips
    Daily change: 0.0471%
    Daily Open: 148.61
Trends:
    Daily SMA20: 146.3
    Daily SMA50: 146.49
    Daily SMA100: 145.72
    Daily SMA200: 147.48
Levels:
    Daily High: 148.65
    Daily Low: 147.3
    Weekly High: 147.26
    Weekly Low: 143.22
    Monthly High: 149.52
    Monthly Low: 142.78
    Daily Fibonacci 38.2%: 148.13
    Daily Fibonacci 61.8%: 147.82
    Daily Pivot Point S1: 147.73
    Daily Pivot Point S2: 146.85
    Daily Pivot Point S3: 146.39
    Daily Pivot Point R1: 149.07
    Daily Pivot Point R2: 149.53
    Daily Pivot Point R3: 150.41

 


          Japan Coincident Index: 114.6 (September) vs previous 116.7      Cache   Translate Page      

          New Zealand RBNZ Inflation Expectations (QoQ) remains unchanged at 2% in 4Q      Cache   Translate Page      

          Japan Leading Economic Index declined to 103.9 in September from previous 104.5      Cache   Translate Page      

          Breaking News: EUR/USD challenges highs as Democrats clinch the House      Cache   Translate Page      

Democrats are emerging as winners in the House as a dramatic night in US politics unfolds. More and more networks are calling the race for Democrats. This includes Fox, NBC, CNN, and FiveThirtyEight. 

Democrats were favorites to win the lower chamber of the US Congress, but for an extended period, as the votes were counted, Republicans seemed to have a significant chance.

The US Dollar is retreating across the board, and the EUR/USD is advancing towards the highs once again. The high so far is 1.1473. 

The GBP/USD is also moving higher, trading around 1.3140. The USD/JPY is on the back foot around 113.25. The AUD/USD is around 0.7260, the USD/CAD at 1.3110. 

Here is how the drama unfolded on the EUR/USD 15-minute chart. Click to see a live graph:

EUR USD Mid Terms

 

Live Coverage: Follow all the updates for the elections and currency movements 

Fox News was the first to call the House for Democrats. NBC followed. Other networks are moving more slowly. Earlier, networks called the Senate for Republicans after clear victories for the majority party's incumbents and challengers in the upper house of Congress.


          Global Stamping Friends - Circle Themed!      Cache   Translate Page      
Happy Friday everyone!  I am so very excited to leave for Stampin' Up Onstage in Orlando this weekend but first it's time for another round of the Global Stamping Friends hop!
Our hop theme is all about circles this month!  Circles can be added to any project in a multitude of ways but for my project, I knew I needed to use our wonderful circle punches!  We have a great variety of punch sizes to choose from but I focused on the 1 1/2" and 1 3/4" sized punches and the new Happiness Surrounds stamp set from the Snowflake Showcase!
I started my project off with a Blackberry Bliss card stock base and layered using Gray Granite card stock.  Next I used the Subtle embossing folder and embossed a piece of Petal Pink paper.
For the circles, I used the 1 3/4" punch with Blackberry Bliss card stock and the 1 1/2" punch with the Frosted Floral Specialty Designer Series Paper, then layered the circles together. Using the Happiness Surrounds stamp set, I stamped in Blackberry Bliss ink on Very Vanilla card stock and finished with an Artisan Pearl.

Make sure to continue on to the next blog in our hop by checking out Lisa Ann Bernard's blog for some more circle themed projects by clicking HERE or the next button below!


GSF Blog Hop Participants November 2018


My Project Supply List!

Contact your local Demonstrator to place your order!  Don't have a Demonstrator?  Click any photo below to connect to my online store and shop 24/7!


          Azure Sales and Marketing - Ingram Micro Cloud - Bellevue, WA      Cache   Translate Page      
Ingram Micro Inc. If so, join the Ingram Micro Cloud team - where rainmakers thrive. Act as a marketing resource and mentor for Ingram Micro Cloud sales teams...
From Ingram Micro Cloud - Fri, 28 Sep 2018 07:14:09 GMT - View all Bellevue, WA jobs
          Application of Kuhnt–Szymanowski Procedure to Lower Eyelid Margin Defect after Tumor Resection      Cache   Translate Page      
Background: Lower eyelid reconstruction after tumor removal is always challenging, and full-thickness defects beyond half of the eyelid length require a flap from a part other than the remaining lower eyelid, such as the temporal area or the cheek. Objective: We aimed to report our experience of applying Smith-modified Kuhnt–Szymanowski, one of the most popular procedures for paralytic ectropion, for reconstructing oblong full-thickness lower eyelid margin defect. Materials and Methods: We performed Smith-modified Kuhnt–Szymanowski on 5 cases of oblong full-thickness lower eyelid margin defect after skin cancer removal. The mean age of patients was 80.0 years. The horizontal widths of the defects ranged from half to two-thirds of the lower eyelid length and the vertical width ranged from 5 to 9 mm. Results: We obtained good functional and esthetic results in all cases. No patients developed ectropion or lower eyelid distortion, and all patients were satisfied with their results. Conclusions: We utilized the procedure for morphological revision as a reconstructive procedure for eyelid margin defect by considering the defect as a morphological deformity of the eyelid margin; thus, donor tissue was not required to fill the defect and we could accomplish the reconstruction simply, firmly, and less invasively.
          28 Great Lessons You Can Learn From Watercolor Painting Ideas | watercolor painting ideas      Cache   Translate Page      


LAKEPORT — So many children and nary a peep out of any of them! Saturday's Children's Art Day at the historic Courthouse Museum Park in downtown Lakeport was awash with kids concentrating on watercolor painting, button-making, rock painting, beans and rice mosaics, blow art, making cereal necklaces and making paper plate chickens and paper bag owls.


The event was organized by Charlie Adams, 17, for his All-Star Program in the Blue Heron 4-H Club. Adams received funding from the 4-H Council for materials, the Lake County Arts Council [LCAC] provided some art supplies, and tables and chairs were supplied by the Scott's Valley Women's Club.

"I was part of a group trying to promote 4-H," said Adams, "and we noticed that most people associate 4-H with agriculture and animals at fairs. I created this event with the goal of showing 4-H in a different light. I chose to promote art."

Robin Adams, 15, supervised the mosaic table along with her mother Sue. Children bent over the designs they had glued onto a sheet of construction paper. Using colored rice and beans they made all sorts of patterns and, when finished, they put their mosaics in the sun to dry.


One minute the rock table was empty and the next it was full of kids mixing paint colors and dabbing the colors onto their rocks. Barbara Clark, the executive director of LCAC, said, "We ran out of rocks early and so I had to go on a rock run." When asked where she got the rocks, she said, "I went down to a creek bed and got smooth rocks, perfect for painting, and now we're good to go!"

Clark went on to say, "I was really glad when Charlie reached out to us [LCAC] to partner with him in this event. It's been great watching all the kids do the different crafts and watching our staff from the Main Street Gallery help teach techniques and procedures. I think we need to do more things like this because art is a great way for kids to learn, and if they continue on with art in their life it can really help them express emotions constructively."

Carmon Brittian, an LCAC artist, wore a wild, crazy, bright pair of earrings that she had bought at a yard sale for 50 cents. She fit right in to the Courthouse Museum art scene. Watching over the rock painting she taught the youngsters how to mix colors and paint the rocks. Little 4-year-old Keira Blanton painted the top of her rock blue and then flitted off like a butterfly to another table.


At the button-making table, Kelsey Robinson asked each button maker to flex their muscles to make sure they could press the button punch. One girl only wanted to show off her muscles and pull the press down. She didn't want her button so she gave it to a woman in the crowd, who loved it.

Nicholas, 3-and-a-half years old, was one of the youngest children at the event. His mother, Melanie Roberts, helped him draw his initial "N" onto the Jack-O-Lantern paper. Under the shade of the park's trees, the weather was comfortable with a slight breeze. Nicholas, finished with the pumpkin button, ran around, got a drink of water on his own and discovered the park's firefighter statue. "Look, a fireman!" he shouted to his mom, who smiled. "It's nice to have kid-friendly events in the outdoors," she said.

The blow art table was manned by Charlie Adams. One girl sat down and drew a couple of shapes. "I'm done," she declared and then ran off to the watercolor table. No amount of coaxing could bring her back to decorate her square and triangle shapes.


At one of the most popular tables, children were busy making Fruit Loop necklaces. Painstakingly they threaded yarn with the colorful cereal. One boy said he was making a snake. The two Luhsinger sisters, Clara and Aliana, just looked at him and went back to their necklace making. Kaylee Robinson, 6, so focused on her creation, didn't even look up. Ophelia Harry-Rose, 5, walked around eating her necklace, as did most of the kids. When asked what it tasted like, both Ophelia and her mother Shanda Harry chimed, "Sugar!"

Children at the watercoloring table were given instructions by artist/art instructor Diana Liebe and Richard Schmidt, artist and Lake County Poet Laureate. Liebe conducts beginning watercolor workshops at the Main Street Gallery on Wednesdays at 1 p.m. Talking about the children's art amused her. She laughed when she said, "I get such a kick out of kids. Their imaginations are amazing and they come up with the most interesting ideas and colors and they're so free. It's so much fun to work with kids. I love kids."

Marisol Carrando, 12, painted brilliant-colored leaves she had traced from dried leaves provided by Liebe. After placing her artwork in the sun to dry, she went to several other art tables and then wound back to the watercolor table to do a second painting. Jules Showalter, 12, and her brother Seth, 9, painted the leaves they also had traced. "I like how the colors blend," said Jules. Liebe showed her how to do the blending, which produced a unique, colorful work of art.


When asked what he had hoped to accomplish by organizing Children's Art Day as his All-Star 4-H project, Charlie said, "I wanted people to see that 4-H can be about whatever skill or passion they have and want to share. They can even create a public event for kids! Something they can remember, and maybe they might join 4-H, too, and continue the cycle of 'making the best better' for our community, our country and our world, which is our 4-H motto and part of our pledge."



          Moovit will bring public transport data to Microsoft Azure Maps      Cache   Translate Page      
          The $34 billion Red Hat acquisition is IBM's last lifeline in cloud computing      Cache   Translate Page      

IBM's $34 billion acquisition of open-source software vendor Red Hat is the largest acquisition in the 107-year history of this century-old company, and the largest software deal ever. Beyond its staggering price tag, this gamble of an acquisition will not surprise the industry: Red Hat could calmly pick the partner that suited it best, and Red Hat may well be IBM's last lifeline in cloud computing.

According to tips obtained by foreign media, IBM pursued the Red Hat acquisition with considerable sincerity.

In fact, it was not only IBM: other traditional IT companies that failed to keep pace with the cloud era, such as Oracle and Cisco, as well as newer tech companies fighting an uphill battle in the cloud (Google, for example), had all considered acquiring Red Hat. Over the past few years this batch of "logical" Silicon Valley suitors had made contact with Red Hat and held preliminary talks, yet Red Hat ultimately, and surprisingly, chose IBM. Compared with the other contenders IBM's offer was not the highest, but Red Hat CEO Jim Whitehurst told the media that no other company's attitude at the negotiating table could compare with IBM's.

Reportedly, what won Red Hat over is that IBM is a safe buyer: IBM promised that Red Hat will continue to operate independently within IBM, free from interference by its new parent, and that the deal will not noticeably change the current competitive landscape. Spending $34 billion on cloud computing will not vault IBM past Amazon into a commanding position among cloud giants, nor will it turn IBM into a fresh future force in the field the way Salesforce or Twilio are. IBM has its own logic for acquiring Red Hat.

Most directly, the acquisition may improve IBM's chronically depressed financials. IBM's revenue has been sliding for six years; although three consecutive quarters of decent growth gave the outside world hope of a revival, the Q3 2018 results IBM published a few days ago showed revenue falling again, to $18.8 billion for the quarter, below Wall Street's expectation of $19.1 billion. Add the continued rout in US stocks, and IBM's share price has fared miserably.

In fact, Red Hat's revenue is modest next to IBM's, but it grows at a striking pace and generates cash flow along the way. More importantly, Red Hat gives the century-old giant a "new story" to tell, a way to prove to Wall Street its determination to transform and thereby win back investor confidence.

In truth, Red Hat itself has been under some pressure over the past year. Last quarter its sales missed analyst expectations and its net profit fell noticeably, and it forecast another decline this quarter, leading outsiders to question its growth prospects and its ability to win contracts. Although the company insists revenue will "bottom out and rebound" and that full-year revenue is certain to exceed $3 billion, its stock has fallen 28% over the past six months. Red Hat needed to make some adjustments.

Before a worse outcome arrives, a merger may be the better choice. After the merger IBM will give Red Hat a far larger customer base. Jim Whitehurst said that "joining with IBM will provide Red Hat a greater level of scale, resources and capabilities, and will bring a broader customer base to Red Hat." Wall Street analysts expect the combined IBM to post slight growth this year and, before the economic contraction that may arrive in 2019, to grow at least 15% across this fiscal year and the next. IBM expects the combined company's revenue to grow 2%, with earnings per share (excluding certain items) returning to growth after two years.

Hybrid cloud competition may be an inflection point

Once the acquisition closes, Red Hat will be folded into IBM's hybrid cloud division. In pure public cloud, Amazon, Microsoft and Google already hold an overwhelming share of the market and IBM has no opening there; hybrid cloud is IBM's hope for cutting back into the main arena of cloud computing.

Red Hat's traditional business is built around the open-source Linux operating system. IBM does have a public cloud product that competes with Amazon's and Microsoft's, but developers use Red Hat's Linux far more broadly across the public clouds, including Microsoft Azure and Google Cloud. Even though more and more companies choose the main rivals of IBM's public cloud business, this multi-cloud posture should help IBM bring in revenue.

IBM chairman, president and CEO Ginni Rometty said buying Red Hat will help the company lead the hybrid cloud market and is "a game-changer for the cloud market": "IBM will become the world's number one hybrid cloud provider, offering enterprises the only open cloud solution."

As everyone knows, hybrid cloud is currently the most popular route to the cloud. Some large enterprises and organizations prefer it to make sure their core data never leaks out, even though cloud vendors have repeatedly and solemnly stressed that enterprise data is absolutely safe in the cloud. And more and more organizations will try a multi-cloud approach, combining cloud services from different providers such as Amazon and Microsoft Azure.

IDC predicts hybrid cloud will come to account for 67% of the entire cloud market. Gartner predicts that by 2020, 90% of organizations will use hybrid cloud to manage their infrastructure. And according to MarketsandMarkets research, the hybrid cloud market is expected to grow to $91.74 billion by 2021.

The industry expects the strong momentum of hybrid cloud solutions to reinforce enterprise adoption of multi-cloud, with enterprises adopting public cloud to supplement their existing on-premises data centers and private cloud infrastructure.

Rometty says that today most companies are only about 20% of the way into the cloud, using it to cut costs. The next 80% will be the next chapter of cloud computing: companies will use the cloud to unlock real business value and drive growth, and they will need to move their operations to hybrid cloud and use data and computing to optimize every part of the business.

In addition, Rometty said, buying Red Hat means bringing more than 8 million software developers into the fold, which may draw them closer to IBM's other products. There is also an opportunity for IBM to sell more consulting services: according to a Nomura report, helping to deploy Red Hat products (such as JBoss middleware and the OpenShift software for deploying applications in containers) could all fall under IBM's consulting and managed services business.

A $34.1 billion dark cloud?

This deal could help IBM, a seller of traditional data center hardware, improve its position among the companies still running applications on their own equipment.

Battery Ventures partner Dharmesh Thakker told the media that, setting public cloud spending aside, "the remaining 90% of data center spend still goes to HP, IBM, Dell and Cisco." "By making a key asset part of its hybrid story, IBM hopes to paint itself as the best of that group."

With this latest mega-deal IBM is taking on substantial risk. Many large tech acquisitions have ended in failure, most notably Hewlett-Packard's $25 billion acquisition of Compaq, Microsoft's $7.2 billion acquisition of Nokia's devices and services business, and Google's $12.5 billion acquisition of Motorola Mobility.

As Cantor Fitzgerald analysts wrote in a note on Monday, "whether this deal succeeds will depend on execution in cross-selling/integration."

Although there is no generally accepted report on hybrid cloud market share, the industry broadly believes that, judging from the current state of the public cloud market, Amazon and Microsoft hold the dominant positions. And while hybrid cloud is today's hot sector, a crowd of competitors has already rushed in. IBM enjoys almost no first-mover advantage here; it can only rely on integrating the two companies' strengths in technology and on the silver tongues of its salespeople to win more customers.

We also cannot help wondering whether IBM's hybrid cloud strategy will end up repeating the failure of the Watson project. IBM began developing Watson in 2006, and before long Watson had become IBM's core direction; but as time went on, what accumulated around Watson was not technical progress but negative press of every kind.

Some in the industry have said bluntly that IBM's Watson is really a "consulting business." In running it, IBM signed expensive contracts with enterprise customers to apply Watson technology to specific business scenarios. Unfortunately, IBM found it very hard to make the technology genuinely meet customer needs, which further drained confidence in Watson. The most emblematic moment was when "stock god" Warren Buffett said he had lost faith in IBM's business and told the world he had sold roughly a third of his IBM shares, followed by slides in IBM's revenue and share price.

Integrating with Red Hat undoubtedly makes IBM more competitive in hybrid cloud, but IBM will need real success stories to prove to users that it can deliver value worthy of those huge contracts, and so win back the market's confidence. Nadella said on Microsoft's Q2 2018 analyst call that when Microsoft cloud users are satisfied, choosing Microsoft's hybrid cloud products as well seems the most natural thing in the world. But how many users will choose IBM's hybrid cloud business just because Red Hat's operating system works well? The distance between those two things looks rather large.

If this $34 billion integration fails, what will become of this hundred-year-old company?

*Author: Chaos


          Microsoft Azure Machine Learning and Project Brainwave – Intel Chip Chat – Episode 610      Cache   Translate Page      
In this Intel Chip Chat audio podcast with Allyson Klein: In this interview from Microsoft Ignite, Dr. Ted Way, Senior Program Manager for Microsoft, stops by to talk about Microsoft Azure Machine Learning, an end-to-end, enterprise grade data science platform. Microsoft takes a holistic approach to machine learning and artificial intelligence, by developing and deploying [...]
          Cloud Infrastructure and Data Center Integration Consultant - Nexio - Montréal, QC      Cache   Translate Page      
AD, ADFS, AD Connect, Azure AD, AWS AD, AWS IAM, AWS Simple AD, IDaaS; Nexio is looking for a Cloud Infrastructure Integration Consultant...
From Nexio - Tue, 06 Nov 2018 19:28:28 GMT - View all Montréal, QC jobs
          Cloud Infrastructure Integrator - CGI Group, Inc. - Montréal, QC      Cache   Translate Page      
AD, ADFS, AD Connect, Azure AD, AWS AD, AWS IAM, AWS Simple AD, IDaaS; We are looking for a cloud infrastructure integration consultant...
From CGI - Tue, 06 Nov 2018 00:15:41 GMT - View all Montréal, QC jobs
          Cloud Infrastructure and On-Site Data Center Integration Consultant - ACCEO Solutions - Montréal, QC      Cache   Translate Page      
Install, configure, and integrate IAM (identity and access management) components and services: AD, ADFS, AD Connect, Azure AD, AWS AD, AWS IAM, AWS...
From ACCEO Solutions - Fri, 02 Nov 2018 18:00:05 GMT - View all Montréal, QC jobs
          Cloud Infrastructure and On-Site Data Center Integrator - ACCEO Solutions - Montréal, QC      Cache   Translate Page      
AD, ADFS, AD Connect, Azure AD, AWS AD, AWS IAM, AWS Simple AD, IDaaS; ACCEO Solutions inc. is looking for a cloud infrastructure integration consultant...
From Indeed - Fri, 02 Nov 2018 15:14:17 GMT - View all Montréal, QC jobs
          Hunter Azure Has Big Opportunity in LFA and in New Weight Class      Cache   Translate Page      

Following a win in his pro debut in August of 2017, up and comer Hunter Azure made the decision to relocate from Salt Lake City, Utah, to Phoenix, Arizona, and since then has had a successful 2018.

Since joining The MMA Lab, Azure has kept his undefeated streak going at 145 pounds, and now intends to transition that success to a new weight to close out 2018.

“I’ve had two pro fights with The Lab now, going into my third, and it’s been a great transition,” Azure told MMAWeekly.com. “(All three of my pro fights have) been at 145 pounds. This one I’m going down to 135 pounds for the first time, so I’m excited about that.

“At 145 pounds I was feeling a little small out there, and was facing some big dudes. The Lab has got me healthy now; my weight is down; and I feel the strongest I’ve been now.”

According to Azure, the level of competition he sees on a daily basis at The Lab has forced him to evolve quickly to survive, and it’s made him much better in the process.

“The room is full of killers every day,” said Azure. “I have to show up with my A-game on a daily basis.

“I’m watching all these great UFC and Bellator fighters, and pick their brain about how they train, and I’ve been adding that to how I train. Seeing how they train and how they prepare for fights has helped me. I just feel like I’m so green in the sport right now, so I’m just soaking it all in and loving it.”

On Friday in Phoenix, Arizona, Azure (3-0) will look to have a successful debut in a new promotion and weight class when he faces AJ Robb (1-1) in a preliminary 135-pound bout at LFA 53.

TRENDING > Anthony Johnson’s Manager Says He Will Return to UFC: ‘100 Percent’

“It’s going to be my first fight down at 135 pounds, so I have to be really disciplined at this time,” Azure said. “This is a big opportunity for me on the LFA card.

“The biggest thing for me is to just be me. My cardio is always there. I like to grind. I’ve always been a grinder with cardio. Switching over to the Lab my overall game has gotten better. If I just go out and be me I’ll have a finish in this next fight.”

With an opportunity to gain national exposure in the LFA, Azure is looking to use his opportunity to help get him to the next level one way or another in 2019.

“I take things fight by fight but I’d like to break into a big show in 2019 by the end of the year,” said Azure. “I’d like to maybe get into (Dana White’s) Contender Series by the summer or get that call to the UFC. After this fight I’ll be 4-0, and get a few more fights and get some more cage time under me.”


          Data & BI Manager Bath      Cache   Translate Page      
Opus Recruitment Solutions - Bath - Data & BI Manager | Bath | £55,000 - £70,000 Business Intelligence | BI | Data Warehouse | SQL Server | SSIS | SSRS | SSAS | Azure Want...
          Analyzing expense receipts with Azure Cognitive Services and Microsoft Flow      Cache   Translate Page      
Recently, Business Applications MVP Steve Endow and I delivered a session at the User Group Summits in Phoenix, in particular at the GPUG Summit, titled "Microsoft Dynamics GP and Azure Services". In this course we detailed a number of Azure services (among the hundreds available) that could potentially be used with Microsoft Dynamics GP.

Since I have also been working my way through a Microsoft-sanctioned PowerApps, Flow, and CDS self-paced online training class offered by edX (click here for more info on the course) and presented by Business Applications MVP Shane Young, I asked myself: what could I do with Microsoft Flow and Azure services that would benefit Microsoft Dynamics GP users?

Playing with some Azure services, I came across Azure Cognitive Services, which offers the capability of reading images and analyzing them for text via its Computer Vision service. As it turns out, this service offers an optical character recognition (OCR) feature capable of returning the parsed text either as plain text or as a JSON formatted document. The idea here would be to use Microsoft Flow to pick up a newly created receipt in a OneDrive folder, transfer the file to Cognitive Services' Computer Vision for analysis, and then get back the parsed text from the OCR service.

Let's see how it's done!

Provision the Computer Vision service

The first thing is to ensure Computer Vision has been enabled as a service on your Azure tenant. For this visit the Azure Portal, click on Create a Resource, then select Computer Vision from the AI + Machine Learning category within the Azure Marketplace. 

Computer Vision

Fill in some basic information like the resource name, location, pricing tier (there's an F0 free tier!), and a resource group. Click the Create button when done. This will provision your Computer Vision resource.
 
Copy the service endpoint address and access keys

Once the service is provisioned, click on All Resources, select the Computer Vision resource you just created, then click on Overview.

Grab the service endpoint address and the access keys. You can obtain the access keys by clicking on Show access keys... (two access keys are provided).

Computer Vision service endpoint info

This is, by far, one of the easiest services to provision and requires no extra configuration, beyond establishing some access control to limit who can use the service, but that's not a topic for this discussion.
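Incidentally, everything Flow will do for us in the next steps boils down to an HTTP POST against this endpoint with one of those keys. If you ever want to test the service outside of Flow, a minimal PHP sketch like the one below will do it. Worth flagging as assumptions: the vision/v2.0/ocr path and its query parameters reflect the API version at the time of writing, and the endpoint URL, access key, and receipt.jpg file name are placeholders you would replace with your own values.

<?php
// Minimal sketch: call the Computer Vision OCR endpoint directly with cURL.
// The endpoint, key and file name below are placeholders (assumptions), and
// the vision/v2.0/ocr path may differ depending on your API version/region.
$endpoint = 'https://westus.api.cognitive.microsoft.com/'; // from the Overview blade
$key      = 'YOUR_ACCESS_KEY_1';                           // one of the two access keys
$image    = file_get_contents('receipt.jpg');              // the receipt to analyze

$ch = curl_init($endpoint . 'vision/v2.0/ocr?language=unk&detectOrientation=true');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $image,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => [
        'Content-Type: application/octet-stream',
        'Ocp-Apim-Subscription-Key: ' . $key,
    ],
]);
$response = curl_exec($ch);
curl_close($ch);

// The service answers with a JSON document of regions, lines and words.
var_dump(json_decode($response, true));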

Setup a new Microsoft Flow flow

Over in Microsoft Flow, I started by setting up a blank flow and selected OneDrive's "When a file is created" trigger, as this sets up the simple trigger point for when an expense receipt file is added to the Receipts folder I had previously created. You will then be prompted to set up the connection information for OneDrive.

Blank flow with "When a file is created" trigger

NOTE: I selected my personal OneDrive for this, but this can also be done with a folder on your OneDrive for Business environment. In this case, you will want to authenticate with your Office 365 credentials.
Receipts folder


Submit file to Computer Vision service

As it also turns out, Microsoft Flow has a connector to the Azure Computer Vision API, which exposes two actions: OCR to JSON and OCR to Text. Add a New Step and type Computer Vision in the search bar. Select Computer Vision API, then choose OCR to Text action.

Computer Vision API connector - OCR to Text action

Once again, you will be prompted for the connection information to your Computer Vision service on Azure. Enter the user account, the access key and service endpoint as gathered in step two, above.

Computer Vision API - credentials entry
Once credentials are entered, you can decide what to submit to Computer Vision. In this case, we want to send the File Content, which we can select from the Dynamic content fields.

File Content from Dynamic content fields

Configure Email step with Results

Upon completion, we want to send the resulting OCR from the analyzed image via email, so we will add another step to the flow. This time, we will add a connector to Office 365 Outlook and choose the Send an Email action for our next step.
Office 365 Outlook connector - Send Email action

We can then set up the properties for the Send an Email step. I have chosen to send the email to myself, and to compose a subject line using the file name from the OneDrive directory. As the body, I am including the Detected Text, which can be selected from the OCR to Text category under Dynamic content. I've included both the original file and content as part of the attachments.



Finally, I have given this flow a name and saved it.

Testing the Flow

I have dropped two receipt files within my OneDrive Receipts folder. These two receipts present varying degrees of quality and recognizable text. I was particularly interested in the second receipt (right), as this one was very folded and cracked, so I was curious to see how it would be analyzed.

Receipts
For the second receipt, the OCR service returned the JSON payload and a status 200, indicating it was successful in processing and delivering a response.

JSON payload for second receipt

The actual email I received looked like this and contained the following text:

Receipt analysis

Now, remember that my goal isn't to judge the accuracy of the OCR result delivered by Computer Vision, but rather to show how easy it is to build these kinds of solutions with Microsoft Flow and existing Azure services. Something like this would take an inordinate amount of time to build using traditional development tools and services.

Conceivably, I could create a simple PowerApps application that uses the Camera control to take a picture of the receipt and save it to my OneDrive folder. At that point, the receipt would be picked up by the Flow and analyzed by Computer Vision as we have established here. Why would this be important? Perhaps you want to parse the JSON payload and submit it to Microsoft Dynamics GP or Dynamics 365 as an accounts payable voucher; in that case, something like the sketch below could be a starting point.
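Here is a minimal sketch of that parsing step, assuming $response holds the OCR JSON payload shown earlier. The ocrToLines() helper is hypothetical; the regions/lines/words shape matches the OCR response, but the mapping of lines to voucher fields is left out because it depends entirely on your Dynamics GP setup.

<?php
// Hypothetical helper (assumption): flatten the OCR JSON payload into plain
// text lines before mapping fields such as vendor, date and total.
function ocrToLines(array $ocr): array
{
    $lines = [];
    foreach ($ocr['regions'] ?? [] as $region) {
        foreach ($region['lines'] ?? [] as $line) {
            // Each line is a list of word fragments; join them back together.
            $lines[] = implode(' ', array_column($line['words'] ?? [], 'text'));
        }
    }
    return $lines;
}

$lines = ocrToLines(json_decode($response, true));
print_r($lines); // scan these lines for "Total", dates, amounts, and so on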

Until next post,

MG.-
Mariano Gomez, MVP


          Azure Sales and Marketing - Ingram Micro Cloud - Bellevue, WA      Cache   Translate Page      
If so, join the Ingram Micro Cloud team - where rainmakers thrive. Are you an innovative, self-starting Marketing guru who loves helping IT providers design and...
From Ingram Micro Cloud - Fri, 28 Sep 2018 07:14:09 GMT - View all Bellevue, WA jobs
          Cloud Infrastructure and On-Site Data Center Integration Consultant - ACCEO Solutions - Montréal, QC      Cache   Translate Page      
Good knowledge of integrating IT service management and monitoring tools such as Easy Vista, AWS CloudWatch, AWS Config, Azure Portal, and...
From ACCEO Solutions - Fri, 02 Nov 2018 18:00:05 GMT - View all Montréal, QC jobs
          Cloud Infrastructure and On-Site Data Center Integrator - ACCEO Solutions - Montréal, QC      Cache   Translate Page      
Good knowledge of integrating IT service management and monitoring tools such as Easy Vista, AWS CloudWatch, AWS Config, Azure Portal, and...
From Indeed - Fri, 02 Nov 2018 15:14:17 GMT - View all Montréal, QC jobs
          Web Developer- Regina - Accion Labs - Regina, SK      Cache   Translate Page      
Experience with Azure App Services (Web, API, Function). Role- Web Developer....
From Indeed - Tue, 25 Sep 2018 01:55:26 GMT - View all Regina, SK jobs
          Tech Roundup - 6th November 2018      Cache   Translate Page      
Stuff collated since Tech Roundup - 23rd September 2018. With headings:
Cisco (FlexPod), CompTIA (and Cybersecurity), Flackbox, Industry Commentary, Microsoft, NetApp, Veeam, VMware

Cisco (FlexPod)

FlexPod Datacenter with Cisco ACI Multi-Pod, NetApp MetroCluster IP, and VMware vSphere 6.7 Design Guide

FlexPod Datacenter with Cisco ACI Multi-Pod, NetApp MetroCluster IP, and VMware vSphere 6.7 Deployment Guide

CompTIA (and Cybersecurity)

Cool Jobs: Using Cybersecurity to Protect Nuclear Power Plants

Cybersecurity Careers: Learn More About Penetration Testing

Cybersecurity Certificates, Certifications and Degrees: How to Choose

CASP vs. CISSP: 4 Advantages of CompTIA’s Advanced Cybersecurity Certification

Flackbox

List of VSA Virtual Storage Appliances and SAN Storage Simulators

Industry Commentary

Six Reasons for Multi-Cloud Computing

IBM, Red Hat and Multi-Cloud Management: What It Means for IT Pros

Microsoft

Azure File Sync is now available to the public
*Posted on Tuesday, September 26, 2017

NetApp

General

NetApp Cloud API Documentation


Cloud Volumes Services

NetApp Kubernetes Service Demo

Azure NetApp Files Demo

NetApp Cloud Volumes Service for AWS Demo

File Storage for AWS is Now Simpler and Faster

Discover How Data Creates Medical Breakthroughs

Transforming Medical Care With Data in the Cloud (Changing the World with Data)

DreamWorks Animation: Creating at the Speed of Imagination

Building Big Data Analytics Application on AWS with NetApp Cloud Volumes

Scaling Oracle Databases in the Cloud with NetApp Cloud Volumes

NetApp Cloud Volumes as a Persistent Storage Solution for Containers

New TRs


New NVAs (NetApp Verified Architectures)

VMware Private Cloud on NetApp HCI: NVA Design

Red Hat OpenShift Container Platform with NetApp HCI: NVA Design
Red Hat OpenShift Container Platform
The Easy Button for Delivering Better Experiences. Faster. With NetApp and Red Hat.

New Posts by Justin Parisi


New on Tech ONTAP Podcast (hosted by Justin Parisi)


New on ThePub

October 2: My Name is Rocky

New on wfaguy.com


Veeam

Veeam Backup & Replication: Quick Migration

VMware

Introducing Project Dimension

VMworld 2018: We’re Rethinking the Limits of Innovation

Taking Innovation to New and Unexpected Levels at VMworld 2018

What’s New in vSAN6.7 Update 1

What’s New in vRealize Operations 7.0

Building on the Success of Workspace ONE

Solution Brief: SD-WAN Simplified


          QA Analyst II - 262611 - Procom - Regina, SK      Cache   Translate Page      
3-5 years of experience developing and running test scripts using tool sets like Microsoft Test, Visual Studio, or Azure DevOps Test Hub (i.e....
From Procom - Mon, 22 Oct 2018 15:13:32 GMT - View all Regina, SK jobs
          Warzecha and UPARTY, or what do the Germans want?      Cache   Translate Page      
Four days ago [2.11.2018] Łukasz Warzecha published on the Wp.pl portal the text "A cold war with Germany is stupidity" {HERE}, in which he stated, among other things: "The level of anti-German emotions seething in the ruling party's electorate has long since escaped rational assessment. Germany has grown into our greatest adversary, an enemy that wants our enslavement and humiliation. And it is not just a matter of typical internet emotions, because very prominent PiS politicians play their part too, to mention only the tweets, recurring every so often, about 'Polish-language German media'. (...) The gap between the narrative served to the electorate and actual practice is staggering. The electorate gets the tale about the terrible Germans, summarized at the beginning. The practice is that Germany is our most important trading partner, and we are one of the most important for them. In 2017 our exports to Germany exceeded 242 billion zlotys, and imports exceeded 183 billion. In Germany's export ranking Poland is (on 2016 data) eighth, and in the import ranking sixth. (...) Polish debate at times hits rock bottom, and this is very clearly visible precisely in the case of Germany. Analysis and diagnosis of problems are replaced by vulgar shouting, in which the government's skirmishers specialize, such as MP Tarczyński, but also, increasingly often, more senior officials, for instance Beata Mazurek, constantly alluding to the German ownership of some media. (...) The Germans do not understand our sensitivity and way of thinking. For years they grew accustomed to contacts with one side of the political scene, with which they did not have to argue about anything, which simply listened to what they had to say and nodded. This one-sidedness of contacts is visible in many texts about Poland published in papers such as Berlin's 'Der Tagesspiegel'. (...) Our partners do not get a clear signal that this is that kind of combination, and they do not know how to interpret the successive anti-German charges of PiS politicians. Worse still, the obsessively anti-German rhetoric has already created such powerful voter expectations that a feedback loop is beginning to operate: politicians who perhaps treated their shouting only as a rhetorical device to fire up the electorate will have to start fulfilling the expectations they themselves created. And that may really cost us a lot." The article caused a veritable storm on the Internet {HERE}. Because shortly before writing it Warzecha had taken a tour of Germany at the German side's expense, internet users concluded that he had simply been bought by the Germans. The well-known blogger Matka Kurka [Piotr Wielgucki] put it this way: "The Germans cheaply bought themselves a Polish journalist to handle German interests at the expense of Polish ones, and that journalist additionally threw dirt at Poles, whom he accused of lacking sense and decency. That is what intellectual prostitution looks like, but all the same I will once more recall the title sentence. The case of Warzecha is not very harmful and is very useful. This time I feel obliged to present arguments for the thesis. Warzecha and his German adventure, with payment at the end, are fully public. Anyone can acquaint themselves with this pathetic production, and everyone knows that Warzecha's article, in which he included praise for Germany and warnings for Poland, is the result of a specific agreement. Thanks to that we could see in black and white that the Germans commission such products and Polish journalists deliver them. This is very useful information, and on the whole Warzecha's scribbling does not cause the harm it could cause in other circumstances. What circumstances?
The most dangerous for Poland are those German agreements concluded with Polish editors about which we know nothing, and there are very many such agreements." {HERE}. Matka Kurka's words were confirmed on Twitter by Cezary Gmyz, who wrote {HERE}: "To be clear - I am not criticizing @lkwarzecha for taking a study trip to Germany paid for by @AuswaertigesAmt. I have myself taken part in similar programs more than once. And I have myself organized similar ones in Poland for German journalists, also for German money." Over two weeks ago UPARTY posted on Blogmedia24.pl the text "It's probably a bit safer" {HERE}. His suppositions about the Germans' true intentions are as follows: "While I understood the witch-hunt against Poland and knew what it was about, I have finally understood the details of the plan. It seems to me that the main goal was Germany's desire to regain, probably starting with Lower Silesia, territory, in such a way as to force our country to sell it. Perhaps not in its entirety and not in all administrative functions at once; perhaps it would be an operation spread over stages, but the goal seems clear. Now, when it is coming to light that the PO government knew perfectly well about the causes and scale of the leakiness of the fiscal system, one can venture the claim that it wanted to bring the country to bankruptcy, especially as its policies also consciously deepened social inequalities, which in time would have had to lead to an escalation of social spending. Hence the fury at the 500+ program. It eliminates extreme poverty, and thus also the group of people who out of desperation could start social unrest and force the authorities to further burden the budget at a moment chosen by the provocateurs. It now appears that the deciding factor in the battle for Poland was probably meant to be local-government debt. As it now turns out, the official debt of local governments is barely about one third of the total; perhaps two thirds of the debt is hidden in the commercial debts of municipal companies. This is a statistical maneuver identical to the one that led to the bankruptcy of the Greeks. (...) At a certain point, however, when the possibilities of servicing such debt run out, it increases the budget deficit practically overnight. (...) If in such a situation there were an escalation of social demands, justified ones by the way, the government would have to fall, with no practical possibility of forming a new one, because during social unrest organizing effective elections and debating a political program is impossible. (...) Now it seems the plan has probably collapsed. First, the government eliminated glaring disproportions in living standards; second, it improved the state budget's situation; third, it increased the profitability of state-owned companies; fourth, it changed the composition of the Supreme Court. We must remember, however, that local-government debts still exist and still threaten us. We must also remember our President. The first bill he vetoed was one allowing the voivode to put local-government obligations in order; the second was the delaying of the judicial reform; I wonder what else he will come up with." These are by no means the ravings of a paranoiac. The Germans have used such methods many times in history. The first time was during the partitions of Poland, when the Prussians encouraged Poles to run up debts. That is the origin of the famous "Bayonne sums", the debts of the Polish nobility, repaid only by Napoleon. It is also worth recalling that German newspapers quite seriously proposed that the Greeks sell Greek islands to cover their debts.
In his post from yesterday {HERE}, UPARTY also mentions the Germans: "We do not know the course of the talks between Kaczyński and Merkel after the 'ciamajdan', but one of her statements became common knowledge. Namely, she reportedly said that although Kaczyński had kept power, she knows what he wants to do. She knows he wants to rebuild the state from scratch, and that is an entirely different matter, much harder than staying in power. In short, she said that while she can reconcile herself to the failure of the attempt to remove Kaczyński from power, she will oppose the rebuilding of the state understood as interconnected institutions oriented toward the national interest. Of course, Germany's real power is not sufficient to prevent our efforts, but it is sufficient to sabotage PiS's policy by strengthening the resistance of its opponents. Why do the Germans not want an efficient administration in Poland? The answer is, all in all, banally simple. It is about the model of economic development. PO and the rest of their camp argue that the most appropriate model is the polarization-diffusion model. In short, it means that investments in poorer regions are meant to tie them to richer regions, that these investments are really meant to help not the poorer regions but the richer ones, because they are to be made in the latter's interest. Proponents of this model cite the disproportion between Warsaw and the not-so-distant areas of eastern Poland. They also claim that with time 'the wealth will spill over'. Of course this is a concept that could be seriously considered if Warsaw really were the upper terminus, the most important hub of the economic system, but it is not. The economic system is supranational, and its upper terminus in Europe is the Rhine Basin. If we adopt the diffusion-polarization model, it also enables the transfer of capital and, generally speaking, of affluence from Warsaw to the Rhine Basin, just as it earlier flowed to Warsaw from the surrounding regions of poverty. It is precisely because of the application of this model that 80% of EU aid ultimately ends up in the countries the aid comes from, while the costs of absorbing that aid are borne domestically. For according to this model, investments are to integrate us with a center larger than ourselves, that is, in short, with Germany. But so that people in Warsaw and many other cities would live visibly better than in the poorest regions of the country, so that the draining of capital further west would not be visible, two things were permitted: first, cheating the state and living off fraud, and second, milking the weaker, that is, living off the transfer of wealth. When these incomes were curtailed, the ranks of the anti-PiS camp were reinforced by people harmed not only by the loss of opportunities to game the tax system but also by those who had lived in symbiosis with the economic development model." As can be seen, UPARTY, unlike Warzecha, has no illusions about Germany's far-reaching plans.
          Cert Prep Microsoft Azure Administrator Certification Transition Exam (AZ-102)      Cache   Translate Page      

Cert Prep: Microsoft Azure Administrator Certification Transition Exam (AZ-102)
MP4 | Video: h264, 1280x720 | Audio: AAC, 48 KHz, 2 Ch
Genre: eLearning | Language: English + .SRT
Level: Intermediate | Duration: 2h 42m | 327 MB


           Microsoft brings some of its well-known and appreciated Sysinternals utilities to Linux       Cache   Translate Page      
Mark Russinovich, today CTO of Microsoft Azure, is a heavyweight: before joining the Redmond company he founded, together with Bryce Cogswell, a company called Winternals that designed and developed some of the most famous utilities for managing, optimizing, and securing Windows.
Utilities such as Process Monitor and Process Explorer, which we have frequently cited in our articles, come precisely from Russinovich and his company Winternals, which Microsoft acquired in 2006 (the working group was named Sysinternals).


Now Microsoft has decided to bring many of the Sysinternals utilities to Linux: this too is news, confirming the interest and continuous investment that the company, under the leadership of Satya Nadella, is directing toward the "penguin", open source, and free software.
Another very recent example: Microsoft joins the Open Invention Network and grants 60,000 of its patents.

The first Sysinternals utility being brought to Linux is ProcDump: it allows you to create a dump of running processes, or of any system crashes, when particular events occur (i.e.
          A 26-Year Experience with Microsurgical Reconstruction of Hemifacial Atrophy and Linear Scleroderma      Cache   Translate Page      
Background: Parry Romberg disease (hemifacial atrophy) and linear scleroderma (coup de sabre) are progressive, usually unilateral facial atrophies of unknown cause. The gold standard treatment for these patients has been microsurgical reconstruction following the “burning out” of the facial atrophy and stable contour for 2 years. Methods: The authors report their experience treating patients with hemifacial atrophy and linear scleroderma with free tissue transfers between 1989 and 2016. A modified parascapular flap based on the circumflex scapular artery was most commonly used. Results: A total of 177 patients were included. The most common complication was hematoma, occurring in 12 patients (7 percent). Follow-up ranged from 1 to 26 years. All patients subjectively experienced improved facial symmetry and aesthetics. No disease process has recurred to date, even in cases of severe, progressive disease. Conclusions: In the authors’ experience, patients treated early in their disease course have immediate and sustained correction of their deformity, with slowing or in most cases cessation of the disease process following free tissue transfer. The authors now advocate for immediate reconstruction for active disease, especially in young children. CLINICAL QUESTIONS/LEVEL OF EVIDENCE: Therapeutic, IV.
          Updates in Head and Neck Reconstruction      Cache   Translate Page      
Learning Objectives: After reading this article, the participant should be able to: 1. Have a basic understanding of virtual planning, rapid prototype modeling, three-dimensional printing, and computer-assisted design and manufacture. 2. Understand the principles of combining virtual planning and vascular mapping. 3. Understand principles of flap choice and design in preoperative planning of free osteocutaneous flaps in mandible and midface reconstruction. 4. Discuss advantages and disadvantages of computer-assisted design and manufacture in reconstruction of advanced oncologic mandible and midface defects. Summary: Virtual planning and rapid prototype modeling are increasingly used in head and neck reconstruction with the aim of achieving superior surgical outcomes in functionally and aesthetically critical areas of the head and neck compared with conventional reconstruction. The reconstructive surgeon must be able to understand this rapidly-advancing technology, along with its advantages and disadvantages. There is no limit to the degree to which patient-specific data may be integrated into the virtual planning process. For example, vascular mapping can be incorporated into virtual planning of mandible or midface reconstruction. Representative mandible and midface cases are presented to illustrate the process of virtual planning. Although virtual planning has become helpful in head and neck reconstruction, its routine use may be limited by logistic challenges, increased acquisition costs, and limited flexibility for intraoperative modifications. Nevertheless, the authors believe that the superior functional and aesthetic results realized with virtual planning outweigh the limitations.
          Going async with Azure and the PHP SDK for a massive performance boost      Cache   Translate Page      

Regular readers will know that I make extensive use of Azure Table Storage in both Report URI and Security Headers. As Report URI has grown and we're now processing billions of reports per month for our users we're always interested in performance or efficiency savings wherever possible. We recently launched a whole new bunch of features and behind the scenes there was a pretty sweet performance tweak.


A quick intro to Table Storage

I have loads of blogs on our use of table storage so you can always dig into those if you want the technical details but at a really high level Table Storage is a key:value store. You can put something in with a key and then query it back out with the key, and, well, that's pretty much it! The key is actually a combination of two parts, called the Partition Key and Row Key, which allow for some flexibility when querying, and entities (rows for relational database people) can have properties (columns, except no schema). We put reports into Table Storage using a variety of PK and RK combinations that allow us to query them out really efficiently and we also store a running counter that we increment ourselves as that's faster and easier than trying to count all of the rows when you're looking for something like the total number of reports for a given day for example.

PK: csp26092018
RK: total
count: 12489

The problem

If we want to draw a graph of CSP reports for the last week then we need to pull back 7 entities, the total count for today and the 6 previous days.

for ($i = 0; $i < $unitsToGoBack; $i++) {
    $partitionKey = $reportType . date('dmY', strtotime("-{$i} days"));

    try {
        $entity = $this->tableRestProxy->getEntity($this->table_name, $partitionKey, 'total')->getEntity();

These 7 getEntity() calls are made against Table Storage sequentially and they get us the data we need to plot onto the graph; we can then return the page and draw the graph for the user. This all works really well, and we can go up to the last month in this fashion without any real performance worry because Table Storage is seriously fast. When we switch to looking at reports for a whole month, and say the last 6 months, we don't pull out 180 counts, one for each day; we also maintain a count for the month, so we can just retrieve that, requiring only 6 getEntity() calls. What happens when we can't work around many calls like that though? In the last update I wanted to add an overview graph of the last 7 days for all report types. That meant I needed to get the day count for each of the 7 days for each of the 5 report types we support: CSP, HPKP, CT, XSS and Staple. That means 35 sequential calls into Table Storage, and while Table Storage is really fast, we're getting into territory where that many sequential calls is starting to be too many. To add to that problem, we also went and added 4 new report types, didn't we... That now means we need 63 sequential getEntity() calls into Azure to get all of the entities needed to render the graph. Not good. There's really no way for us to change or get around requiring that many calls; we just needed to do them faster, and that meant we needed to async them.

Async and PHP

People familiar with other languages may have long used the concept of async requests with promises and other constructs, but in PHP, and especially for me, this was a more recent development. I started to dig around in the Azure Storage PHP SDK and noticed that it's using Guzzle and has support for making async requests to Table Storage, but doesn't do so by default. At least this means it shouldn't be too hard for us to add support!

/**
 * Gets table entity.
 *
 * @param string $table The name of the table.
 * @param string $partitionKey The entity partition key.
 * @param string $rowKey The entity row key.
 * @param GetEntityOptions|null $options The optional parameters.
 *
 * @return GetEntityResult
 *
 * @see http://msdn.microsoft.com/en-us/library/windowsazure/dd179421.aspx
 */
public function getEntity(
    $table,
    $partitionKey,
    $rowKey,
    GetEntityOptions $options = null
) {
    return $this->getEntityAsync(
        $table,
        $partitionKey,
        $rowKey,
        $options
    )->wait();
}

Given the really simple nature of our getEntity() loop that we looked at earlier, all we need to do is batch up these calls and dispatch all of them together.

$promises = [];

foreach ($this->config->item('valid_report_types') as $reportType) {
    for ($i = 0; $i < 7; $i++) {
        $promises[$reportType . $i] = $this->tableRestProxy->getEntityAsync(
            'reports' . $tableId,
            $reportType . date('dmY', strtotime("-{$i} days")),
            'total'
        );
    }
}

Once we have our batch of requests they simply need to be executed and have the results returned. There are a couple of different ways to do this and it depends on what your tolerance to a request failing is.

// Throws an exception if any request fails
$results = Promise\unwrap($promises);

// Waits for all requests to complete, even if some fail
$results = Promise\settle($promises)->wait();

Because we're doing getEntity() calls we may get a 404 back from Table Storage if the entity we ask for doesn't exist, which would be classed as a failure and would stop the rest of the requests if we were using unwrap, so settle is what we need. Because we can expect to see a 404, which simply means we have no reports for that particular day, we need to be tolerant of it and count the failure as a 0. Now that we have the result of all of our queries back we can process the entities we got and populate the array for the graph as we would have done before, as the sketch below shows.
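Here's a minimal sketch of that processing step, assuming the $reportType . $i keys used when building the batch above and the count property from the counter entities shown earlier; a rejected promise (our expected 404) simply becomes a zero.

use GuzzleHttp\Promise;

$results = Promise\settle($promises)->wait();

$counts = [];
foreach ($results as $key => $result) {
    if ($result['state'] === Promise\PromiseInterface::FULFILLED) {
        // A fulfilled getEntityAsync() promise resolves to a GetEntityResult.
        $entity = $result['value']->getEntity();
        $counts[$key] = (int)$entity->getPropertyValue('count');
    } else {
        // Rejected here almost always means a 404: no reports for that day.
        $counts[$key] = 0;
    }
}

The big question is, how much of a performance gain did we see?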

The results

Now, when I’m developing and testing features I’m doing that locally on my dev server at home so we need to bear in mind the impact that has on the numbers. When my server at home calls back to Azure it has a considerably higher latency because I’m in the UK (basically in a field) and our Azure instance is on the West Coast of the USA. Our normal production servers are also located on the West Coast and are roughly 4ms away. That said, the relative difference between async and non-async will be the same, it’s just that my numbers look a lot bigger here because of my location.

float(6.2748548984528)
[["19-09-2018",0,0,0,0,0],["20-09-2018",0,0,0,0,0],["21-09-2018",0,0,0,0,0],["22-09-2018",0,0,0,0,0],["23-09-2018",0,0,0,0,0],["24-09-2018",0,0,0,0,0],["25-09-2018",0,0,0,0,0]]

float(1.2707870006561)
[["19-09-2018",0,0,0,0,0],["20-09-2018",0,0,0,0,0],["21-09-2018",0,0,0,0,0],["22-09-2018",0,0,0,0,0],["23-09-2018",0,0,0,0,0],["24-09-2018",0,0,0,0,0],["25-09-2018",0,0,0,0,0]]

That's a pretty significant improvement, a 79.75% reduction, and you can probably guess which timing came from the synchronous calls and which from the asynchronous calls! It's also worth noting that only 5 counts are returned per day here because I tested and deployed this before support for the 4 new report types was fully implemented; with those in place this will save us even more.

Limitations on where we can use this

Doing async calls into Table Storage like this is no doubt a huge performance boost, and it doesn't cost us any more or less to do it, the pricing is exactly the same. Of course this means we want to do it wherever possible, but it's not as simple as executing all of our queries async in this fashion. The getEntity() calls above are a great example of where this works perfectly: we provide the table, partition key and row key of the entity, which will either be returned or won't exist and give us a 404, nice and easy. A similar thing can be said for insertEntity() too: we generate an entity locally and call insertEntity(), which will either succeed or fail, so we can dispatch multiple inserts together. You'd normally use an Entity Group Transaction (EGT) for that, but if the entities are in different partitions then you can't, whereas this method still works (a rough sketch follows below). What about less specific queries though, like queryEntities()?
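As a rough sketch of that insert pattern, assuming a hypothetical $entities array of Entity objects spread across different partitions:

$promises = [];

foreach ($entities as $i => $entity) {
    // insertEntityAsync() mirrors insertEntity() but returns a promise,
    // so inserts into different partitions can be dispatched together.
    $promises[$i] = $this->tableRestProxy->insertEntityAsync($table, $entity);
}

// Tolerate an individual insert failing (e.g. a conflict) rather than
// aborting the whole batch.
$results = Promise\settle($promises)->wait();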

queryEntities()

We use queryEntities() a lot in Report URI; it's how we populate the tables on the Reports page with all of the reports that match your chosen search criteria.



A page like that results in a really simple query that looks like this, where we simply fetch all CSP reports for the current hour.

PartitionKey eq 'csp2809201815'

One of the limitations of queryEntities() is that it can only return a maximum of 1,000 entities at a time. If there are more than 1,000 entities then the result of the query will contain a continuation token that allows you to call back into Table Storage to fetch the rest of the entities.

$result = $this->tableRestProxy->queryEntities($table, "PartitionKey eq 'csp2809201815'");
$reports = $result->getEntities();

$nextPartitionKey = $result->getNextPartitionKey();
$nextRowKey = $result->getNextRowKey();

while (!is_null($nextRowKey) && !is_null($nextPartitionKey)) {
    $options = new QueryEntitiesOptions();
    $options->setNextPartitionKey($nextPartitionKey);
    $options->setNextRowKey($nextRowKey);
    $options->setFilter(Filter::applyQueryString($filter));

    $newResult = $this->tableRestProxy->queryEntities($table, $options);
    $reports = array_merge($newResult->getEntities(), $reports);

    $nextPartitionKey = $newResult->getNextPartitionKey();
    $nextRowKey = $newResult->getNextRowKey();
}

This means that if we want to pull back 5,000 entities as a result of our query then we have to do those 5 calls sequentially, because there's no way to know the continuation token until we've pulled back the previous result. That puts us in a bit of a pickle, but whilst I was looking at this it did give me an idea of how we might be able to work around that limitation... If you're familiar with Table Storage and have an idea, let me know in the comments below, but for now I will probably save that for another blog as it's a little too long to go into here!

Future ideas

As I said, there's an idea burning away in my mind on how we can get around the 1,000-entity limit, and also on where else in our code we can do some trickery to squeeze more performance out of things by using async calls to Table Storage where it might not be immediately obvious; one of those would be another pretty significant performance improvement too. For now, I hope the code and examples above help others using Azure and the Storage PHP SDK and give you an insight into the inner workings and development of Report URI!


          The world belongs to containers, and to microservices, but in the end it belongs to serverless.

The subtitle goes like this: “Hyper, Fargate, and Serverless infrastructure”.

There are two kinds of infrastructure in the world: the kind you simply take and use, and the kind you build and control yourself.
Forgive me for riding a hype wave that has already been doused and never really caught fire. What we like, though, is the take-and-use kind: good enough is good enough, we neither want nor need much control, and we don't want much hassle either. In other words, fully managed.
What prompted this article was a blog post from Microsoft Azure that I read a few days ago, The Future of Kubernetes Is Serverless, after which I also revisited Changing the calculus of containers in the cloud, written by the AWS CTO. You might suspect both articles of being advertisements, since each promotes its own public cloud services, but reading every sentence carefully, I found hardly a wasted word: they are right on point, and the originals are worth a click.
One premise needs stating up front: Serverless here means serverless infrastructure, not the Functions-as-a-Service (FaaS) technologies we often hear about, such as AWS Lambda, Microsoft Azure Functions, or Google Cloud Functions. To keep them apart, we will call those FaaS offerings serverless compute, which is not the same thing as the serverless infrastructure this article is about.

IaaS: where the change began

Speaking of infrastructure, let's start with the first to appear: IaaS, Infrastructure as a Service. IaaS did away with most hardware provisioning work. Nobody worries about racks, power, and servers anymore, which makes operations faster and easier; it freed a lot of people and set everyone on the road to riches.
Of course, this generation of cloud services is about far more than being able to start a virtual machine in a few minutes.
Beyond VMs, IaaS vendors provide many other infrastructure and middleware services. These components are called building blocks: the old trio of networking and firewalls, databases, and caches, and lately a great many scenario-specific services, such as big data, machine learning and algorithms, and IoT. It looks like a department store: using the cloud is like shopping, an architecture design is a shopping list, and every component in the architecture can be bought in the store.
With infrastructure coming from the various services an IaaS provider offers, application development can focus more on the business. This brings many benefits:
  • Concentrate effort on the core business
  • Get to launch faster
  • Improve availability
  • Scale out and in quickly
  • No need to care about the infrastructure underneath the middleware
  • Skip tedious operations work such as installation, configuration, backup, and security management
After AWS became the industry standard, the big software companies, newcomers and veterans alike, set out to build their own clouds: Microsoft, Google, IBM, and others abroad; at home, each of the BAT companies has its own cloud, e-commerce players like JD and Meituan have their own cloud products, and independent vendors such as UCloud and QingCloud are doing well too; even a restaurant operator wanted in on the action. Meanwhile, the open-source OpenStack, and the startups and products built on it, keep emerging one after another.
Everyone is doing cloud.

Containers: cloud computing wins hearts and minds

Then, in 2013, container technology began to spread to the masses. Before LXC, containers were barely on the same plane for ordinary developers, or even IT practitioners: obscure terminology and sprawling command sets that only specialists could master, and most people had never used them. With the appearance of Docker, the barrier to entry for containers dropped and they became commonplace across the software industry. After a few years of development, it is fair to say container technology is very mature and has become standard equipment for development.
As containers matured and spread, application architecture changed too; you could say software and infrastructure evolve hand in hand. People increasingly recognize how important it is to layer and decouple the technology stack, with clear boundaries of technology, responsibility, and ownership between layers, much like the loose-coupling principle for modules in software design.
With a layered structure and clear responsibilities, everyone can more easily concentrate on what they care about. Developers pay more attention to the application itself; as Docker caught fire, the app-centric idea appeared alongside it. CoreOS even called its rival standard to OCI/runc appc. Of course, today's Docker is not the original Docker either: it too has been componentized, and many components, the runtime especially, can be replaced with other runtimes.
The opposite of app-centric is the traditional infrastructure-centric approach: the infrastructure comes first, and architecture design, development, and deployment revolve around it, heavily constrained by it. With the rise of IaaS and similar services, infrastructure has become ever simpler and easier to pick up, and it exposes programmatic interfaces, so developers can manage infrastructure very conveniently. You could say the arrival of the cloud also let developers grab part of the ops folks' rice bowl (sadly, the trend is too far gone to stop...).
By now, of course, the app-centric idea has taken deep root. Especially with FaaS, its ultimate form: you write a few lines of code yourself and the platform takes care of everything else.

Orchestration: the battleground

Containers solved the portability of code, and they made new patterns of infrastructure use in the cloud possible. A consistent, immutable deployment artifact, such as an image, frees us from complex server deployments and can be deployed to different runtime environments very conveniently (portability).
But containers also added new work. To run our code in containers, we need a container management system: after we finish writing code and package it into a container, we need to choose a suitable runtime environment, set up the right scaling configuration, the relevant network connectivity, security policies and access control, plus monitoring, logging, and distributed tracing.
Orchestration systems appeared because a single machine is no longer enough: we have to line up many machines and run containers on them, and I don't care which machine a container runs on; that can be left to the scheduler. In a sense, orchestration systems gradually fade out the concept of a host: what we face is a resource pool, a group of machines, with so many CPUs and so much memory available as compute resources.
The ending of the rkt vs. Docker war could be foreseen from the start, but in orchestration and cluster management this “war” held far more uncertainty.
Mesos (DC/OS) arrived earliest, with Twitter and others as case studies, and was the staple of early container scheduling; Swarm, with its pedigree, its simplicity, and its affinity with Docker, also fought for a share; but by now the winner looks to be K8s. Kubernetes has Google as its backer, Google's many years of scheduling experience, the alignment of anti-Docker companies like Red Hat and CoreOS, and a thriving community. In short, it won.
This year's KubeCon in Copenhagen reportedly drew 4,300 attendees. Then again, DockerCon once commanded the same fanfare, and its influence is no longer what it was; there is a sense of yesterday's blossom fading. Who knows how Kubernetes will fare a few years from now, and whether new products or services will emerge to shake its current position? Not necessarily, but we can hope.

Serverless infrastructure: the outcome of evolution

And yet fading out the host is only that, fading: the concept of the host was never fully eliminated. We simply face hosts directly less often: we no longer deploy straight onto hosts, nor assign dedicated hosts to particular departments. A host that breaks still has to be restarted, and when resources run out new hosts still have to be added; the management work never completely disappeared.
Managing a cluster, meanwhile, brings enormous complexity, which runs counter to the original intent of using the cloud and hardly deserves to be called cloud native.
Examine it again from the user's point of view and you find a question we ignored for a long time: if all we want is to run containers, why must we first buy a VM and install Docker on it, or stand up a Kubernetes cluster ourselves, or use a service like EKS, tirelessly configuring and debugging all the while? It not only costs fixed asset fees but also adds a pile of operations and management work that has no value whatsoever.
If we found manually deploying containers across many hosts too much trouble and handed it to cluster management and scheduling systems, then can't the equally tedious work of maintaining the scheduling system also be handed to someone else, outsourced?
By lean thinking, any process that is unrelated to the core business goal and brings no user value is waste and should be cut.
At this point, serverless infrastructure services appeared: the earliest, such as hyper.sh from China (GA in August 2016), and those released last year, AWS Fargate (December 2017) and Microsoft's ACI (Azure Container Instances, July 2017).
Take hyper.sh as an example. It works very much like Docker and moves the local Docker experience to the cloud intact:
$ brew install hyper 
$ hyper pull mysql
$ hyper run mysql
MySQL is running...
$ hyper run --link mysql wordpress
WordPress is running...
$ hyper fip attach 22.33.44.55 wordpress
22.33.44.55
$ open 22.33.44.55
For most commands you simply swap docker for hyper, and the experience is exactly like using Docker. The novelty of seeing such an application for the first time is no less than what Docker gave us back in the day.
With serverless infrastructure, we no longer have to fret about any of the following:
  • No more agonizing over VM instance types, or how many CPUs and how much memory we need
  • No more worrying about which versions of Docker and cluster management software to use
  • No more worrying about security holes in the middleware inside the VM
  • No more worrying about low cluster resource utilization
  • Paying for running containers instead of paying for a resource pool
  • Completely immutable infrastructure
  • No more quiet irritation at seeing all those dull agents when running ps
All we need to do is write our business applications in peace, build our images, choose a suitable container size, and pay the cloud vendors to build the system well and watch their stock soar.

Fargate (ACI would do here as well): the big players take a stand

Although AWS is not as “keen” on containers as GCP, it has long offered the ECS (Elastic Container Service) service.
AWS Fargate, released last year, is a serverless container service: a technology that abstracts away the infrastructure needed to run containers, in service of AWS's container offerings such as ECS (Elastic Container Service) and EKS (Elastic Kubernetes Service); today ECS can already use Fargate directly.
Unlike EC2, which provides virtual machines, Fargate provides container runtime instances. The user treats the container as the basic unit of compute and need not worry about managing the underlying host instances; you just build a container image, specify the CPU and memory you need, and set the appropriate networking and IAM (identity management) policies.
To our earlier question, AWS's answer is: we will fill in the infrastructure potholes, and you only need to concentrate on writing your applications well. You don't have to worry about how many resources to launch; we will do capacity management for you, and you pay only for what you use.
You could say Fargate, Lambda, and products like them were all born under this philosophy.
At last we can concentrate on the CRUD we are best at. Happy, happy.

Serverless infrastructure vs Serverless compute

A few more words, mainly to help everyone tell apart two different serverless architectures: serverless compute and serverless infrastructure.
Frankly, migrating from EC2 straight to Lambda is a rather big step.
FaaS products such as Lambda may be simpler, but they have many drawbacks, including:
  • Use cases: Lambda is better suited to user-triggered or event-driven work, not daemon services, batch processing, and the like
  • Flexibility: a fixed kernel, AMI, and so on, with no customization
  • Resource limits: many caps on the filesystem, memory, process and thread counts, request body size, and execution time
  • Programming language restrictions
  • Service governance is hard
  • Debugging and testing are inconvenient
Lambda's biggest advantages over containers are that operations work is lighter, basically nil, and billing is more precise: you don't pay for wasted compute. Lambda also responds faster and scales out somewhat more efficiently.
Container instances like Fargate can be seen as products that combine the strengths of EC2 instances and Lambda: as lightweight as Lambda, keeping the focus on the core application, while offering much of EC2's flexibility and control.
Cloud native gives users more control, while asking less investment and burden of them.
Serverless infrastructure can make containers more cloud native.

Fully managed: the way things are going

“Fully managed” can be understood as: for very little cost, users get the product or service they want, with the corresponding degree of control.
In the past couple of days, Alibaba Cloud released Serverless Kubernetes. It is fully compatible with native Kubernetes: you can deploy and manage applications with the standard API and CLI, keep using all kinds of existing assets, and still get enterprise-grade high availability and security guarantees. Could it be that soon we won't even install Kubernetes ourselves, and most people will only need to master the kubectl commands?
The arrival of IaaS let us discard our provisioning tools while configuration management tools sprouted up like mushrooms after rain; the arrival of containers made us toss the freshly bought Chef/Puppet primers and bibles we had barely opened and scramble to learn Kubernetes; with serverless infrastructure, we can just about say goodbye to the orchestration tools as well.
Whether you are moving from a monolith to microservices, or taking legacy systems to the cloud and into containers, odds are everyone will come to love fully managed services: everybody does Ops, and much of the operations work is shared. Of course, some ops engineers will cover their faces and flee.

          Get started with delivering content on demand using REST
This tutorial walks you through the steps of implementing an on demand content delivery application with Azure Media Services using REST API.
          Troubleshoot a web app in Azure App Service using Visual Studio
Learn how to troubleshoot an Azure web app by using remote debugging, tracing, and logging tools that are built into Visual Studio 2013.
          Develop and deploy WebJobs using Visual Studio - Azure
Learn how to develop and deploy Azure WebJobs to Azure App Service using Visual Studio.
          Workplace Violence
How it affects health care, which providers are most affected, and what management and staff can do about it.
          Azure Sales and Marketing - Ingram Micro Cloud - Bellevue, WA
If so, join the Ingram Micro Cloud team - where rainmakers thrive. Are you an innovative, self-starting Marketing guru who loves helping IT providers design and...
From Ingram Micro Cloud - Fri, 28 Sep 2018 07:14:09 GMT - View all Bellevue, WA jobs
          VS 2013 and PCL issues

The other day I installed Windows 8.1 RTM and VS 2013 RC to try it out and use it as my main development platform for my Windows 8 apps. I then booted up VS 2013 and was going to continue development on podstaX to make it use a Portable Class Library for the shared code and separate projects for the Windows 8 and Windows Phone 8 UI. But when I tried to compile I got the following errors (at this point I hadn't yet re-targeted the Windows 8 project to a Windows 8.1 project).

The type or namespace name 'WindowsAzure' does not exist in the namespace 'Microsoft' (are you missing an assembly reference?)
The type or namespace name 'MobileServiceClient' could not be found (are you missing a using directive or an assembly reference?)
The primary reference "Microsoft.WindowsAzure.Mobile" could not be resolved because it has an indirect dependency on the framework assembly "System.Runtime, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" which could not be resolved in the currently targeted framework. ".NETPortable,Version=v4.0,Profile=Profile158". To resolve this problem, either remove the reference "Microsoft.WindowsAzure.Mobile" or retarget your application to a framework version which contains "System.Runtime, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a".
The primary reference "System.Net.Http.Extensions" could not be resolved because it has an indirect dependency on the framework assembly "System.Runtime, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" which could not be resolved in the currently targeted framework. ".NETPortable,Version=v4.0,Profile=Profile158". To resolve this problem, either remove the reference "System.Net.Http.Extensions" or retarget your application to a framework version which contains "System.Runtime, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a".
The primary reference "Microsoft.Threading.Tasks" could not be resolved because it has an indirect dependency on the framework assembly "System.Runtime, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" which could not be resolved in the currently targeted framework. ".NETPortable,Version=v4.0,Profile=Profile158". To resolve this problem, either remove the reference "Microsoft.Threading.Tasks" or retarget your application to a framework version which contains "System.Runtime, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a".
The primary reference "Microsoft.Threading.Tasks.Extensions" could not be resolved because it has an indirect dependency on the framework assembly "System.Runtime, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" which could not be resolved in the currently targeted framework. ".NETPortable,Version=v4.0,Profile=Profile158". To resolve this problem, either remove the reference "Microsoft.Threading.Tasks.Extensions" or retarget your application to a framework version which contains "System.Runtime, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a".
The primary reference "System.Net.Http" could not be resolved because it has an indirect dependency on the framework assembly "System.Runtime, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" which could not be resolved in the currently targeted framework. ".NETPortable,Version=v4.0,Profile=Profile158". To resolve this problem, either remove the reference "System.Net.Http" or retarget your application to a framework version which contains "System.Runtime, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a".
The primary reference "Microsoft.WindowsAzure.Mobile" could not be resolved because it has an indirect dependency on the framework assembly "System.Threading.Tasks, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" which could not be resolved in the currently targeted framework. ".NETPortable,Version=v4.0,Profile=Profile158". To resolve this problem, either remove the reference "Microsoft.WindowsAzure.Mobile" or retarget your application to a framework version which contains "System.Threading.Tasks, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a".
The primary reference "System.Net.Http.Extensions" could not be resolved because it has an indirect dependency on the framework assembly "System.Threading.Tasks, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" which could not be resolved in the currently targeted framework. ".NETPortable,Version=v4.0,Profile=Profile158". To resolve this problem, either remove the reference "System.Net.Http.Extensions" or retarget your application to a framework version which contains "System.Threading.Tasks, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a".
The primary reference "Microsoft.Threading.Tasks" could not be resolved because it has an indirect dependency on the framework assembly "System.Threading.Tasks, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" which could not be resolved in the currently targeted framework. ".NETPortable,Version=v4.0,Profile=Profile158". To resolve this problem, either remove the reference "Microsoft.Threading.Tasks" or retarget your application to a framework version which contains "System.Threading.Tasks, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a".
The primary reference "Microsoft.Threading.Tasks.Extensions" could not be resolved because it has an indirect dependency on the framework assembly "System.Threading.Tasks, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" which could not be resolved in the currently targeted framework. ".NETPortable,Version=v4.0,Profile=Profile158". To resolve this problem, either remove the reference "Microsoft.Threading.Tasks.Extensions" or retarget your application to a framework version which contains "System.Threading.Tasks, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a".
The primary reference "System.Net.Http" could not be resolved because it has an indirect dependency on the framework assembly "System.Threading.Tasks, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" which could not be resolved in the currently targeted framework. ".NETPortable,Version=v4.0,Profile=Profile158". To resolve this problem, either remove the reference "System.Net.Http" or retarget your application to a framework version which contains "System.Threading.Tasks, Version=1.5.11.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a".

It took a while until I realized how to resolve the issue. The main problem was the targeting of the PCL project! I use the Azure Mobile Services SDK, loaded via NuGet, in that project, and for some reason VS 2013 cannot handle a PCL with the AMS SDK and Silverlight 5 (Windows Phone 7.8) targets.

So to resolve this I did the following:

  1. Uninstalled the Azure Mobile Services SDK via the NuGet Package Manager.

     (screenshot)
  2. Changed the target framework for the PCL project.

     (screenshot)
  3. Then installed the Azure Mobile Services SDK again.

     (screenshot)
  4. Compile and enjoy!



          How to redirect to another URL with headers using response from HttpWebRequest in asp.net?

I'm using an ASP.NET Web Forms application to call an .aspx page in an IE modal dialog, which on page load calls HttpWebRequest with headers and POSTs JSON data, as below:

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(URL + "?ID=" +ID);

NameValueCollection nameValueCollection = new NameValueCollection
{
    { "firstName", firstName },
    { "lastName", lastName },
    { "userName", userName },
    { "ApiKey", ApiKey },
    { "ApiSecret", ApiSecret },
    { "x-requestKey", Gluid },
};
request.Headers.Add(nameValueCollection);
request.Accept = "application/json";
request.ContentType = "application/json";
request.Method = "POST";
request.AllowAutoRedirect = true;
byte[] bytes = Encoding.ASCII.GetBytes(dataSent);

request.ContentLength = bytes.Length;
Stream os = request.GetRequestStream();
os.Write(bytes, 0, bytes.Length); //Push it out there
os.Close();
HttpWebResponse httpWebResponse = (HttpWebResponse)request.GetResponse();

The call to the URL is an Azure website that should redirect to another page like:

http://URL/Home/index/ID

I receive this URI in the httpWebResponse.ResponseUri property, but nothing happens with it. Ideally the Azure website would automatically redirect to the URI received in ResponseUri, but that is not working for me: I get a response status of 'OK' but a blank page after that.

The call to the ResponseUri should be a GET request with the same custom headers, plus the Accept and ContentType shown in the POST request above. Response.Redirect doesn't carry the headers, so on the second request I get redirected to an error page on the Azure website because the headers are missing.

Response.Redirect(httpWebResponse.ResponseUri.ToString(), true);
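
What I've been trying, as a rough sketch rather than a confirmed fix (it reuses the nameValueCollection from the POST above), is to skip Response.Redirect entirely and issue the follow-up GET server-side with the same custom headers, then relay the body to the browser:

HttpWebRequest getRequest = (HttpWebRequest)WebRequest.Create(httpWebResponse.ResponseUri);
getRequest.Headers.Add(nameValueCollection); // same custom headers as the POST
getRequest.Accept = "application/json";
getRequest.ContentType = "application/json";
getRequest.Method = "GET";

using (HttpWebResponse getResponse = (HttpWebResponse)getRequest.GetResponse())
using (StreamReader reader = new StreamReader(getResponse.GetResponseStream()))
{
    // Relay the page content instead of redirecting the browser,
    // since a browser redirect cannot carry the custom headers.
    Response.Write(reader.ReadToEnd());
}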


          How a Windows Update Can Spark New IoT Services on Campuses

Microsoft announced a new update to its Windows 10 OS in October that will open doors to a new level of connectedness on college campuses.

Integration of the Internet of Things has become a common goal for higher education institutions as universities strive for a connected campus model. 

While many colleges are focusing heavily on IoT as part of their digital transformation initiatives, administrators and IT leaders are facing certain hurdles, particularly concerning data quality and management, which has slowed down campus innovation.

"Data management is the biggest obstacle we have right now," Gerry Hamilton, director of facilities energy management at Stanford University tells Campus Technology. "It all comes down to scalability and sustainability. We have found there is an exponential growth of effort that happens every time you deploy one more system."

The new update promises to bring “edge intelligence with machine learning, industrial strength security” and “diverse silicon options” to users. 


Machine Learning Improves IoT Management

To help users get a firm grasp on their IoT data management, the new Windows 10 update will employ Microsoft Azure IoT Edge and Windows machine learning.

“Windows 10 IoT enables you to do more at the edge including machine learning, event processing, image recognition and other high-value artificial intelligence without developing it in-house,” according to Microsoft’s announcement. “Seamless integration with Azure IoT Edge brings cloud intelligence and analytics securely to Windows 10 IoT devices at scale.”

Azure’s IoT Edge service will allow campus IT teams to run AI workloads and Azure services on Windows 10 IoT devices locally and remotely, easing the burden on campuses to create a web of connected devices. 

At the University of Nebraska–Lincoln, campus staff members are looking into using Azure IoT Edge to help manage sensors that detect facility issues on campus, EdTech reports

“Cloud-based machine learning applications in Microsoft Azure and similar technologies may help systems learn how to reduce the time needed to identify HVAC faults,” says Lalit Agarwal, the university’s director of utility and energy management. “That is definitely an area for us to investigate for the future.”

In addition, the integration of Azure IoT Device Management and Microsoft Intune simplifies device monitoring, allowing campus IT teams to develop management solutions and consolidate device management across a single interface.

Integrate and Analyze Data from Campus Points of Sale

Digital kiosks have become key additions to campus stores, restaurants and stadiums in support of universities’ digital transformation agendas. 

At Clemson University, kiosks in campus mailrooms have helped cut wait times for packages down from 40 minutes to an average of one minute.

Through the Windows 10 update, IT teams will find it easier to customize and manage kiosks on campus. Through assigned access, managers can “customize the functionality exposed by kiosks and other fixed-function devices, providing a streamlined, intuitive user experience that is focused on specific tasks,” according to Microsoft.

The update will also provide enhanced status reporting, automatically alerting IT teams when a kiosk is experiencing problems, as well as initiating corrective responses like restarting the device.


Eli has been eagerly pursuing a journalistic career since he left the University of Maryland's Philip Merrill School of Journalism. Previously, Eli was a staff reporter for medical trade publication Frontline Medical News, where he experienced the impact of continuous education and evolving teaching methods through the medical lens. When not in the office, Eli is busy scanning the web for the latest podcasts or stepping into the boxing ring for a few rounds.


          DevOps: Deploying a Tabular cube with PowerShell

Introduction

One step in a SQL data warehouse DevOps project is deploying an SSAS Tabular project to an instance. In this blog post I'm going to show you the script I use for deploying SSAS Tabular cubes. As inspiration for the deployment script I used information from blogger Harbinger Singh, though I had to make some changes to make it work in my situation.

Steps

In the script, I've created a couple of blocks of code:
  1. An array of cubes I want to deploy to the server. This helps me control which cubes to deploy; another option would be to loop over the contents of a folder and deploy every cube found there.
  2. Loop through the array.
  3. Check if the cube is present and print a warning if it can't be found.
  4. Adjust the database connection strings in the .asdatabase file. There are connections to multiple databases and they all must be changed.
  5. Adjust the database connection string in the .deploymenttargets file.
  6. Generate a .configsettings file. This file is not generated by the build of an SSAS Tabular model.
  7. Adjust the database connection strings in the .configsettings file to the desired connection strings.
  8. Not every cube uses connection strings to two databases, so there is a check whether a DWH_Control connection string is present in the .configsettings file.
  9. Adjust the .deploymentoptions file.
  10. Create the XMLA script with the Analysis Services Deployment utility.
  11. The last step is to deploy the XMLA script to the server with Invoke-ASCmd.

The code

The complete script is written below.

#---------------------------------------------------------------------
# AllCubes.SSASTAB.Dev.Script
#
#---------------------------------------------------------------------
# General variables
$path = "C:\<directory>"
$SSASServer = "localhost"
$DwDBnameDM = "DWH_Datamart"
$DwDBnameCTRL = "DWH_Control"
$DwServerName = "localhost"

# Structure bimname, CubeDB, modelname
$CubeArray = @(
("<filename1>" , "<cubeDB1>" , "<modelname1>"),
("<filename2>" , "<cubeDB2>" , "<modelname2>")
)

cls
Write-Host "------------------------------------"
foreach ($element in $CubeArray) {

$bim = $element[0]
$CubeDB = $element[1]
$CubeModelName = $element[2]

$AsDBpath = "$path\$bim.asdatabase"
$DepTargetpath = "$path\$bim.deploymenttargets"
$ConfigPath = "$path\$bim.configsettings"
$DeployOption = "$path\$bim.deploymentoptions"
$SsasDBconnection = "DataSource=$SsasServer;Timeout=0"
$DwDbDMConnString = "Provider=SQLNCLI11.1;Data Source=$DwServerName;Integrated Security=SSPI;Initial Catalog=$DwDBnameDM"
$DwDbCTRLConnString = "Provider=SQLNCLI11.1;Data Source=$DwServerName;Integrated Security=SSPI;Initial Catalog=$DwDBnameCTRL"
$IsDMConnStrPresent = [bool]0
$IsCTRLConnStrPresent = [bool]0

if (!(Test-Path $AsDBpath)) {
Write-Warning "$AsDBpath absent from location"
Write-Host "------------------------------------"
continue
}

#Adjust .asdatabase file database connectionstring
$json = (Get-Content $AsDBpath -raw) | ConvertFrom-Json
$json.model.dataSources | % {if($_.name -eq 'DWH_DataMart'){$_.connectionString=$DwDbDMConnString ; $IsDMConnStrPresent=[bool]1 }}
$json.model.dataSources | % {if($_.name -eq 'DWH_Control'){$_.connectionString=$DwDbCTRLConnString ; $IsCTRLConnStrPresent=[bool]1 }}
$json | ConvertTo-Json -Depth 10 | set-content $AsDBpath

#Adjust .deploymenttargets file database connectionstring
$xml = [xml](Get-Content $DepTargetpath)
$node = $xml.DeploymentTarget
$node.Database = $CubeDB
$node = $xml.DeploymentTarget
$node.Server = $SsasServer
$node = $xml.DeploymentTarget
$node.ConnectionString = $SsasDBconnection
$xml.Save($DepTargetpath)

# generate .configsettings as this file is not generated with the build.
if (($IsDMConnStrPresent) -and ($IsCTRLConnStrPresent)) {
'<ConfigurationSettings xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ddl2="http://schemas.microsoft.com/analysisservices/2003/engine/2" xmlns:ddl2_2="http://schemas.microsoft.com/analysisservices/2003/engine/2/2" xmlns:ddl100_100="http://schemas.microsoft.com/analysisservices/2008/engine/100/100" xmlns:ddl200="http://schemas.microsoft.com/analysisservices/2010/engine/200" xmlns:ddl200_200="http://schemas.microsoft.com/analysisservices/2010/engine/200/200" xmlns:ddl300="http://schemas.microsoft.com/analysisservices/2011/engine/300" xmlns:ddl300_300="http://schemas.microsoft.com/analysisservices/2011/engine/300/300" xmlns:ddl400="http://schemas.microsoft.com/analysisservices/2012/engine/400" xmlns:ddl400_400="http://schemas.microsoft.com/analysisservices/2012/engine/400/400" xmlns:ddl500="http://schemas.microsoft.com/analysisservices/2013/engine/500" xmlns:ddl500_500="http://schemas.microsoft.com/analysisservices/2013/engine/500/500" xmlns:dwd="http://schemas.microsoft.com/DataWarehouse/Designer/1.0">
<Database>
<DataSources>
<DataSource>
<ID>DWH_DataMart</ID>
<ConnectionString>Provider=SQLNCLI11.1;Data Source=localhost;Integrated Security=SSPI;Initial Catalog=DWH_Datamart</ConnectionString>
<ManagedProvider>
</ManagedProvider>
<ImpersonationInfo>
<ImpersonationMode xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">ImpersonateServiceAccount</ImpersonationMode>
<Account xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
</Account>
<Password xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
</Password>
<ImpersonationInfoSecurity xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">Unchanged</ImpersonationInfoSecurity>
</ImpersonationInfo>
</DataSource>
<DataSource>
<ID>DWH_Control</ID>
<ConnectionString>Provider=SQLNCLI11.1;Data Source=localhost;Integrated Security=SSPI;Initial Catalog=DWH_Control</ConnectionString>
<ManagedProvider>
</ManagedProvider>
<ImpersonationInfo>
<ImpersonationMode xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">ImpersonateServiceAccount</ImpersonationMode>
<Account xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
</Account>
<Password xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
</Password>
<ImpersonationInfoSecurity xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">Unchanged</ImpersonationInfoSecurity>
</ImpersonationInfo>
</DataSource>
</DataSources>
</Database>
</ConfigurationSettings>' | Out-File -FilePath $path\$bim.configsettings
}
else {
'<ConfigurationSettings xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ddl2="http://schemas.microsoft.com/analysisservices/2003/engine/2" xmlns:ddl2_2="http://schemas.microsoft.com/analysisservices/2003/engine/2/2" xmlns:ddl100_100="http://schemas.microsoft.com/analysisservices/2008/engine/100/100" xmlns:ddl200="http://schemas.microsoft.com/analysisservices/2010/engine/200" xmlns:ddl200_200="http://schemas.microsoft.com/analysisservices/2010/engine/200/200" xmlns:ddl300="http://schemas.microsoft.com/analysisservices/2011/engine/300" xmlns:ddl300_300="http://schemas.microsoft.com/analysisservices/2011/engine/300/300" xmlns:ddl400="http://schemas.microsoft.com/analysisservices/2012/engine/400" xmlns:ddl400_400="http://schemas.microsoft.com/analysisservices/2012/engine/400/400" xmlns:ddl500="http://schemas.microsoft.com/analysisservices/2013/engine/500" xmlns:ddl500_500="http://schemas.microsoft.com/analysisservices/2013/engine/500/500" xmlns:dwd="http://schemas.microsoft.com/DataWarehouse/Designer/1.0">
<Database>
<DataSources>
<DataSource>
<ID>DWH_DataMart</ID>
<ConnectionString>Provider=SQLNCLI11.1;Data Source=localhost;Integrated Security=SSPI;Initial Catalog=DWH_Datamart</ConnectionString>
<ManagedProvider>
</ManagedProvider>
<ImpersonationInfo>
<ImpersonationMode xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">ImpersonateServiceAccount</ImpersonationMode>
<Account xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
</Account>
<Password xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
</Password>
<ImpersonationInfoSecurity xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">Unchanged</ImpersonationInfoSecurity>
</ImpersonationInfo>
</DataSource>
</DataSources>
</Database>
</ConfigurationSettings>' | Out-File -FilePath $path\$bim.configsettings
}

#Adjust .configsettings file database connectionstring
$xml = [xml](Get-Content $ConfigPath)
$nodeDM = $xml.ConfigurationSettings.Database.DataSources.DataSource | ? { $_.ID -eq $DwDBnameDM }
$nodeDM.ConnectionString = $DwDbDMConnString
$nodeCTRL = $xml.ConfigurationSettings.Database.DataSources.DataSource | ? { $_.ID -eq $DwDBnameCTRL }

# In case here is not a DWH_Control Connectionstring in the .configsettings file
if (![string]::IsNullOrEmpty($nodeCTRL))
{
$nodeCTRL.ConnectionString = $DwDbCTRLConnString
$xml.Save($ConfigPath)
}

#Adjust .deploymentoptions file database connectionstring
$xml = [xml](Get-Content $DeployOption)
$node = $xml.DeploymentOptions
$node.ProcessingOption = "DoNotProcess"
$xml.Save($DeployOption)

# Create the xmla script with AnalysisServices.Deployment wizard
Write-Host "Deploying Cube : $CubeDB"
cd $path
$exe = "C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Binn\ManagementStudio\Microsoft.AnalysisServices.Deployment.exe"
$param1 = $bim + ".asdatabase"
$param2 = "/s:" + $bim + ".txt"
$param3 = "/o:" + $bim + ".xmla"
$param4 = "/d"
&($exe)($param1)($param2)($param3)($param4)

Write-Host "Importing SQL modules..."
# import modules
if ((Get-Module -ListAvailable | where-object {($_.Name -eq 'SqlServer') -and ($_.Version.Major -gt 20) } |Measure).Count -eq 1){
# implementation of new sql modules migrated into new location
Import-Module SqlServer -DisableNameChecking
}
else{
# fallback for SQLPS
Import-Module SQLPS -DisableNameChecking -Verbose
}

Write-Host "Invoking deployment script... This may take several minutes."
Invoke-ASCmd -Server:$SsasServer -InputFile $path\$bim.xmla | Out-File $path\$bim.xml
Write-Host "Please check $path\$bim.xml as this is output of this deployment"
Write-Host "Done."
Write-Host "------------------------------------"
}



Final thoughts

Although it is quite a script, it is fairly easy to set up and deploy a cube with a PowerShell script. In order to use it in Azure DevOps you have to replace some of the variables with Azure DevOps pipeline variables to make it work as you desire.
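
For example, a minimal sketch of that substitution, where the pipeline variable names SsasServerName and DwServerName are just placeholders you would define yourself in the pipeline:

# Values supplied by an Azure DevOps (release) pipeline.
# $(System.DefaultWorkingDirectory) is a predefined pipeline variable;
# SsasServerName and DwServerName are hypothetical custom variables.
$path = "$(System.DefaultWorkingDirectory)\drop\Cubes"
$SSASServer = "$(SsasServerName)"
$DwServerName = "$(DwServerName)"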

Hennie



          Poster Competition Abstracts
No abstract available
          Internship: Graduation assignment in continuous delivery in Veenendaal
Smells like team spirit

Are you not only busy writing code that matters, but also aware of how impact is often made in a good team with a good atmosphere?
Within a team it is important that the mutual atmosphere is good. This creates commitment and better collaboration, which benefits the quality of the product. With outsourced teams it is sometimes hard to determine how well the team is doing. Due to distance and cultural differences, openness may be lacking. As a result, course corrections come late or not at all, and conflicts can arise within the team.
As a software development company we are constantly looking for ways to automate processes. Here too we ask ourselves whether we can determine the mood within the team based on data from development tools. How does the mood within a team develop, and what effect does it have on the results? Think, for example, of analysis of commit behavior, the texts in commits, pull requests, discussions in user stories, etc.

Assignment

A lot of data is available within our organization from projects carried out in the past. Use this data to research the different kinds of input and tools we can use to determine the mood in a team as accurately as possible. Also investigate whether a prediction can be made of the effect on the team result once the mood in the team threatens to turn. Think, for example, of Azure Cognitive Services, the Text Analytics API, and Machine Learning.
Build a Proof of Concept in which data from various development tools can be retrieved in a generic way. Aggregate and analyze this data to estimate the mood within the team. Use data from historical projects to test the Proof of Concept. ...
          Internship: Graduation assignment in .NET, C#, Big Data, and Machine Learning in Veenendaal
Better energy supply with Distribution Automation Light

Are you interested in energy and Big Data? And do you want to contribute to a more reliable energy grid in the Netherlands?
Through a largely invisible network of cables and pipes, grid operator Enexis supplies 2.7 million households and businesses in the north, east, and south of the Netherlands with electricity and gas.
In early 2015, Enexis decided to start the DALI project (Distribution Automation Light), with the ultimate goal of monitoring all medium-voltage (MV) stations (35,000 of them) by 2020. In this DALI program the MV stations are fitted with an intelligent box for monitoring and control, with which, among other things, public lighting can be controlled, transformer kWmax can be read out, and information from short-circuit indicators can be sent to the grid operations center (BVC).
These developments cause an increase in sensor data. With the right IT facilities, this source of information provides new possibilities to identify interruptions early and to initiate more targeted maintenance based on improved analysis. All this to respond faster to energy interruptions on the one hand, and to prevent them even better on the other.

Assignment

DALI will use software partly developed by us to collect the sensor data. All data is currently stored in an unstructured data store. And that is where you come in!
You will investigate (in consultation with a Business Expert) how the data can be efficiently turned into valuable information.
You will then prove the results of the research by building a Proof of Concept on the Microsoft Azure stack that extracts information from the collected sensor data. Then visualize the information in a dashboard. ...
          AMD EPYC processors officially land on AWS, 10% cheaper than Intel instances
Global cloud leader AWS announced on Tuesday (11/6) that it now offers EC2 instances based on AMD EPYC processors, priced 10% lower than other instances. The AWS alliance lifted AMD's share price 3.92% yesterday, closing at US$20.68. EPYC, launched last year, is an x86 server processor based on AMD's Zen architecture; Microsoft Azure and Baidu's cloud service both deployed EPYC last year, and Oracle Cloud adopted it last month. AWS's addition lets AMD take a further bite out of Intel's data center business.
          (USA-TX-Allen) Software Engineer- Work With Cutting Edge Technology!!
Software Engineer- Work With Cutting Edge Technology!! Software Engineer- Work With Cutting Edge Technology!! - Skills Required - C#/.NET, Visual Studio, PowerShell, Agile/SCRUM Environment, Azure, Microservices, Web API services, System Administration If you are a Software Engineer with experience, please read on! Based in beautiful Allen, TX, we are the world's leading manufacturer of law enforcement video systems, supplying in-car and body worn cameras along with evidence management software to nearly one-third of all law enforcement agencies in the U.S. and Canada. As the industry leader, we has invested over $50 million in research and development through the company's fourteen year history resulting in twenty-six issued or pending patents. We are looking to add a few Software Engineers with strong C#/.NET experience to our growing team. If this sounds like something you'd be interested in, we'd love to tell you more about this opportunity! **What You Will Be Doing** -Influence and create new designs, software, architecture, and methods for deploying large-scale distributed systems -Take part in a 24x7 on-call rotation -Be a contributing member of the teams that define the technical approach, system and software architecture, and API design of our next generation in car and wearable law enforcement video devices and the software that supports them. -Create a technical approach and a design for new features that span the organization. -Explore and define gaps between current platform capabilities, and those needed for new features. -Drive new capabilities and requirements in our platforms to support new product features. -Work with various engineers and architects within the platforms and products teams to improve our adoption of industry best practices for cloud, on premise, and IoT security. -Help define the pipeline and tools necessary to support Full Continuous Delivery **What You Need for this Position** -5+ years of development and/or system administration experience -Experience using a active CD Pipeline to push to production at least once daily -Experience designing and implementing a User Interface and experience with multiple frameworks. -Experience in .NET using C# -Experienced developing web API services -Experienced with Entity Framework, Dapper -Working knowledge of SQL -Visual Studio 2017 -Microsoft Azure -Experience working in an Agile/Scrum environment -Strong hands on experience with PowerShell (Modules, DSC, etc.) -Exceptional public cloud experience with Azure (GOV, Classic vs RM services, etc.) -Strong experience with microservices and containers is a plus -NServiceBus or Service Fabric experience a plus **What's In It for You** - Vacation/PTO - Medical - Dental - Vision - Relocation - Bonus - 401k So, if you are a Software Engineer with experience, please apply today! Applicants must be authorized to work in the U.S. **CyberCoders, Inc is proud to be an Equal Opportunity Employer** All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected veteran status, or any other characteristic protected by law. **Your Right to Work** – In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification document form upon hire. *Software Engineer- Work With Cutting Edge Technology!!* *TX-Allen* *SM7-1493040*
          (USA-MI-Traverse City) Software Engineer - VB.Net, C#, Test Driven Development
Software Engineer - VB.Net, C#, Test Driven Development Software Engineer - VB.Net, C#, Test Driven Development - Skills Required - VB.Net, C#, Test Driven Development, Azure, SQL Server, MySQL, SqlLite, Agile, SCRUM, Devops If you are a Software Engineer with experience, please read on! Located in Traverse City, MI, we are a successful healthcare company that has operated for decades with multiple locations nationwide! Our goal as an organization is to provide the best experience possible in health care services. **What You Will Be Doing** The Software Engineer will be responsible for supporting and enhancing software applications, troubleshoot issues, and oversee the code review process. Will oversee the software documentation phase along with serving as a point person for technical issues. **What You Need for this Position** At Least 3 Years of experience and knowledge of: - VB.Net - C# - Test Driven Development - Azure - SQL Server - MySQL - SqlLite - Agile - SCRUM - DevOps **What's In It for You** - Competitive Base Salary (DOE) - Full Health Benefits - 401(k) - Career Growth and Development - Company Perks - Paid Time off (PTO) So, if you are a Software Engineer with experience, please apply today! Applicants must be authorized to work in the U.S. **CyberCoders, Inc is proud to be an Equal Opportunity Employer** All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected veteran status, or any other characteristic protected by law. **Your Right to Work** – In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification document form upon hire. *Software Engineer - VB.Net, C#, Test Driven Development* *MI-Traverse City* *PP4-1492998*
          (USA-IN-Carmel) Front-End/Mobile Developer - IoT
Front-End/Mobile Developer - IoT Front-End/Mobile Developer - IoT - Skills Required - Mobile Development, IOS, Android, IOT, Bluetooth-low energy, Xamarin, Front-End Development, .Net Stack, Azure Cloud, Angular.js If you are a Front-End/Mobile Developer - IoT with experience, please read on! **Top Reasons to Work with Us** We are a cutting-edge IoT software and services company that provides next-generation solutions that leverage software and cloud-based infrastructure. We deliver and manage enterprise-grade IoT applications around a diverse set of customers, use cases, and industry verticals. **What You Will Be Doing** We are looking for a front-end/mobile developer to spearhead the client-facing components of our IoT applications. Responsibilities: - Develop applications consisting of both web and mobile components - Develop and maintain Azure cloud applications utilizing the full .Net development stack working with the user interface elements of a particular IoT Application. - Documentation and architectural diagrams for team sharing - Work within a highly collaborative team working on cutting edge IoT solutions utilizing modern Agile methodologies as a process framework **What You Need for this Position** MUST HAVE: - BS in graphic arts or related field - 2+ years of Mobile/Front-end application development experience - Mobile development: iOS and/or Android - Xamarin - C#/.NET Stack application development experience - Azure experience - Angular.js NICE TO HAVE: - IoT industry experience - Bluetooth-low energy experience/NFC - Adobe Photoshop - Git - Azure IoT Hub - SQL/NoSQL **What's In It for You** - Competitive Salary DOE - 401(k) with match - Comprehensive Benefits Package - Generous PTO + paid holidays So, if you are a Front-End/Mobile Developer - IoT with experience, please apply today! Applicants must be authorized to work in the U.S. **CyberCoders, Inc is proud to be an Equal Opportunity Employer** All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected veteran status, or any other characteristic protected by law. **Your Right to Work** – In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification document form upon hire. *Front-End/Mobile Developer - IoT* *IN-Carmel* *KH4-1492791*
          (USA-WI-Janesville) ASP.NET Developer
ASP.NET Developer ASP.NET Developer - Skills Required - ASP.NET, C#, Entity, Razor, SQL, HTML5, Azure, AngularJS, JQuery Headquartered in Janesville, WI, we are a growing SaaS based insurance administration platform provider. Offering an excellent career opportunity for the right developers! We are looking for energetic, fun and positive people that are motivated and driven to succeed. We need candidates that thrive in a fast-paced changing environment, people that can pivot as needed. **Top Reasons to Work with Us** We are offering a salary, project completion bonuses, bonus plan, an attractive benefits package and more. **What You Need for this Position** At Least 3 Years of experience and knowledge of: - ASP.NET - C# - Entity or Razor - SQL - JavaScript - JQuery - HTML5 **What's In It for You** - Vacation/PTO - Medical - Dental - Vision - Relocation - Bonus - 401k So, if you are a ASP.NET Developer with experience, please apply today! Applicants must be authorized to work in the U.S. **CyberCoders, Inc is proud to be an Equal Opportunity Employer** All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected veteran status, or any other characteristic protected by law. **Your Right to Work** – In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification document form upon hire. *ASP.NET Developer* *WI-Janesville* *JW1-1492744*
          Microsoft Azure Machine Learning and Project Brainwave – Intel Chip Chat – Episode 610
In this Intel Chip Chat audio podcast with Allyson Klein: In this interview from Microsoft Ignite, Dr. Ted Way, Senior Program Manager for Microsoft, stops by to talk about Microsoft Azure Machine Learning, an end-to-end, enterprise grade data science platform. Microsoft takes a holistic approach to machine learning and artificial intelligence, by developing and deploying [...]
          (USA-IN-South Bend) Senior Systems Engineer - Azure DevOps
Senior Systems Engineer - Azure DevOps Senior Systems Engineer - Azure DevOps - Skills Required - Microsoft Azure, Automation / Configuration / Powershell, C# / .NET, Puppet / Chef / DSC, Microsoft Team Foundation, Team Foundation Build / Team City / Jenkins, Agile / SCRUM / TDD --This is a remote position-- If you are a Azure Cloud Engineer OR a DevOps Engineer with experience on Azure using C#, please read on! We are an innovative company, a leader at what we do and have been around for well over a decade! Join a team of positive, intelligent and dedicated developers using leading technology to produce innovative solutions that save customers thousands of dollars. Be a valued participate in a stimulating and well managed workplace with great perks!! We are looking for an Engineer to manage our Azure infrastructure so that we can continue grow at a rapid pace. The DevOps Engineer will work closely with our development teams to develop new products! As the champion for rapid deployments and product stability, we look to you to help us build, test, and release quality software through automation and performance monitoring utilizing Azure's Platform as a Service technologies. **Top Reasons to Work with Us** Join a team of positive, intelligent and dedicated developers using leading technology to produce innovative solutions saving customers thousands of dollars and untold hours of effort --This is a remote position-- **What You Need for this Position** - Bachelor's degree, preferably in Computer Science or Information Systems - Experience working in an Azure environment, PaaS preferred - Experience with automation/configuration management using PowerShell - Working knowledge of .Net (C# preferred) - Working knowledge of SQL Server Administration including TSQL - Preferred: Experience with a Continuous Integration or build platform such as Team Foundation Build (TFS Build), Team City, Jenkins, Cruise Control, Visual Studio Team Services (Visual Studio Online), Agile/Scrum/XP(Extreme Programming) and TDD (Test driven development) - Exposure to Puppet, Chef or DSC (Desired State Configuration) a plus --This is a REMOTE position-- **What You Will Be Doing** - Develop, deploy, support and maintain global application production environments - Improve performance, lower costs, and deploy products in support of our quickly growing customer base - Design and maintain production monitoring systems - Seek opportunities to streamline standard operating procedures through automation, lower costs and improve performance - Analyze architectural and feature specs and translate them into infrastructure needs **What's In It for You** - Huge growth potential - Competitive base salary - Small team environment - Great benefits --This is a remote position-- So, if you are a Azure Cloud Engineer OR a DevOps Engineer with strong Azure experience and C#.NET, please apply today! Applicants must be authorized to work in the U.S. **CyberCoders, Inc is proud to be an Equal Opportunity Employer** All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected veteran status, or any other characteristic protected by law. **Your Right to Work** – In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification document form upon hire. *Senior Systems Engineer - Azure DevOps* *IN-South Bend* *AW-1492799*
          Graph database implementation with Azure Cosmos DB using the API

In my previous article, I discussed Graph database implementation with SQL Server 2017 at length. This time, we'll walk through Graph API integration with Azure Cosmos DB.

Before we jump into the concepts though, let's take a high-level overview of NoSQL databases. A NoSQL database is designed so that no extra effort is needed to distribute it; distribution is part of the design itself.

Note: In one of my previous articles, I talked about the differences between SQL and NoSQL. I would recommend reading it for a better understanding of NoSQL concepts.

What is Azure Cosmos DB?

Having explained the basic characteristics of a NoSQL database, we can now take a look at what Azure Cosmos DB is all about. In short, it is an extension of Document DB, which has been Microsoft's NoSQL document database running on Microsoft Azure since 2015. Cosmos DB's capabilities started with Document DB, which was already delivering low latency and high availability for schema-free JSON documents.

Azure Cosmos DB is Microsoft's globally distributed, fully managed, multi-model database service, offering global distribution of data, elastic scaling, automatic tuning, and rich querying capabilities, and it also supports the Gremlin standard. You can quickly create and query document, key/value, and graph databases, all of which benefit from the global distribution and horizontal scale capabilities at the core of Azure Cosmos DB. It also provides the ability to use multiple models, like document and graph, over the same data. For example, you can use a document collection to store graph data side by side with documents, and use both SQL queries over JSON and Gremlin queries to query the collection.
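
As a small illustration of that multi-model idea (a sketch, not part of the original walk-through): a vertex in a Gremlin-enabled collection is stored as a JSON document with a label property, so the same data could be reached both ways.

Gremlin traversal over the graph:
g.V().hasLabel('person')

SQL (DocumentDB) query over the same collection:
SELECT * FROM c WHERE c.label = 'person'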

Prerequisites

  • An Azure subscription (if you don't have one, create a free Azure subscription account)
  • Basic knowledge of Graph databases and the Gremlin query language
  • A basic understanding of JSON documents

Getting started

Let’s jump in and get started.

The following step-by-step details give you the information required to understand the design of Azure Cosmos DB. In this section, let us take a look at the ways to create, query, and traverse Graph database models.

To log in to the Azure portal, browse to portal.azure.com and enter the required credentials. To create the Azure Cosmos DB account, on the left pane, click the + New button and type the search string “Cosmos” to look up the Azure Cosmos DB component
Graph database implementation with Azure Cosmos DB using the API
Click the Create button at the bottom of the screen
Graph database implementation with Azure Cosmos DB using the API
In theAzure Cosmos DB New accountform, enter the required configuration for the Azure Cosmos DB account
The supported programming models are Key-Value, Column family, Documents, and Graph. In the New account page, enter the following settings for the new Azure Cosmos DB account:

ID: a unique name. The unique name graphdemo will be used to identify this Azure Cosmos DB account; the URI, graphdemodb.documents.azure.com, becomes its uniquely identifiable address.

API: Gremlin (graph). The Graph API, Gremlin (graph), is selected out of the five-API suite.

Subscription: your subscription. Select the Azure subscription to use for the Azure Cosmos DB account; in this case, a pay-as-you-go subscription is selected.

Resource group: enter the resource group. Enter a new or existing resource group; in this case, a new resource group called graphresource is created.

Location: select the nearest location. Use the closest location, as it gives an optimal performance gain when accessing the data.

Enable geo-redundancy: check/un-check. Creates a replicated version of the database in a second (paired) region.

Pin to dashboard: select. This option adds the database to the dashboard for easy access.

Click Create. The account creation step may take a few minutes.
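If you would rather script this step than click through the portal, the same Gremlin-enabled account can be provisioned programmatically. The following is a minimal sketch using the Azure SDK for Python (the azure-identity and azure-mgmt-cosmosdb packages); the subscription ID and region are placeholders, and field names can shift between SDK versions, so treat it as an outline rather than a drop-in script.

    # pip install azure-identity azure-mgmt-cosmosdb
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.cosmosdb import CosmosDBManagementClient

    client = CosmosDBManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # Create the 'graphdemo' account in the 'graphresource' resource group
    # with the Gremlin (graph) API capability enabled.
    poller = client.database_accounts.begin_create_or_update(
        "graphresource",
        "graphdemo",
        {
            "location": "East US",  # placeholder - use your nearest region
            "database_account_offer_type": "Standard",
            "locations": [{"location_name": "East US", "failover_priority": 0}],
            "capabilities": [{"name": "EnableGremlin"}],
        },
    )
    print(poller.result().document_endpoint)  # blocks until provisioning completes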
Use the Data Explorer tool in the Azure portal to create a Graph database. Select the Quick start option and click Create 'Persons' container.

Follow the steps below to create and query vertices in a Graph database under the 'Persons' container. This can be done in two ways. The first is using the GUI: click the New Vertex button, which opens a form to enter the properties of the vertex. Once done, click the OK button to create the vertex. The interface interprets the entered values as Gremlin commands, as you can see in the message pane at the bottom.
The second is using a Gremlin command. Enter the following command in the Query filter section and click Apply filter to run the Gremlin query:

g.addV('person').property('firstName', 'Prashanth').property('lastName', 'Jayaram').property('age', 35).property('skillset', 'SQL')



You can run the g.V().count() command to count all the vertices.


Next, let's create two more vertices for the input values Brian Lockwood with SQL as his skillset and Samir Behara with .Net as his skillset:

g.addV('person').property('firstName', 'Brian').property('lastName', 'Lockwood').property('age', 50).property('skillset', 'SQL')

g.addV('person').property('firstName', 'Samir').property('lastName', 'Behara').property('age', 35).property('skillset', '.Net')

Now, let's run a simple command to list all the vertices: type g.V() and click Apply filter.
Let's try a simple query to list just one person. You can also view the underlying JSON document by selecting the JSON tab:

g.V().hasLabel('person').has('firstName', 'Prashanth')


To get the count of vertices in the graph, use the simple g.V().count() command.
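All of the commands above run inside the portal's Data Explorer, but the same Gremlin queries can be submitted from application code. Below is a minimal sketch using the open-source gremlinpython driver; the endpoint, database, graph and key are placeholders, and the 'knows' edge is an illustrative addition of mine, since the walkthrough above creates only vertices.

    # pip install gremlinpython
    from gremlin_python.driver import client, serializer

    gremlin_client = client.Client(
        "wss://graphdemo.gremlin.cosmos.azure.com:443/",  # placeholder endpoint
        "g",
        username="/dbs/graphdb/colls/Persons",            # placeholder database/graph
        password="<primary-key>",                         # placeholder account key
        message_serializer=serializer.GraphSONSerializersV2d0(),
    )

    # Relate two of the vertices created earlier with an illustrative edge.
    gremlin_client.submit(
        "g.V().has('firstName','Prashanth')"
        ".addE('knows')"
        ".to(g.V().has('firstName','Samir'))"
    ).all().result()

    # Re-run the count from the article to confirm the three vertices exist.
    print(gremlin_client.submit("g.V().count()").all().result())  # e.g. [3]

    gremlin_client.close()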

          System Cloud Engineer      Cache   Translate Page      
On behalf of Layer, I am immediately looking for a System Cloud Engineer in Amsterdam for my direct end client! This successful cloud organisation currently has a real need for a specialist with strong knowledge of Exchange and Azure who can support the complete IaaS environment. Your work will consist of developing technical solutions that add value to the client's activities...
          Acquired Treasure Chest      Cache   Translate Page      
Begin your internship by acquiring a free pack of goods at the Rewards Claim Agent! The Acquired Treasure Chest will be available starting November 6 until December 6. In this pack (only one per account), adventurers will find:

Items to help with the current Campaign: Wooden Tokens x50, Acorns x50, Swag Bag

Coupons for the Zen Store: Limited Time -10% off Any Pack; Limited Time - Free Stone of Health

Items to aid in increasing the power of your equipment: Refiner's Cache, Blood Ruby, Silvery Enchantment Rank 7, Dark Enchantment Rank 7, Azure Enchantment Rank 7, Radiant Enchantment Rank 7, Preservation Wards x5
          Free instant Approval Dofollow Directory Submission Sites List 2019      Cache   Translate Page      
I am sharing a high PR, PA, DA instant-approval free directory submission list for 2019 with you. With a few simple steps, you can get top one-way backlinks for your websites and blogs through these directory sites - Free Instant Approval High PA, DA Directory Submission Sites List 2019.

colorblossomdirectory.com
darkschemedirectory.com
blackandbluedirectory.com
blackgreendirectory.com
bluebook-directory.com
bluesparkledirectory.com
brownedgedirectory.com
alive2directory.com
arcticdirectory.com
aurora-directory.com
azure-directory.com
bizz-directory.com
gowwwlist.com
johnnylist.org
webguiding.net
onecooldir.com
1directory.org
authorizeddir.com
propellerdir.com
Old and established directories with DA & PA over 40
fire-directory.com
alivelinks.org
asklink.org
businessfreedirectory.biz
targetlink.biz
sublimelink.org
hotlinks.biz
prolink-directory.com
alivelink.org
justdirectory.org
trafficdirectory.org
unique-listing.com
angelsdirectory.com
relevantdirectories.com
efdir.com
ifidir.com
piratedirectory.org
relateddirectory.org
relevantdirectory.biz
populardirectory.biz
directory10.biz
directory4.org
directory6.org
populardirectory.org
royaldirectory.biz
directory8.org
directory10.org
directory9.biz
directory5.org
directory3.org
directorydirect.net
globaldir.org
nicedir.net
smartdir.org
toptendir.net
homedirectory.biz
classdirectory.org
directdirectory.org
harddirectory.net
steeldirectory.net
jet-links.com
ad-links.org
freeweblink.org
ask-dir.org
link-boy.org
free-weblink.com
freeseolink.org
justlink.org
link-man.org
smartseolink.org
www.addgoodsites.com
www.alive-directory.com
www.acedirectory.org
www.bestdirectory4you.com
www.one-sublime-directory.com
www.activdirectory.net
www.a2place.com
www.abstractdirectory.net
www.aweblist.org
www.bedirectory.com
www.adbritedirectory.com
www.hotdirectory.net
www.addirectory.org
www.beegdirectory.com
www.clicksordirectory.com
www.huludirectory.com
www.sublimedir.net
www.poordirectory.com
www.ask-directory.com
www.craigslistdirectory.net
www.upsdirectory.com
www.bing-directory.com
www.interesting-dir.com
www.aquarius-dir.com
www.facebook-list.com
www.ebay-dir.com
www.bestbuydir.com
www.target-directory.com
www.familydir.com
www.afunnydir.com
www.backpagedir.com
www.exampledir.com
www.lemon-directory.com
www.seooptimizationdirectory.com
www.domainnamesseo.com
www.craigslistdir.org
www.searchdomainhere.com
www.mediafiredirectlink.com
www.directoryanalytic.com
www.linkedin-directory.com
www.ecodir.net/
www.advancedseodirectory.com
www.apeopledirectory.com
www.businessfreedirectory.com
www.411freedirectory.com
www.reddit-directory.com

          Azure Developer - Technologie Delan - Montréal, QC      Cache   Translate Page
The Azure Developer will be responsible for participating in the design, deployment and continuous improvement of the company's applications PLUS...
From Technologie Delan - Tue, 09 Oct 2018 17:09:05 GMT - View all Montréal, QC jobs
          Big Data Developer - EXIA - Montréal, QC      Cache   Translate Page
Experience with Microsoft Azure / HD Insight (Hortonworks); you are passionate about Big Data and familiar with the Pig and Hive ecosystems...
From EXIA - Tue, 18 Sep 2018 11:29:35 GMT - View all Montréal, QC jobs
          Test-Retest Variability in Structural and Functional Parameters of Glaucoma Damage in the Glaucoma Imaging Longitudinal Study      Cache   Translate Page      
Purpose To determine the test-retest variability in perimetric, optic disc, and macular thickness parameters in a cohort of treated patients with established glaucoma. Patients and Methods In this cohort study, the authors analyzed the imaging studies and visual field tests at the baseline and 6-month visits of 162 eyes of 162 participants in the Glaucoma Imaging Longitudinal Study (GILS). They assessed the difference, expressed as the standard error of measurement, of Humphrey field analyzer II (HFA) Swedish Interactive Threshold Algorithm fast, Heidelberg retinal tomograph (HRT) II, and retinal thickness analyzer (RTA) parameters between the two visits and assumed that this difference was due to measurement variability, not pathologic change. A statistically significant change was defined as twice the standard error of measurement. Results In this cohort of treated glaucoma patients, it was found that statistically significant changes were 3.2 dB for mean deviation (MD), 2.2 for pattern standard deviation (PSD), 0.12 for cup shape measure, 0.26 mm2 for rim area, and 32.8 μm and 31.8 μm for superior and inferior macular thickness, respectively. On the basis of these values, it was estimated that the number of potential progression events detectable in this cohort by the parameters of MD, PSD, cup shape measure, rim area, superior macular thickness, and inferior macular thickness was 7.5, 6.0, 2.3, 5.7, 3.1, and 3.4, respectively. Conclusions The variability of the measurements of MD, PSD, and rim area, relative to the range of possible values, is less than the variability of cup shape measure or macular thickness measurements. Therefore, the former measurements may be more useful global measurements for assessing progressive glaucoma damage.
          Parallels выпустила новую версию Remote Application Server 16.5.1      Cache   Translate Page      
Parallels has released the latest version of Parallels Remote Application Server 16.5.1, positioned as a cost-effective solution for application delivery and VDI. Among the new features of Parallels RAS 16.5.1 are a migration wizard for Citrix XenApp 6.x, automatic provisioning and scaling of RDSH hosts, display of password complexity requirements, Azure multi-factor (MFA) and two-factor (2FA) authentication, publishing of Microsoft App-V applications, and VDI improvements. In addition, the developer has noticeably improved the Parallels RAS console user interface, making it possible to view real-time performance metrics on a dashboard in the site view, added more flexible configuration of client devices through RAS policies (20 categories instead of 4), and added a way to limit the maximum allowed number of concurrent users. Parallels offers a 30-day, full-featured trial version of RAS 16.5.1 for 50 connections. Parallels Remote Application Server is sold by subscription and licensed by the number of concurrent connections. One connection costs 6,122.5 rubles per year for new users, or 3,061.6 rubles per year when migrating from similar solutions. In addition, managed service providers (MSPs) and independent software vendors (ISVs) specializing in application delivery and VDI services can gain extra benefits by joining the Parallels partner program. Earlier, the THG.ru editorial team published an article comparing Parallels RAS with alternative technologies, in which we compare the Parallels RAS application and virtual desktop delivery tool with alternatives. Read more in the article "Comparing Parallels RAS with alternative technologies".
          Faith Community Nursing: Supporting Healthy People 2020 Initiatives      Cache   Translate Page      
No abstract available
          Azure Big Data Engineer - Accenture - Milwaukee, WI      Cache   Translate Page      
Minimum 6 months of designing, building and operationalizing large scale enterprise data solutions and applications using one or more of AZURE data and analytics...
From Indeed - Thu, 25 Oct 2018 11:15:29 GMT - View all Milwaukee, WI jobs
          Principal Data & Applied Scientist - Microsoft - Redmond, WA      Cache   Translate Page      
Azure SQL DW, Redshift, Snowflake. Join the Microsoft Identity division and use petabytes of data to drive data-driven strategic decisions that directly impact...
From Microsoft - Sat, 13 Oct 2018 02:49:40 GMT - View all Redmond, WA jobs
          Technical Specilist - Brillio - Redmond, WA      Cache   Translate Page      
Strong experience in the Azure ecosystem, such as HDInsight, Azure Data Factory, Azure Data Lake and SQL DW. The job description would be as follows:....
From Brillio - Tue, 25 Sep 2018 23:59:03 GMT - View all Redmond, WA jobs
          Senior Software Engineer - Microsoft - Redmond, WA      Cache   Translate Page      
Azure SQL DB and Azure SQL DW are growing at massive scales with many key enterprise customers already onboarded to both....
From Microsoft - Fri, 24 Aug 2018 20:23:36 GMT - View all Redmond, WA jobs
          MS Azure Cloud Engineer      Cache   Translate Page      
VA-Arlington, Kavaliro is currently seeking a Cloud Engineer with a strong focus on MS Azure. The selected candidate will be involved with transforming development and testing environments from on-premises to cloud-based. The candidate will work closely with stakeholders to gather technical requirements, architect solutions and execute on deliverables. Experience requirements: • 5 or more years of experience do
          Architect - Microsoft - Redmond, WA      Cache   Translate Page      
Preferred experience with at least one of the Machine Learning related technologies (SAS, SPSS, RevR, Azure ML, MapR)....
From Microsoft - Thu, 23 Aug 2018 08:25:24 GMT - View all Redmond, WA jobs
          #260 : Devcon4 Recap - From HODL to BUIDL      Cache   Translate Page      

It's that time of year again, Devcon time! Just as the conference was wrapping up, we sat down at the Prague Congress Center for a Devcon4 recap. Join Brian, Friederike, Sebastien, and a surprise guest host as we share our thoughts on the event, and how the Ethereum ecosystem has evolved since we last met in Cancun.

Topics discussed in this episode:

  • Devcon4 compared to Devcon3
  • Changes to the Ethereum Foundation management
  • Serenity (Ethereum 2.0) roadmap
  • The shift from HODL to BUIDL
  • The apparent departure of ICOs from the landscape
  • The growth of grant programs
  • Brian's talk at the Generalized Mining panel
  • This year's emphasis on security
  • Web3 Summit in Berlin the previous week

Links mentioned in this episode:

Sponsors:

  • Toptal: Simplify your hiring process & access the best blockchain talent - Get a $1,000 credit on your first hire
  • Azure: Deploy enterprise-ready consortium blockchain networks that scale in just a few clicks

Support the show, consider donating:

This episode is also available on:

Watch or listen, Epicenter is available wherever you get your podcasts.

Epicenter is hosted by Brian Fabian Crain, Sébastien Couture, Meher Roy & Sunny Aggarwal.


          Offer - Best Project Oriented Video Training On MS SQL DBA - AUSTRALIA      Cache   Translate Page      
SQL School is one of the best training institutes for Microsoft SQL Server Developer Training, SQL DBA Training, MSBI Training, Power BI Training, Azure Training, Data Science Training, Python Training, Hadoop Training, Tableau Training, Machine Learning Training, Oracle PL SQL Training. We have been providing Classroom Training, Live-Online Training, On Demand Video Training and Corporate trainings. All our training sessions are COMPLETELY PRACTICAL. SQL SERVER DBA Video Training Course Details: Real time training on SQL Server 2016 & 2017 DB Design and T-SQL. This training course is exclusively designed addressing all practical aspects of SQL Server fundamentals required for SQL DBA and Business Intelligence (MSBI) implementations. Material provided during the course. All Sessions are Completely Practical and Realtime. For free SQL Server DBA Video Demo, please visit : http://sqlschool.com/SQLDBA-Video-Training.html Schedules for PRACTICAL SQL 2016 & 2017 DBA Video TRAINING : http://sqlschool.com/Register.html Contact us today (24 x 7) for SQL DBA Practical Video Training SQL School Training Institute ISO 9001:2008 Certified Organization for Training Authorized Microsoft Partner (ID# 5108842) India: Mobile: +91 (0) 9666 44 0801 Mobile: +91 (0) 9666 64 0801 USA: Office: +1 (510) 400-4845 Office 1: #101, UMA Residency, Opp: Sindhu Travels, Beside Metro Station Gate #D, SR Nagar, Hyderabad - 38, India. Office 2: #202, Sai Anu Avenue, Street #3, Patrika Nagar, HITECH City, Hyderabad -81, India. . Website: http://sqlschool.com/ Follow us: https://www.facebook.com/sequelschool https://www.linkedin.com/company/sql-school https://twitter.com/sequelschool
          Microsoft 'Solidifies' Azure Cloud For Drones      Cache   Translate Page      
Internet of Things devices & drones that can make real-time context-aware decisions represent what we call 'the intelligent edge' -- and this will define the next wave of innovation, not just for business, but also how we address some of the world’s most pressing issues.
          Methicillin-Resistant Staphylococcus aureus Parotitis Leading to Mortality in an Adolescent Male      Cache   Translate Page      
A case of toxic shock syndrome associated with methicillin-resistant Staphylococcus aureus parotitis in a 13-year-old male is presented. He was initially diagnosed with left-sided parotitis by his primary care physician, was started on sulfamethoxazole/trimethoprim, and became severely ill the following day. He was transported to the hospital after a syncopal episode at home and was found to have altered mental status, hypotension, and hypoxia. He was transferred to a larger care facility and died en route despite aggressive resuscitation. At autopsy, he was found to have a severe left-sided parotitis, severe pulmonary congestion, edema, and pneumonia, as well as bilateral lower limb hemorrhagic lesions. Blood cultures from the time of admission and at autopsy grew methicillin-resistant Staphylococcus aureus, which is rarely reported as the sole cause of parotitis. In addition, although S. aureus bacteremia is not necessarily a rare complication of a parotid gland infection, it is exceedingly rare in an immunocompetent adolescent.
          Développeur Azure - Technologie Delan - Montréal, QC      Cache   Translate Page      
Le Développeur Azure sera responsable de participer à la conception, au déploiement et l’amélioration continue des applications de l’entreprise PLUS...
From Technologie Delan - Tue, 09 Oct 2018 17:09:05 GMT - View all Montréal, QC jobs
          Développeur Big Data - EXIA - Montréal, QC      Cache   Translate Page      
Posséder de l’expérience sur Microsoft Azure / HD Insight (Hortonworks) ; Vous êtes passionné par le Big Data, vous êtes familier avec les écosystèmes Pig, Hive...
From EXIA - Tue, 18 Sep 2018 11:29:35 GMT - View all Montréal, QC jobs
          Survival of Women With Type I and II Epithelial Ovarian Cancer Detected by Ultrasound Screening      Cache   Translate Page      
OBJECTIVE: To estimate the effect of ultrasound screening on stage at detection and long-term disease-specific survival of at-risk women with epithelial ovarian cancer. METHODS: Eligibility included all asymptomatic women 50 years of age or older and women 25 years of age or older with a documented family history of ovarian cancer. From 1987 to 2017, 46,101 women received annual ultrasound screening in a prospective cohort trial. Women with a persisting abnormal screen underwent tumor morphology indexing, serum biomarker analysis, and surgery. RESULTS: Seventy-one invasive epithelial ovarian cancers and 17 epithelial ovarian tumors of low malignant potential were detected. No women with a low malignant potential tumor experienced recurrent disease. Stage distribution for screen-detected invasive epithelial ovarian cancers was stage I—30 (42%), stage II—15 (21%), stage III—26 (37%), and stage IV—0 (0%). Follow-up varied from 9.2 months to 27 years (mean 7.9 years). Disease-specific survival at 5, 10, and 20 years for women with invasive epithelial ovarian cancer detected by screening was 86±4%, 68±7%, and 65±7%, respectively, vs 45±2%, 31±2%, and 19±3%, respectively, for unscreened women with clinically detected ovarian cancer from the same geographic area who were treated at the same institution by the same treatment protocols (P<.001). Twenty-seven percent of screen-detected malignancies were type I and 73% were type II. The disease-specific survival of women with type I and type II screen-detected tumors was significantly higher than that of women with clinically detected type I and type II tumors and was related directly to earlier stage at detection. CONCLUSION: Annual ultrasound screening of at-risk asymptomatic women was associated with lower stage at detection and increased 5-, 10-, and 20-year disease-specific survival of women with both type I and type II epithelial ovarian cancer. CLINICAL TRIAL REGISTRATION: OnCore Clinical Trials Management System, NCI-2013-01954.
          Maternal Outcomes Associated With Lower Range Stage 1 Hypertension      Cache   Translate Page      
OBJECTIVE: To evaluate maternal and neonatal outcomes in healthy, nulliparous women classified with stage 1 hypertension under the revised American College of Cardiology and American Heart Association Guidelines and to evaluate the effects of low-dose aspirin on maternal and neonatal outcomes in this population. METHODS: We conducted a secondary analysis of data from a multicenter randomized, double-blind, placebo-controlled trial of low-dose aspirin for prevention of preeclampsia in nulliparous, low-risk women recruited between 13 and 25 weeks of gestation. Of the 3,134 nulliparous women enrolled in the original study, 2,947 women with singleton pregnancies and without missing data were included in this analysis. Blood pressure was measured at enrollment between 13 and 25 weeks of gestation and outcomes were adjudicated from the medical record. RESULTS: One hundred sixty-four participants were identified with lower range stage 1 hypertension (5.6%), systolic blood pressure 130–135 mm Hg, diastolic blood pressure 80–85 mm Hg, or both by the new American College of Cardiology–American Heart Association guidelines. Within the placebo group (n=1,482), women with stage 1 hypertension had a significantly increased incidence of preeclampsia compared with normotensive women, 15.3% (15/98) vs 5.4% (75/1,384) (relative risk 2.66, 95% CI 1.56–4.54, P<.001). Moreover, women with stage 1 hypertension had an increased incidence of gestational diabetes mellitus (6.1% vs 2.5%, P=.03) and more indicated preterm deliveries (4.2% vs 1.1%, P=.01). Comparing women with stage 1 hypertension and normotensive women receiving low-dose aspirin during pregnancy (n=1,465), no differences in rates of preeclampsia (7.6% vs 4.4%, respectively, P=.2), gestational diabetes mellitus, or indicated preterm deliveries were observed. Rates of placenta abruption, small for gestational age, and spontaneous preterm birth did not differ significantly between groups. CONCLUSION: Application of the new American College of Cardiology–American Heart Association guidelines in a pregnant population identifies a cohort of women who are at increased risk for preeclampsia, gestational diabetes mellitus, and preterm birth.
          Outcomes of Subsequent Pregnancies After Uterine Compression Sutures for Postpartum Hemorrhage      Cache   Translate Page      
OBJECTIVE: To estimate the association between uterine compression sutures for postpartum hemorrhage and subsequent pregnancy outcomes. METHODS: We reviewed the medical records of 336 women who received uterine compression sutures to control postpartum hemorrhage during their first delivery at a single medical center between 2006 and 2011. Of these, 42 women who became pregnant again and received care through our hospital were included in this study. One hundred thirty-nine pregnant women matched for age and parity who did not receive uterine compression sutures during a previous cesarean delivery served as the control group. We compared subsequent pregnancy outcomes and operative findings during repeat cesarean delivery between the two groups. RESULTS: There were four (9.5%) miscarriages and one (2.4%) tubal pregnancy in the compression suture group compared with 14 (10.1%) miscarriages and two (1.5%) tubal pregnancies in the control group (P=.92 and P=.68, respectively). In the compression suture group, 34 (81.0%) women delivered at term and two (4.7%) women had preterm deliveries. In the control group, 114 (82.0%) women delivered at term and seven (5.0%) women had preterm deliveries (P=.88 and P=.60, respectively). The rate of pelvic adhesions on repeat cesarean delivery was significantly higher in the compression suture group than in the control group (34.3% compared with 17.5%, P=.03). CONCLUSIONS: Subsequent pregnancy outcomes were similar for women who did and those who did not receive uterine compression sutures during their prior delivery, whereas uterine adhesions at repeat cesarean delivery were more prevalent in women who received uterine compression sutures. LEVEL OF EVIDENCE: II
          Urgent Challenges for Local Public Health Informatics      Cache   Translate Page      
No abstract available
          Service Delivery Manager Automation & Release - Novartis - Hyderabad, Telangana      Cache   Translate Page      
Experienced in private cloud infrastructure (including VMWare), Amazon Web Services and Microsoft Azure. Represent IT to the business (or in the case of central...
From Novartis - Wed, 24 Oct 2018 14:34:47 GMT - View all Hyderabad, Telangana jobs
          Full Stack Developer - Neo-Hire - Hyderabad, Telangana      Cache   Translate Page      
Microsoft Azure, Amazon AWS. A Full Stack developer is responsible for front and back-end web development....
From Indeed - Tue, 06 Nov 2018 10:31:55 GMT - View all Hyderabad, Telangana jobs
          Garmin Vivosmart 4 Smart Activity Tracker Watch - 2019      Cache   Translate Page      
Garmin Vivosmart 4 Smart Activity Tracker Watch

Fits wrists with a circumference of 122-188 mm

Track Your Wellness and Fitness Activity with Style

  • Slim, smart activity tracker¹ blends fashionable design with stylish metal accents and a bright, easy-to-read display
  • Includes advanced sleep..

    Price: $129.95


          Premium blob storage      Cache   Translate Page      
As a follow-up to my blog Azure Archive Blob Storage, Microsoft has released another storage tier called Azure Premium Blob Storage (announcement).  It is in private preview in US East 2, US Central and US West regions. This is a performance … Continue reading
          Remote Azure Cloud Solutions Architect      Cache   Translate Page      
An experienced Microsoft software and services corporation has an open position for a Remote Azure Cloud Solutions Architect.

Core responsibilities include: leading requirements gathering, analysis and solution development; deploying complex application workloads on MS Azure; architecting backup and disaster recovery solutions on MS Azure.

Required skills: 10+ years overall industry experience; 5+ years of experience with demonstrated Cloud architecture projects; 5+ years of experience with Cloud solutions, platforms, and technologies; 3+ years of experience across the business development lifecycle; extensive working experience with Microsoft technologies.
          Download 2018 Latest AZ-200 Dumps - Exam Questions PDF      Cache   Translate Page      
By taragill    In Education    11 minutes ago
Are you going to appear in the Microsoft Azure Developer Core Solutions certification exam this year? As an IT student, you should prepare yourself with authentic study material, because it is necessary to pass the AZ-200 exam on the first attempt. Download real Microsoft AZ-200 dumps, which are based on actual AZ-200 exam practice questions. To download the AZ-200 braindumps PDF, visit https://www.braindumpspdf.com/exam/Az-200.html
Tags: az-200, exam, pdf, dumps

          Rare Craniofacial Clefts: A Surgical Classification      Cache   Translate Page      
The Tessier classification is the current standard for identifying and reporting rare craniofacial clefts. This numerically based system describes 16 different primary clefts, with additional possible combinations that can significantly raise the total number of potentially describable clefts. Problems with this system include a complexity that requires most surgeons to consult a diagram to describe the location of a cleft. In addition, the Tessier classification can include conditions that may not actually involve a true cleft such as frontonasal dysplasia, Treacher Collins syndrome, and craniofacial microsomia. A surgically based classification is proposed that includes only true clefts (eliminating hyperplasias, hypoplasias, and aplasias) and classifies these rare anomalies into 1 of 4 types based on anatomic regions: midline, median, orbital, and lateral. This simplified classification for craniofacial clefts, which is based on a different surgical paradigm appropriate to each regional location, enables surgeons to describe an observed cleft in such a way that others can easily visualize the location and have a starting point for formulating treatment decisions.
          VMware acquires startup Heptio, bringing a Kubernetes project co-founder on board      Cache   Translate Page
Virtualization leader VMware's formal bet on Kubernetes can be dated from VMworld, when it teamed up with Google and Pivotal to launch PKS (Pivotal Container Service); in June of this year it also launched its own Kubernetes engine, VMware Kubernetes Engine (VKE), supporting AWS first, with plans to expand to Azure and other public clouds.
          DevOps Engineer - Amazon Web Services, Cloud, Azure      Cache   Translate Page      
OH-Columbus, If you are a DevOps Engineer with experience, please read on! Located in Columbus, OH, we are a privately held start-up company in the healthcare industry with a dynamic culture! We have operated for close to a decade and have been recognized as one of the top start-ups in the area. Our goal is to give healthcare partners access to the best possible product and technology on the market. What You W
          2017 ASTHO President's Challenge: Public Health Approaches to Preventing Substance Misuse and Addiction      Cache   Translate Page      
No abstract available
          Azure, Phil D.      Cache   Translate Page      
PHIL D. AZURE February 5, 1925 - November 2, 2018 Phil D. Azure passed away November 2, 2018. Phil was born at Beauchamp, Montana...
          Real Time Live Online Training On SSRS @ SQL School - Delhi, India      Cache   Translate Page      
SQL School is one of the best training institutes for Microsoft SQL Server Developer Training, SQL DBA Training, MSBI Training, Power BI Training, Azure Training, Data Science Training, Python Training, Hadoop Training, Tableau Training, Machine Learni...
          8 great tools that make Docker better      Cache   Translate Page      

Blink and you might miss some of the most interesting developments around Docker these days. Aside from progress on Docker itself, many other useful projects have been built on top of Docker or been empowered by Docker. In many cases, these tools take advantage of workflow techniques and deployment strategies that Docker makes possible.

Here are eight open-source creations that get a boost from Docker or give Docker a boost, leveraging Docker for specific use cases or making Docker easier to work with.

Dusty

A Docker-powered, MIT-licensed development environment, Dusty is intended to improve on the use of Docker Compose or Vagrant for managing containers. The developers behind Dusty claim, for example, that Dusty has a simpler specs model than Docker Compose, and that it handles version-based isolation of app dependencies and updates of services better than Vagrant. Dusty also allows tests to be created as part of a spec for an environment, and makes it possible for common multi-step procedures to be made into an easily invoked script.

          Best Project Oriented Video Training On MS SQL DBA - Ahmedabad, India      Cache   Translate Page      
SQL School is one of the best training institutes for Microsoft SQL Server Developer Training, SQL DBA Training, MSBI Training, Power BI Training, Azure Training, Data Science Training, Python Training, Hadoop Training, Tableau Training, Machine Learn...
          QA Analyst II - 262611 - Procom - Regina, SK      Cache   Translate Page      
3-5 years' experience developing and running test scripts using tool sets like Microsoft Test or Visual Studio or Azure DevOps Test Hub (i.e....
From Procom - Mon, 22 Oct 2018 15:13:32 GMT - View all Regina, SK jobs
          Moovit, Microsoft Partner For Azure Maps Data      Cache   Translate Page      
The Israeli-founded transit data company will integrate its transportation platform into Microsoft's Azure Maps.
          Microsoft AZ-100 Exam Dumps PDF Questions Answers      Cache   Translate Page      
By Smithlee00    In Education    1 hour ago
Microsoft Azure Infrastructure and Deployment, also known as the AZ-100 exam, is a Microsoft certification. Braindumpskey's reliable Microsoft AZ-100 exam questions and answers are developed in accordance with the latest syllabus, and we regularly upgrade our study materials. Our exam training materials simulate the practical exam, so the passing rate through the braindumpskey site is very high. We can help you achieve your goals: just get the Microsoft Azure Infrastructure and Deployment AZ-100 PDF exam questions and answers, take the test, and you can pass the Microsoft AZ-100 certification exam successfully. To get the AZ-100 exam dumps PDF, visit: https://www.braindumpskey.com/exam/AZ-100.html
Tags: az-100, dumps, pdf, braindumps, questions, exam, study, guide, microsoft, preparation, material, and, answers, real, question

          Engineer, R & D (Software) - Venture International Pte Ltd - Ang Mo Kio      Cache   Translate Page      
I.e. AWS, Azure or Google Cloud. Web APIs such as google map API etc. Design and develop mobile, PC applications (may include full stack) for use as companion...
From Venture Corporation Limited - Fri, 13 Jul 2018 11:23:30 GMT - View all Ang Mo Kio jobs
          Software Developer - Jobline Resources Pte Ltd - Paya Lebar      Cache   Translate Page      
Familiar with cloud environments – AWS, Azure, Google Cloud – and the principles of, and execution in, DevOps. The development environment consists of AWS Cloud and...
From Jobline Resources Pte Ltd - Fri, 02 Nov 2018 09:24:59 GMT - View all Paya Lebar jobs
          Challenges and Solutions for Lumbar Total Disc Replacement Implantation      Cache   Translate Page      
Long-term data are now available to support the safety and efficacy of lumbar total disc replacement (TDR). Five-year randomized and controlled trials, meta-analyses, and observational studies support a similar or lower risk of complications with lumbar TDR compared with fusion. The panel concluded that published data on commercially available lumbar TDR devices demonstrate minimal concerns with late-onset complications, and that the risk of adjacent segment degeneration and reoperations can be reduced with lumbar TDR versus fusion. Survey results of surgeon practice experiences supported the evidence, revealing a low rate of complications with TDR. Panelists acknowledged the importance of adhering to selection criteria to help minimize patient complications.
          Import flat files to SQL Server on Linux using Azure Data Studio      Cache   Translate Page      

Have you ever wondered how to import flat files to SQL Server on Linux? In this article, I will share my experience loading data into a Linux SQL Server instance using the new Azure Data Studio data import extension from macOS.
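The article walks through the graphical Import Wizard; purely as a point of comparison, the same flat-file load can be scripted. Here is a rough sketch using pandas and SQLAlchemy, where the host, credentials, file and table names are all placeholders, and pandas' type inference stands in for the wizard's:

    # pip install pandas sqlalchemy pyodbc
    import pandas as pd
    from sqlalchemy import create_engine

    # Placeholder connection string for a SQL Server on Linux instance.
    engine = create_engine(
        "mssql+pyodbc://sa:<password>@linux-host:1433/StagingDB"
        "?driver=ODBC+Driver+17+for+SQL+Server"
    )

    # Read the flat file; pandas infers column types from the data.
    df = pd.read_csv("people.csv")

    # Create (or replace) the target table and insert the rows.
    df.to_sql("People", engine, if_exists="replace", index=False)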


          Azure and Windows PowerShell: Using VM Extensions      Cache   Translate Page      

In the third part of his series, Nicolas Prigent describes how to run post-deployment configuration and automation tasks on Azure Virtual Machines. Nicolas explains how to use Azure VM Extensions using the Azure PowerShell module to save time during the provisioning process.
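The article itself uses the Azure PowerShell module; for readers on other stacks, the same idea, attaching a Custom Script extension that runs a post-deployment command, looks roughly like this with the Azure SDK for Python. All resource names, the region and the command are placeholders, and model field names can differ between SDK versions:

    # pip install azure-identity azure-mgmt-compute
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.compute import ComputeManagementClient

    client = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # Attach a Custom Script extension that runs one command after deployment.
    poller = client.virtual_machine_extensions.begin_create_or_update(
        "my-resource-group",            # placeholder resource group
        "my-vm",                        # placeholder VM name
        "postDeployScript",             # extension name
        {
            "location": "westeurope",   # placeholder region
            "publisher": "Microsoft.Azure.Extensions",
            "type_properties_type": "CustomScript",  # Linux Custom Script v2
            "type_handler_version": "2.1",
            "settings": {"commandToExecute": "echo post-deployment step"},
        },
    )
    print(poller.result().provisioning_state)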


          Atmosera CEO Jon Thomsen, CISO Sean Ventura to Speak at Cloud Expo      Cache   Translate Page      

Presentations to cover simplifying migration to Azure, advantages of a risk-sharing model

(PRWeb November 07, 2018)

Read the full story at https://www.prweb.com/releases/atmosera_ceo_jon_thomsen_ciso_sean_ventura_to_speak_at_cloud_expo/prweb15889325.htm


          Nadella meets Chinese developers to advance Microsoft's intelligent cloud ecosystem      Cache   Translate Page

Microsoft CEO Satya Nadella recently met with developer representatives in China to further promote an open ecosystem for Microsoft's cloud computing and artificial intelligence. On November 6, Nadella held in-depth discussions in Beijing with five groups of developer representatives, including people from Malong Technologies, Caicloud, the OSChina open-source community and Kyligence, on topics such as AI and the open-source ecosystem. Nadella said that China's real economy presents enormous opportunities for digital transformation, and that AI developers play a key role in it. Through industry-leading cloud computing and AI services, Microsoft hopes to empower Chinese developers and partners to build their own AI applications, unleash China's ingenuity and discover China's opportunities.

Representing AI developers, Malong Technologies co-founder Huang Dinglong demonstrated the company's latest AI product-recognition solution on site and shared the potential opportunities in combining AI with China's vertical industries. He said AI can greatly help the real economy transform and upgrade and can raise productivity; Malong is a committed user of Microsoft's intelligent cloud Azure and hopes to leverage the Azure platform's global advantages to expand its international business and, going deep into individual industries, build more efficient AI solutions for more enterprises.

Representing cloud-native developers, Caicloud founder Zhang Xin talked with Nadella about how to achieve the "last-mile delivery" of AI, how to bring AI application scenarios with Chinese characteristics to the world, and how to empower enterprise developers through an open AI platform so that "AI flies into thousands of households." He said that, building on the Azure intelligent cloud and AI technology, Caicloud has helped multinational customers successfully build O2O services on hybrid cloud and used containerized AI technology to help them process large volumes of orders quickly. In response, Nadella said Microsoft would make full use of its platform advantages, bring more of Azure's global features and services to China, and help developers take Chinese user cases to overseas markets.

Microsoft's cloud computing service Azure has now operated in the Chinese market for four years, almost exactly as long as Nadella has been Microsoft's CEO; before that, Nadella was the executive in charge of Azure. That a traditional software giant recognized the value of cloud computing and fought its way to a prominent position in that market is inseparable from Nadella's push. He correctly recognized the urgency and importance of developing cloud computing, broke with Microsoft tradition, increased Azure's support for open source, and pushed hard for Azure's entry into China; these convictions and actions have made cloud computing Microsoft's fastest-growing business for several consecutive years.

As the first international public cloud service to reach commercial operation in China, Azure has posted triple-digit growth there for four consecutive years and now has more than 1,300 solution partners and over 100,000 active developers. In 2018 Microsoft brought the Azure Stack hybrid cloud solution to commercial availability in China and tripled Azure's cloud capacity there; it also plans to add two more Azure regions and two data centers in China, expanding its global cloud footprint to 54 regions, more than any other public cloud provider. To date, over 95% of the global Fortune 500 use Microsoft cloud services.

Of course, even more important in Azure's four years in China are the Azure Cognitive Services offered since 2016, commonly referred to in China as AI services. The backdrop was the rising wave of AI research and application, with startups large and small springing up everywhere. Microsoft seized the moment to launch Cognitive Services in Azure, comprising 24 capabilities including speech, vision, language and machine translation, delivered as APIs to give developers an AI development platform. Azure Cognitive Services now has more than one million developers worldwide, over 100,000 of them in China. In its speed of reaction and capacity to act on a brand-new trend, this is a refreshingly different Microsoft.

Amid the great wave of AI development, whether the cloud computing of the future will still match the old concept of cloud computing is a profound question. Microsoft's chosen direction is to combine cloud computing with AI, a philosophy embodied in "intelligent cloud and intelligent edge": an AI cloud at the core, working seamlessly with the intelligent edge around it, with the intelligent cloud bringing together all of Microsoft's accumulated AI technology and achievements. Today's Azure includes the Azure public cloud, the Azure Stack hybrid cloud, Azure IoT Edge for the Internet of Things, and Azure Sphere for security protection; it can be said to cover every domain of the intelligent cloud and to integrate Microsoft's main technical capabilities.

In addition, at the 2018 Microsoft Tech Summit held recently in Shanghai, Microsoft announced that the Dynamics 365 enterprise application cloud platform will land in China in the spring of 2019. At that point the "troika" of Microsoft's intelligent cloud, the Microsoft Azure cloud computing platform, the Office 365 cloud productivity platform and the Dynamics 365 enterprise application cloud platform, will all be present in the Chinese market, covering the needs of individuals, enterprises and developers alike. Microsoft recognizes that China is in the midst of rapid digital transformation and that demand for the intelligent cloud is about to surge; it has therefore invested heavily in the Chinese market and works closely with local operators to deliver services, intending to provide powerful support for digital transformation.

The intelligent cloud Microsoft advocates is open: the platform is open, development tools are actively provided, and it even supports third-party deep learning frameworks such as TensorFlow and CNTK. At its 2018 developer conference, Microsoft also launched an open preview of ML.NET, a cross-platform, open-source machine learning framework. ML.NET lets any developer build custom machine learning models and embed them in their own applications, with no prior experience in developing or debugging machine learning models required. All of these moves to dramatically lower the barriers to AI development and application demonstrate Microsoft's firm determination in artificial intelligence.

Building a platform with deep resources and technical strength, then making it ever more open to attract developers, pooling intelligence and innovation to further sharpen the platform's competitiveness: this is the established path of all platform building. Developers are a platform's engine of innovation, and when it comes to attracting developers, Microsoft holds very strong advantages in China. Microsoft has been in China for 26 years, and Microsoft Research Asia celebrated its 20th birthday this year; long experience of operating in the Chinese market has given it deep accumulated expertise, and its knowledge and understanding of that market far exceed those of other multinationals. On this basis, Microsoft's investment and expansion in China are long-term, and it has maintained lasting, good relationships with Chinese partners.

Microsoft currently has 170,900 partners and roughly 400,000 developers in China, 100,000 of whom are developers on its AI platform; with the boom in applied AI just gathering pace in China, those numbers are expected to keep climbing quickly over the next few years. In attracting Chinese developers, Microsoft has not struck a lofty pose on the strength of its technology; rather, it has started patiently from the ground up, working out and then refining the concrete conditions that draw developers in. Three years ago Microsoft launched its incubation program in China and has so far set up at least 26 incubation platforms nationwide, including 3 national-level and 4 provincial-level maker spaces, incubating more than 600 innovative companies in total, two of which have grown into unicorns.

On top of the incubation program, Microsoft has brought in its global accelerator project to help capable, high-potential startups grow quickly; Malong Technologies, Caicloud and Kyligence, who met Nadella face to face this time, are all members of the Microsoft accelerator program. Microsoft runs 8 accelerators worldwide, two of them in China, and the earliest one, in Beijing, launched back in 2012. Since their founding, the Beijing and Shanghai accelerators have together accelerated 231 domestic startups; alumni companies of the Shanghai accelerator are valued at more than RMB 55.6 billion in total, and those of the Beijing accelerator at more than RMB 85.487 billion.

Everything Microsoft has done above is interrelated and ultimately comes down to attracting partners and developers, where its advantages are striking. It is precisely because Microsoft sees the enormous digital-transformation opportunity in China and wants to participate more deeply in the Chinese market that it keeps doubling down on long-term building there and keeps improving its friendliness toward partners and developers. Compared with most domestic companies pursuing AI, Microsoft's moves are not especially fast, but they are solid; its plans and measures are not hollow but substantive, supporting domestic innovation from the source with concrete action rather than mere promises and assurances. In the coming years Microsoft will continue to deepen local cooperation and participate deeply in the tide of digital transformation.

Over the past few years artificial intelligence has drawn ever more attention and support from governments at all levels, and its place in national strategy keeps rising. This is work that requires all sides to collaborate closely and to keep drawing out innovative capability. AI development depends not only on innovation; it also urgently needs development tools and ecosystem platforms, and here Microsoft has more than enough strength to be a major player. Microsoft's development strategy in cloud computing and AI aligns closely with where the Chinese market is heading; with an active, creative startup scene and a huge demographic dividend, Microsoft, having cultivated China for more than 20 years, will not miss this major opportunity and will do its utmost to open up new ground in the Chinese market.
          Azure Power Announces Exercise of Option to Purchase Additional Shares      Cache   Translate Page      
...proceeds from the follow-on public offering to approximately $184.3 million after underwriting discounts and commissions but before offering expenses. A registration statement on Form F-3 was previously filed with the U.S. Securities and Exchange Commission (" SEC ( News ...


          EUR/GBP Technical Analysis: Euro bears taking some profits near October lows against the British Pound      Cache   Translate Page      
  • EUR/GBP is trading in a bear trend below its 200-period simple moving average (SMA).
  • EUR/GBP is finding some support near 0.8722 (the October low). Bears will try to defend the 0.8752 level (October 16 low). The RSI and Stochastic are still below the 50 line, suggesting a bearish bias, while the MACD has crossed over from below. 
  • 0.8752 is the level to watch: bears may look to short against it, while bulls will be looking for a breakout above it.

EUR/GBP 4-hour chart

Main Trend:            Bearish

Resistance 1:              0.8752 October 16 low
Resistance 2:              0.8800 figure
Resistance 3:              0.8847 September 20 low 
Resistance 4:              0.8876 September 11 low

Support 1:              0.8722 October low
Support 2:              0.8700 figure
Support 3:              0.8665 March 22 low


Additional key levels at a glance:

EUR/GBP

Overview:
    Last Price: 0.8742
    Daily change: 24 pips
    Daily change: 0.275%
    Daily Open: 0.8718
Trends:
    Daily SMA20: 0.8811
    Daily SMA50: 0.8871
    Daily SMA100: 0.8886
    Daily SMA200: 0.8837
Levels:
    Daily High: 0.8756
    Daily Low: 0.8714
    Weekly High: 0.8942
    Weekly Low: 0.8757
    Monthly High: 0.8942
    Monthly Low: 0.8722
    Daily Fibonacci 38.2%: 0.873
    Daily Fibonacci 61.8%: 0.874
    Daily Pivot Point S1: 0.8702
    Daily Pivot Point S2: 0.8687
    Daily Pivot Point S3: 0.866
    Daily Pivot Point R1: 0.8745
    Daily Pivot Point R2: 0.8772
    Daily Pivot Point R3: 0.8787

 


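The "Daily Pivot Point" rows in the table above follow the standard floor-trader construction from the previous session's high, low and close. As a reference only (the feed's session cut-off and exact variant may differ), the classic levels are computed like this:

    # Classic floor-trader pivot levels from the prior session's candle.
    def pivot_levels(high: float, low: float, close: float) -> dict:
        p = (high + low + close) / 3  # central pivot
        return {
            "P": p,
            "R1": 2 * p - low,
            "S1": 2 * p - high,
            "R2": p + (high - low),
            "S2": p - (high - low),
            "R3": high + 2 * (p - low),
            "S3": low - 2 * (high - p),
        }

    # Illustrative numbers only, not the feed's exact inputs:
    print(pivot_levels(0.8756, 0.8714, 0.8742))
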
          GBP/USD Technical Analysis: How high is too high for Cable bulls?      Cache   Translate Page      
  • GBP/USD is trading in a bull trend above the 200-period simple moving average on the 4-hour chart. 
  • GBP/USD is having a steep spike up to 1.3150. However, the market is overbought as per the RSI and Stochastic indicators, suggesting that bulls might have overstretched themselves (see the RSI sketch below). 
  • A pullback down to 1.3100 and to 1.3050 might be in the making if bears keep the market below 1.3150.  
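For readers who want to reproduce the overbought reading mentioned in the notes above, Wilder's 14-period RSI can be computed from closing prices as in the sketch below (a generic implementation; the chart provider's smoothing settings may differ):

    import pandas as pd

    def rsi(close: pd.Series, period: int = 14) -> pd.Series:
        """Wilder's RSI: 100 - 100 / (1 + average gain / average loss)."""
        delta = close.diff()
        gain = delta.clip(lower=0)
        loss = -delta.clip(upper=0)
        # Wilder's smoothing is an exponential average with alpha = 1/period.
        avg_gain = gain.ewm(alpha=1 / period, min_periods=period).mean()
        avg_loss = loss.ewm(alpha=1 / period, min_periods=period).mean()
        return 100 - 100 / (1 + avg_gain / avg_loss)

    # Values above roughly 70 are conventionally read as overbought.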

GBP/USD 4-hour chart


Main trend:                      Bullish
Short-term:                      Bearish

Resistance 1:                  1.3150 figure
Resistance 2:                  1.3200 figure
Resistance 3:                  1.3259 October 12 high

Support 1:                      1.3100 figure
Support 2:                      1.3043, October 23 high
Support 3:                      1.3000 figure
Support 4:                      1.2947 key resistance
Support 5:                      1.2921 October 4 low
Support 6:                      1.2900 figure
Support 7:                      1.2854 October 29

 

Additional key levels at a glance:

GBP/USD

Overview:
    Last Price: 1.3138
    Daily change: 41 pips
    Daily change: 0.313%
    Daily Open: 1.3097
Trends:
    Daily SMA20: 1.2998
    Daily SMA50: 1.3029
    Daily SMA100: 1.3039
    Daily SMA200: 1.3419
Levels:
    Daily High: 1.31
    Daily Low: 1.3021
    Weekly High: 1.3042
    Weekly Low: 1.2696
    Monthly High: 1.326
    Monthly Low: 1.2696
    Daily Fibonacci 38.2%: 1.307
    Daily Fibonacci 61.8%: 1.3051
    Daily Pivot Point S1: 1.3045
    Daily Pivot Point S2: 1.2993
    Daily Pivot Point S3: 1.2966
    Daily Pivot Point R1: 1.3125
    Daily Pivot Point R2: 1.3152
    Daily Pivot Point R3: 1.3204

 


          EUR/USD Technical Analysis: 1.1500 brick wall can send Fiber to 1.1430 in the New York session      Cache   Translate Page      
  • EUR/USD is trading in a bear trend below the 200-period simple moving average on the 4-hour chart.
  • EUR/USD bulls got a strong boost on the back of the US mid-term elections as buyers are challenging the 1.1500 figure. The momentum has switched to bullish with the RSI and MACD indicators in positive territories. 
  • However, the Stochastic indicator is in the overbought zone suggesting a potential pullback down in the New York session with 1.1456 and 1.1430 as targets.

EUR/USD 4-hour chart

Main trend:             Bearish

Resistance 1:   1.1500 figure and October 2 swing low 
Resistance 2:   1.1530 August 23 swing low (key level)
Resistance 3:   1.1600 figure

Support 1:   1.1456 November 5 high
Support 2:   1.1430 October 9 low
Support 3:   1.1400 figure
Support 4:   1.1350 figure
Support 5:   1.1300 current 2018 low


Additional key levels at a glance:

EUR/USD

Overview:
    Last Price: 1.1487
    Daily change: 69 pips
    Daily change: 0.604%
    Daily Open: 1.1418
Trends:
    Daily SMA20: 1.1453
    Daily SMA50: 1.1557
    Daily SMA100: 1.1586
    Daily SMA200: 1.1857
Levels:
    Daily High: 1.1438
    Daily Low: 1.1392
    Weekly High: 1.1456
    Weekly Low: 1.1302
    Monthly High: 1.1625
    Monthly Low: 1.1302
    Daily Fibonacci 38.2%: 1.142
    Daily Fibonacci 61.8%: 1.1409
    Daily Pivot Point S1: 1.1394
    Daily Pivot Point S2: 1.1369
    Daily Pivot Point S3: 1.1347
    Daily Pivot Point R1: 1.144
    Daily Pivot Point R2: 1.1462
    Daily Pivot Point R3: 1.1487

 


          Poland NBP Base rate meets forecasts (1.5%)      Cache   Translate Page      

          USD/CAD Technical Analysis: Flirting with an important confluence support ahead of Trump's news conference      Cache   Translate Page      

   •  The pair extended its intraday sharp retracement slide from four-day tops and has now moved to the verge of breaking below an important confluence support.

   •  The mentioned support comprises the 100-day SMA, a near six-week-old ascending trend-line and the 23.6% Fibonacci retracement level of the 1.2782-1.3170 upsurge. 

   •  Meanwhile, technical indicators on hourly charts have been continuously gaining negative momentum, albeit they have managed to hold in positive territory on the daily chart.

   •  Hence, it would be prudent to wait for a sustained weakness below the said confluence support before confirming that the pair might have already topped out in the near-term.
 

USD/CAD daily chart

USD/CAD

Overview:
    Last Price: 1.3081
    Daily change: -45 pips
    Daily change: -0.343%
    Daily Open: 1.3126
Trends:
    Daily SMA20: 1.3075
    Daily SMA50: 1.3024
    Daily SMA100: 1.3076
    Daily SMA200: 1.2934
Levels:
    Daily High: 1.3146
    Daily Low: 1.3096
    Weekly High: 1.3172
    Weekly Low: 1.3048
    Monthly High: 1.3172
    Monthly Low: 1.2783
    Daily Fibonacci 38.2%: 1.3126
    Daily Fibonacci 61.8%: 1.3115
    Daily Pivot Point S1: 1.3099
    Daily Pivot Point S2: 1.3072
    Daily Pivot Point S3: 1.3049
    Daily Pivot Point R1: 1.3149
    Daily Pivot Point R2: 1.3172
    Daily Pivot Point R3: 1.3199

 


          DXY Technical Analysis: Further downside should meet the 55-day SMA at 95.40      Cache   Translate Page      
  • The greenback accelerated the weekly leg lower following the results from the US mid-term elections and is now testing multi-day lows in the 95.85/80 band.
  • DXY broke below the critical support at the 96.00 figure and in doing so has opened the door for a potential drop and test of the 95.40 region, where the 55-day SMA and the top of the daily cloud coincide.
  • However, the constructive bias should remain unchanged while above the 93.71 level, July’s low.

DXY daily chart

Dollar Index Spot

Overview:
    Last Price: 95.8
    Daily change: -43 pips
    Daily change: -0.447%
    Daily Open: 96.23
Trends:
    Daily SMA20: 96.04
    Daily SMA50: 95.41
    Daily SMA100: 95.21
    Daily SMA200: 93.24
Levels:
    Daily High: 96.45
    Daily Low: 96.14
    Weekly High: 97.2
    Weekly Low: 95.99
    Monthly High: 97.2
    Monthly Low: 94.79
    Daily Fibonacci 38.2%: 96.26
    Daily Fibonacci 61.8%: 96.33
    Daily Pivot Point S1: 96.1
    Daily Pivot Point S2: 95.96
    Daily Pivot Point S3: 95.79
    Daily Pivot Point R1: 96.41
    Daily Pivot Point R2: 96.58
    Daily Pivot Point R3: 96.72

 


          United States MBA Mortgage Applications declined to -4% in November 2 from previous -2.5%      Cache   Translate Page      

          Comment on "Configuring WSUS clients using Group Policy" (Yurij)      Cache   Translate Page      
Thank you for your reply. I need this to reduce traffic between the server and the client. The infrastructure is hosted in Azure, where VPN traffic is billed. So I want to try to have updates pulled from the internet rather than over the VPN between the client and the server. And installation does not have to be manual-only. One more clarification: my updates are installed from SCCM, so it looks like I can't do what you suggested. Do you perhaps have any other ideas on how this could be done?
          EUR/JPY Technical Analysis: The cross keeps the rally alive and now targets the 200-day SMA at 130.25 and above      Cache   Translate Page      
  • EUR/JPY is advancing for the fifth consecutive session on Wednesday on the back of the upbeat momentum surrounding the European currency.
  • Immediate target of the up move remains at the 200-day SMA at 130.25 ahead of late-August peaks around 130.90.
  • However, the cross needs to break above the resistance line from YTD tops, today at 132.33, in order to extend the move higher.

EUR/JPY daily chart


EUR/JPY

Overview:
    Last Price: 129.94
    Daily change: 39 pips
    Daily change: 0.301%
    Daily Open: 129.55
Trends:
    Daily SMA20: 128.89
    Daily SMA50: 129.93
    Daily SMA100: 129.47
    Daily SMA200: 130.32
Levels:
    Daily High: 129.57
    Daily Low: 128.84
    Weekly High: 129.34
    Weekly Low: 127.24
    Monthly High: 132.49
    Monthly Low: 126.63
    Daily Fibonacci 38.2%: 129.3
    Daily Fibonacci 61.8%: 129.12
    Daily Pivot Point S1: 129.07
    Daily Pivot Point S2: 128.59
    Daily Pivot Point S3: 128.34
    Daily Pivot Point R1: 129.8
    Daily Pivot Point R2: 130.05
    Daily Pivot Point R3: 130.53

 


          Chile Trade Balance dipped from previous $-122M to $-217M in October      Cache   Translate Page      

          USD/JPY Technical Analysis: Finds a temporary support near 200-hour EMA, remains vulnerable      Cache   Translate Page      

   •  The pair extended its intraday sharp retracement slide from fresh one-month tops and momentarily slipped back below the 113.00 handle in the last hour.

   •  A break below the 50/100-hour SMAs was seen as a key trigger for bearish traders, though the downfall seems to have found some support near the 200-hour EMA.

   •  Technical indicators on the 1-hour chart have been gaining negative momentum and point to an extension of the steep intraday decline led by the US midterm election results.

   •  A decisive break through the mentioned support and a subsequent fall below the 38.2% Fibonacci level of the 111.38-113.82 recent upsurge will add credence to the bearish outlook.

USD/JPY 1-hour chart

USD/JPY

Overview:
    Last Price: 113.07
    Daily change: -39 pips
    Daily change: -0.344%
    Daily Open: 113.46
Trends:
    Daily SMA20: 112.55
    Daily SMA50: 112.43
    Daily SMA100: 111.76
    Daily SMA200: 109.97
Levels:
    Daily High: 113.51
    Daily Low: 113.1
    Weekly High: 113.4
    Weekly Low: 111.78
    Monthly High: 114.56
    Monthly Low: 111.38
    Daily Fibonacci 38.2%: 113.35
    Daily Fibonacci 61.8%: 113.25
    Daily Pivot Point S1: 113.2
    Daily Pivot Point S2: 112.94
    Daily Pivot Point S3: 112.79
    Daily Pivot Point R1: 113.62
    Daily Pivot Point R2: 113.77
    Daily Pivot Point R3: 114.03

 


          Portugal Unemployment Rate unchanged at 6.7% in 3Q      Cache   Translate Page      

          Brazil IPCA Inflation came in at 0.45%, below expectations (0.55%) in October      Cache   Translate Page      

          Day in the Sun - By Evan Hammonds      Cache   Translate Page      

Most owners, trainers, and racing fans were tricked by the Halloween deluge in Louisville, Ky., that dumped several inches of rain on the main track and turf course at Churchill Downs. However, the sport sure got a treat once the horses hit the track for the Nov. 2-3 Breeders’ Cup.

The main track dried out under a wind-whipped autumn day, and the turf course was optimistically labeled “good” for the World Championships opener. Big-day Saturday dawned beautifully, offering high-end racing under azure skies, with a turf course that appeared to favor the European runners…that, or the European contingent this year was just that strong.

Juddmonte Farms did the most sporting of gestures, bringing Europe’s best runner to the U.S. to tackle the $4 million Breeders’ Cup Turf (G1T). The 4-year-old filly Enable, the horse most everyone wanted to see, delivered in every way, pulling off a dramatic win while running closer to the stands side than the hedge in the stretch.

In delivering the first Prix de l’Arc de Triomphe/Breeders’ Cup Turf double dip, she squashed the “Curse of Dancing Brave.” Juddmonte’s Dancing Brave was the first to attempt the transatlantic double at Santa Anita Park in 1986, but had melted in the Southern California sun, finishing fourth at 1-2.

Few have embraced the Breeders’ Cup concept more than Juddmonte owner Prince Khalid Abdullah. A strong supporter of the series from the beginning—his Alphabatim ran fifth in the inaugural Turf in 1984—Juddmonte’s pair of scores Nov. 3 (homebred Expert Eye landed the Mile, G1T, as well) brought Juddmonte’s win total to six as breeder and seven overall.

That is in sharp contrast to Mike Abraham, the breeder of Accelerate, winner of the day’s biggest prize, the Breeders’ Cup Classic (G1).

A month before the main event, BloodHorse’s racing editor Alicia Wincze Hughes’ story on Abraham’s breeding of Accelerate on Bloodhorse.com told a different tale from the multi-generational grooming of families at Juddmonte.

At the 2011 Keeneland November sale Abraham ran across Issues, a daughter of Awesome Again, selling in foal to a rising stallion named Scat Daddy.

“I like Awesome Again mares, and she was young at the time and in foal to Scat Daddy, who I liked but who hadn’t really hit yet,” Abraham told Wincze Hughes. “To be honest with you, I hadn’t even looked at her, but I was looking at the pedigrees the night before, and when she went through the ring, I thought, if she doesn’t bring too much, I’m going to buy her. I didn’t even really look at her. I was up in the pavilion, but the price was certainly right. And the rest is certainly history.”

He paid $25,000 for her.

The following year he attempted to return Issues to Scat Daddy, but the Ashford Stud team wasn’t sure she’d be able to get into his book, so Abraham scoured the Ashford lineup and landed on second-year stallion Lookin At Lucky.

A straight line isn’t always the path to glory.

The rest of the story can be found in Wincze Hughes’ Classic recap.

Fast Friday?

Let’s not read too much into the fact that overall handle was down slightly at this year’s Breeders’ Cup compared to 2017. While Enable was a true draw for Thoroughbred diehards, the Classic lacked the Gun Runner vs. Arrogate angle from a year ago. The retirement of Triple Crown winner Justify didn’t help the gate or handle either.

The “Future Stars Friday” format was new for 2018, adding a new race—the Breeders’ Cup Juvenile Turf Sprint—to the mix. With an additional race, handle was off slightly from a year ago.

Juvenile form, especially with three races over a waterlogged turf course, might not be what hard-core bettors are willing to sink their teeth into.

Breeders’ Cup will likely give this format time to grow, but a suggestion might be to add some older-horse races to the Friday program. How about a “Fast and Furious Friday” that features the sprint-centric races of the Breeders’ Cup? The Sprint (G1), Filly & Mare Sprint (G1), Turf Sprint (G1T), Juvenile Turf Sprint, and Dirt Mile (G1) would make for a quick pick.


          Software Engineer - Microsoft - Redmond, WA      Cache   Translate Page      
The Azure Storage Garbage Collection defragments and reclaims unused data blocks within the storage system at exabyte scale with very efficient resource consumption...
From Microsoft - Tue, 06 Nov 2018 14:40:57 GMT - View all Redmond, WA jobs
          Hardware Engineer II - Microsoft - Redmond, WA      Cache   Translate Page      
Azure storage already runs at Exascale (storing Exabytes of data) and we will be scaling our designs over the next decade to support Zettascale (storing...
From Microsoft - Thu, 25 Oct 2018 05:51:24 GMT - View all Redmond, WA jobs
          Fujitsu's PRIMEFLEX for Microsoft Azure Stack solution eases cloud adoption      Cache   Translate Page      
PRIMEFLEX for Microsoft Azure Stack is a unique solution for companies that want to exploit the benefits of the public cloud but would prefer to keep certain critical IT processes in-house in order to meet data sovereignty, compliance and connectivity requirements.
          Fujitsu and Microsoft accelerate the migration of mission-critical workloads to the Microsoft Azure cloud      Cache   Translate Page      
Fujitsu and Microsoft's new global systems integrator partnership accelerates the adoption of Microsoft Azure within organizations.
          Microsoft and Wal-Mart Partner to Take on Amazon      Cache   Translate Page      
Quite interesting details in the complete article linked below. Microsoft has the cloud-based technical capabilities, and also the IoT architecture that will be important. But is the advanced tech that Amazon has already installed well ahead? The partnership makes sense as a way to test that.

Microsoft and Wal-mart are creating a ‘cloud factory’ to take on Amazon   By Mike Wheatley in SiliconAngle

Microsoft Corp. and Wal-Mart Stores Inc. are building on a strategic partnership announced in July that saw them commit to using the Redmond software giant’s cloud, artificial intelligence and “internet of things” tools to modernize the retailer’s business operations.

Microsoft and Walmart today said they’ve created a new “cloud factory” at the latter’s existing Innovation Hub (pictured) in Austin, Texas. Known as “4.co” due to its location on the corner of Fourth and Colorado streets, the joint engineering facility is set to open early next year and will be staffed by a team of 30 technologists from both companies.

One of the goals at the facility will be to help Walmart move thousands of its internal business applications over to Microsoft’s Azure cloud platform. The engineers will also work together to develop brand-new, cloud-native applications. In order to do so, the companies will make use of Microsoft’s cognitive services, chatbot and machine learning tools, Clay Johnson, executive vice president and enterprise chief information officer at Walmart, said in an interview with Microsoft Transform.  .... "


          Debugging Node.js on Kubernetes with Linkerd      Cache   Translate Page      

A few weeks ago, at Node+JS Interactive, our friend Brian Redmond of Microsoft Azure gave an excellent talk that used Linkerd 2.0 to identify the source of failures in a Node app. In this talk, Brian explained why debugging Node…

The post Debugging Node.js on Kubernetes with Linkerd appeared first on Buoyant.


          Cloud Workspace Management Suite Now Available on Microsoft Azure Marketplace      Cache   Translate Page      
CloudJumper, a leading provider of Virtual Desktop Infrastructure (VDI), Workspace as a Service (WaaS) and Desktop as a Service (DaaS) solutions,... Read more at VMblog.com.
          (USA-CA-San Jose) Sr Data Architect – Big Data      Cache   Translate Page      
**Danaher Company Description**
Danaher is a global science & technology innovator committed to helping our customers solve complex challenges and improve quality of life worldwide. Our world class brands are leaders in some of the most demanding and attractive industries, including life sciences, medical diagnostics, dental, environmental and applied solutions. Our globally diverse team of 67,000 associates is united by a common culture and operating system, the Danaher Business System, which serves as our competitive advantage. We generated $18.3B in revenue last year. We are ranked #162 on the Fortune 500 and our stock has outperformed the S&P 500 by more than 1,200% over 20 years. At Danaher, you can build a career in a way no other company can duplicate. Our brands allow us to offer dynamic careers across multiple industries. We’re innovative, fast-paced, results-oriented, and we win. We need talented people to keep winning. Here you’ll learn how DBS is used to shape strategy, focus execution, align our people, and create value for customers and shareholders. Come join our winning team.

**Description**

*Danaher Digital*
Danaher Digital is our digital innovation and acceleration center where we’re bringing together the leading strategic product and business leaders, technologists and data scientists for the common purpose of accelerating development and commercialization of disruptive and transformative digital solutions into the marketplace. We accelerate Danaher’s digital innovation journey by partnering with Danaher operating companies (OpCos) to monetize and commercialize the potential of emerging digital trends. Located in Silicon Valley, the heart of global innovation, Danaher Digital is ideally situated to capitalize on the digital mega trends transforming our world, including Internet-of-Things (IoT), Data, AI, cloud, mobile, Augmented Reality (AR), Blockchain and other Digital frontiers.

*Senior Data Architect*
You will report to the Senior Director of Data & Analytics and will be responsible for leading the vision, design, development and deployment of large-scale data fabrics and data platforms for Danaher’s IoT and Analytics Machine Learning solutions. The right candidate will provide strategic and technical leadership in using best-of-breed big data technologies with the objective of bringing to market diverse solutions in IoT and Analytics for health sciences, medical diagnostics, industrial and other markets. This person will use his/her Agile experience to work collaboratively with other Product Managers/Owners in geographically distributed teams.

*Responsibilities*:
* Lead analysis, architecture, design and development of large-scale data fabrics and data platforms for Advanced Analytics and IoT solutions based on best-of-breed and contemporary big-data technologies
* Provide strategic leadership in evaluation, selection and/or architecture of modern data stacks supporting diverse solutions including Advanced Analytics for health sciences, medical diagnostics, industrial and other markets
* Provide technical leadership and delivery in all phases of a solution design from a data perspective: discovery, planning, implementation and data operations
* Manage the full life-cycle of data - ingestion, aggregation, storage, access and security - for IoT and advanced analytics solutions
* Work collaboratively with Product Management and Product Owners from other business units and/or customers to translate business requirements into technical requirements (Epics and stories) in an Agile process to drive data architecture
* Own and drive contemporary data architecture and technology vision/road map while keeping abreast of technology advances and architectural best practices

**Qualification**

*Required Skills & Experience:*
* Bachelor’s degree in Computer Science or related field of study
* Experience with security configurations of the Data Stack
* 7 years’ experience as a hands-on leader in designing, building and running successful full stack Big Data, Data Fabric and/or Platforms and IoT/Analytics solutions in production environments
* Must have experience in major big data solutions like Hadoop, HBase, Sqoop, Hive, Spark etc.
* Architected, developed and deployed data solutions on one or more of: AWS, Azure or Google IaaS/PaaS services for data, analytics and visualization
* Demonstrated experience of the full IoT and Advanced Analytics data technology stack, including data capture, ingestion, storage, analytics and visualization
* Has worked with customers as a trusted advisor in data architecture and management
* Working experience with ETL tools, storage infrastructure, streaming and batch data, data quality tools, data modeling tools, data integration and data visualization tools
* Must have experience in leading projects in Agile development methodologies
* Provide mentorship and thought leadership for immediate and external (customer) teams in best practices of the Data platform; lead conversations around extracting value out of the Data Platform
* Travel up to 40% required

**Organization:** Corporate **Job Function:** Information Technology **Primary Location:** North America-North America-United States-CA-San Jose **Schedule:** Full-time **Req ID:** COR001259
          (IT) Senior Consultant - Microsoft Office 365 and Azure - Migration      Cache   Translate Page      

Rate: 120k - 150k Annual AUD   Location: Melbourne   

Our client is a global Powerhouse consulting organisation with revenues in excess of $7bn globally. They are on a huge growth trajectory in Australia and are continually acquiring new clients and projects. Our client is looking for a Senior Consultant to join their Melbourne practice on a permanent basis, specialised in the implementation of Microsoft cloud solutions (Office 365 and Azure). This is a hands-on role helping large enterprise customers migrate to Office 365 on Azure cloud, along with all the associated peripheral project work and technology, including data migrations. To be eligible for the role you will have experience including:
 - Minimum of 5 years of Technical Consulting experience
 - Minimum 3 years of experience implementing Microsoft Cloud products into large enterprise customers
 - Extensive experience in the implementation of Microsoft Office 365 AND Microsoft Azure
 - Strong background migrating large enterprise data to cloud
 - Microsoft Certification is highly advantageous
 - Strong customer service and consulting skills
 - Excellent communication and interpersonal skills
Applicants must be able to demonstrate stability in their employment history - our client has a strong focus on organisational culture and team stability. Our client offers the successful candidate a leading remuneration package along with extensive career development, training and support. Our client regularly provides additional certification training as well as both external and internal training programs. The role will commence in January 2019, however the client is interviewing immediately. To be a part of a cutting edge technical team and awesome organisational culture, apply now using the links below. Your CV will not be presented to any clients without your written approval to do so. BT People is a consulting and recruitment organisation specialised in systems led business transformation projects. BT People is an APSCo Certified recruitment organisation.
 
Rate: 120k - 150k Annual AUD
Type: Full Time
Location: Melbourne
Country: Australia
Contact: BTPeople
Advertiser: BTPeople
Reference: JS2843156/567360363

          (IT) React Developer      Cache   Translate Page      

Rate: 100k - 125k Annual AUD   Location: Melbourne   

As a talented Front End Developer with proven experience working with ReactJS, you will get the opportunity to support an exciting range of well known organisations in achieving their targets.
Company: Global Presence | Leading consultancy | Enviable Reputation | Newest technology for cutting edge applications | Long term, Tier 1 client relationships
Responsibilities:
 - Support multiple large scale businesses
 - Provide input into solving complex problems
 - Develop excellent software to achieve quality, budget and schedule outcomes
 - Develop code based upon client requirements and according to best practice
 - Identify new components and perform impact analysis of existing systems
 - Develop web-based user interfaces
 - Provide post-implementation support
Skills:
 - Savvy and intelligent with a real passion for technology
 - Exceptional knowledge of ReactJS with Webpack, GruntJS, JSPM, BowerJS and NPM
 - Experience with rapid prototyping of application concepts
 - Exposure to Java, Ruby, SQL and NoSQL databases
 - Karma, Mocha, Chai, Jasmine, Protractor/Webdriver, Jest
 - AWS, Azure or Google Cloud experience
 
Rate: 100k - 125k Annual AUD
Type: Full Time
Location: Melbourne
Country: Australia
Contact: S2M
Advertiser: S2M
Reference: JS2832683/567360144

          (IT) Dynamics 365 CRM Technical, Senior Consultant & Architect      Cache   Translate Page      

Rate: 100k - 150k Annual AUD   Location: Melbourne   

Due to continuing and expanding project scope across Melbourne, our leading global IT powerhouse for digital & ERP services is seeking 3 new permanent employees for career-driven opportunities. The 3 new roles are described below; please confirm which one suits your skills & experience.
Microsoft 365 CRM Technical Analyst
 - 4+ years of relevant experience in MS Dynamics CRM Online project implementation, inc. configuration, workflow and plugin development using C#.
 - Experience with multiple versions of Dynamics CRM.
 - Tertiary qualification in IT/Computer Science/related discipline.
 - Excellent customer interfacing skills.
 - Excellent written and verbal communication skills.
Microsoft 365 CRM Senior Functional Consultant
 - 6+ years of hands-on experience in Dynamics 365 CRM as a Functional Lead, with Power BI as an additional skill.
 - Expertise in defining Functional Specifications & experience with large application development projects.
 - Agile development methodology exposure.
 - Tertiary qualification in IT/Computer Science/related discipline.
 - Excellent written and verbal communication skills.
Technology Architect (MS CRM, API, .NET, Integrations)
 - Planning and ensuring quality and timeliness of activities related to requirement gathering, architecture, design, build and implementation of the work product; providing regular guidance to project teams on complex coding, issue resolution and execution.
 - Participating in requirements elicitation and analysis of architecture; creating and reviewing designs and contributing to activities related to estimation, solution and risk planning.
 - 5+ years of relevant experience as a Technical Architect.
 - 10+ years of relevant experience in MS CRM/.NET technologies.
 - Excellent knowledge of Microsoft Dynamics CRM, customisation/configuration.
 - Good knowledge of software development, APIs and integration.
 - Hands-on experience with Dynamics CRM Online platforms.
 - Ability to lead a team of CRM developers.
 - Solid experience working on Dynamics CRM Online solutions.
 - Knowledge and understanding of Azure cloud.
Australian work rights are required for this role.
 
Rate: 100k - 150k Annual AUD
Type: Unspecified
Location: Melbourne
Country: Australia
Contact: BTPeople
Advertiser: BTPeople
Reference: JS2832615/567360059

          (USA-CA-Sunnyvale) Staff Systems Engineer, Linux Server Platform      Cache   Translate Page      
LinkedIn was built to help professionals achieve more in their careers, and every day millions of people use our products to make connections, discover opportunities and gain insights. Our global reach means we get to make a direct impact on the world’s workforce in ways no other company can. We’re much more than a digital resume – we transform lives through innovative products and technology. Searching for your dream job? At LinkedIn, we strive to help our employees find passion and purpose. Join us in changing the way the world works.
Responsibilities
 - Participate in systems engineering duties for the Cloud Platforms Engineering Team, including management of existing and deployment of new Linux (RHEL) Server systems; provide overall support and automation of the Linux server platform and related Linux-based platform services, including troubleshooting issues, defining disaster recovery plans, and establishing procedures and documentation.
 - Build and maintain Red Hat Satellite infrastructure as well as automation around Linux host configuration and software package creation for various applications.
 - Develop automation, mostly in Python, for systems administration, deployment and configuration of Linux servers and developer desktops.
 - Act as a Staff resource to lead the configuration and lifecycle of the Linux Server OS environment, including automation and customization.
 - Be a key contributor on a DevOps-oriented team to facilitate the provisioning of customer application servers and the continuous lifecycle of Linux Server-based applications running in a hybrid cloud infrastructure.
 - Be responsible for designing, implementing and automating our build, release, deploy, monitoring and configuration processes.
 - Work collaboratively with peers in the team, and perform cloud and cross-platform interoperability tasks.
 - Participate in a 12x7 on-call engineer rotation supporting our core services, and handle escalations from the US and India Operations Teams.
 - Participate in Tier 3 escalation issues and the on-call rotation.
 - Work with business units to translate needs into technical requirements to design, implement, and support applications utilizing recommended best practices.
Basic Qualifications
 - B.A./B.S. in a technical field, or equivalent practical experience.
 - 5+ years in IT with Linux experience, specifically related to management of Linux Server systems hosted in a virtual environment
 - 3+ years maintaining Red Hat Satellite or other package/systems management infrastructure
 - 3+ years developing scripts in Python and other automation technologies
 - Experience supporting Linux server systems in a heterogeneous environment with Windows, Mac and Linux clients
 - Experience creating and maintaining gold OS images for deployment in a virtual environment
 - Experience securing Linux OS, Satellite Servers, and Linux applications/services
 - Experience working with global teams across multiple time zones
 - Experience in mentoring and guiding peers in technical skills and development
 - Experience participating in code review
Preferred Qualifications
 - 3+ years in a DevOps role with experience in configuration management tools like Ansible, Puppet, Chef or Salt; experience in automation software like Jenkins and building the various stages of the CI/CD pipeline including test automation.
 - 8+ years in IT with Linux experience, 5+ years specifically in management of Linux Server systems hosted on Azure
 - Experience configuring and troubleshooting n-tier applications utilizing a least privilege model.
 - Experience with cloud and hybrid compute architecture including VMware ESXi, Microsoft Azure.
 - Experience participating in IT compliance audits (PCI, SOX, GDPR, etc.)
 - Experience in scripting with Bash, Python and the use/creation of APIs
 - Familiarity with Apache-based services such as Apache ATS and Tomcat, as well as creating services using frameworks like Flask.
          Do unit and post-deployment social support influence the association between deployment sexual trauma and suicidal ideation? - Monteith LL, Hoffmire CA, Holliday R, Park CL, Mazure CM, Hoff RA.       Cache   Translate Page      
Deployment sexual trauma is associated with post-deployment suicidal ideation. No studies have examined the role of social support in this association. The present study examined if perceived unit support and post-deployment support influenced the associat...
          For sale: New development, 87 m2 - CANNES (06)      Cache   Translate Page      
- The exclusive offer of sublime apartments with open-sky terraces in Cannes... The building proudly displays its pure, modern lines while extending the spaces between indoors and outdoors. The Mediterranean, minimalist architecture draws its inspiration from the Californian and Balinese spirit, harmoniously blending shades of wood and light render in subtle plays of shadow and light. Fully secured, this intimate, high-end condominium is a true Eden. It opens onto a pleasant landscaped garden, planted with pines distilling their rich and varied essences, palm trees with their shade so appreciated in summer, and olive trees lending their woody nobility. At the heart of this grandiose setting, a pool with azure reflections, facing due south with its exotic-wood solarium, invites you to recharge. The refinement and quality of the appointments extend into the common areas, with planted patios recalling the subtle plays of shadow and light. "Horizon Bay" also has underground parking, spaces reserved for electric cars and secure bicycle storage for a more serene daily life.   The property is subject to condominium (copropriete) rules. Number of lots in the condominium: 23. Your 3G immo sales agent on site, registered with the RSAC of ST PIERRE DE LA REUNION under No. 809 573 009: Sandrine MARNET, Tel: 06 92 41 19 75 - Price: 846,000 euros
          Project no. 53377 - DevOps Engineer (m/f)      Cache   Translate Page      
We are currently looking for a DevOps Engineer (m/f).

Your tasks:
+ Embrace our customer's platforms and product crews to continuously deploy software increments to global markets by designing a powerful build and deployment pipeline for mobile and web applications (React Native)
+ Set up and maintain our different environments (AWS, Azure, Kubernetes)
+ Improve our CI/CD pipeline by further automating build and deployment steps.

Requirements:
Must:
+ At least 4 years experience in software product development and/or cloud/SaaS software engineering
+ Deep experience with container orchestration (Kubernetes) and complex build automation (Jenkins)
+ Experience with SonarQube
+ AWS, Azure, Kubernetes
+ Experience with test-driven design
+ Very good English skills
+ OS know-how

Nice to have:
+ Experience with infrastructure automation with Puppet/Chef/Ansible/Terraform
+ Sentry experience
+ German skills

Additional information:
Have we sparked your interest? Then we look forward to receiving your detailed qualification profile, including your expected hourly rate.

Project no.:
53377

Position type:
Freelance

Location:
D6

Start:
asap

Duration:
Mid-2019
          Azure Sales and Marketing - Ingram Micro Cloud - Bellevue, WA      Cache   Translate Page      
If so, join the Ingram Micro Cloud team - where rainmakers thrive. Are you an innovative, self-starting Marketing guru who loves helping IT providers design and...
From Ingram Micro Cloud - Fri, 28 Sep 2018 07:14:09 GMT - View all Bellevue, WA jobs
          Dawson & Walsh: .NET Developer - Software House - Norwich      Cache   Translate Page      
Dawson & Walsh: .NET Developer (ASP.NET, C#, C#.NET, dot NET, Web Application Development, NET 4.5, ASP.NET MVC 5, MongoDB, RavenDB, Scrum/Agile, IOC, Azure, SQL Server 2016, HTML5, CSS3, Bootstrap, Angular, Urgent) Do you want to work for an internationally recognised b Norwich
          Reducing Azure Functions Cold Start Time      Cache   Translate Page      
You can host a serverless function in Azure in two different modes: Consumption plan and Azure App Service plan. The Consumption plan automatically allocates compute power when your code is running. Your app is scaled out when needed to handle load, and scaled down when code is not running. You don’t have to pay for ...
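
One widely used mitigation on the Consumption plan is a timer-triggered "warm-up" function that keeps the host from idling out between invocations. Below is a minimal sketch using the Azure Functions Python v2 programming model; the schedule, route and function names are illustrative assumptions, and note this only reduces the odds of a cold start - hosting on an App Service plan with Always On avoids them altogether.

```python
# A common Consumption-plan mitigation: a timer-triggered "warm-up" ping
# that keeps the function host loaded. Sketch using the Azure Functions
# Python v2 programming model; names and schedule are illustrative.

import azure.functions as func

app = func.FunctionApp()

@app.schedule(schedule="0 */5 * * * *", arg_name="timer", run_on_startup=False)
def keep_warm(timer: func.TimerRequest) -> None:
    # Doing any work at all keeps the host instance warm.
    if timer.past_due:
        pass  # a missed ping usually means the app went cold anyway

@app.route(route="hello", auth_level=func.AuthLevel.ANONYMOUS)
def hello(req: func.HttpRequest) -> func.HttpResponse:
    # The function we actually want served without a cold-start delay.
    return func.HttpResponse("warm and ready")
```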
          Digital ledger interoperability with the Nasdaq Financial Framework through Microsoft Azure Blockchain      Cache   Translate Page      
By: Matthew Kerner, General Manager, Microsoft Azure. Nasdaq revolutionized the financial markets in 1971, opening them to millions of individual investors with the world's first electronic stock exchange. Since then, the company's innovative spirit has kept it at the forefront of technological evolution in the securities markets, to the...
          Fostering Evidence-Based Practice to Improve Nurse and Cost Outcomes in a Community Health Setting: A Pilot Test of the Advancing Research and Clinical Practice Through Close Collaboration Model      Cache   Translate Page      
imageAlthough evidence-based practice (EBP) improves health care quality, decreases costs, and empowers nurses, there is a paucity of intervention studies designed to test models of how to enhance nurses' use of EBP. Therefore, the specific aim of this study was to determine the preliminary effects of implementing the Advancing Research and Clinical practice through close Collaboration (ARCC) model on nurses' EBP beliefs, EBP implementation behaviors, group cohesion, productivity, job satisfaction, and attrition/turnover rates. A 2-group randomized controlled pilot trial was used with 46 nurses from the Visiting Nurse Service of New York. The ARCC group versus an attention control group had stronger EBP beliefs, higher EBP implementation behaviors, more group cohesion, and less attrition/turnover. Implementation of the ARCC model in health care systems may be a promising strategy for enhancing EBP and improving nurse and cost outcomes.
          Public transit data for Microsoft Azure Maps from Moovit      Cache   Translate Page      
Moovit is integrating its public transit information into Azure Maps as part of its partnership with Microsoft.
          Full Practice Authority for Nurse Practitioners      Cache   Translate Page      
imageImplementation of the Affordable Care Act (2010) enabled more than 30 million people to have new access to primary care services. On the basis of current utilization patterns, demand for primary care providers is expected to grow more rapidly than physician supply. This imbalance is expected to worsen, as the aging population requires more health care resources. In addition, more patients are requiring critical care services and physician numbers are not keeping with this growing need. Restrictions on resident physician practice hours have impacted inpatient care as well. Revisiting outdated state practice laws, and considering Full Practice Authority (FPA) for nurse practitioners (NP), is needed for improving access to care while creating greater flexibility for development of patient-centered health care homes and other emerging models of care delivery. Currently, 21 states and the District of Columbia have adopted FPA for NPs, with 15 more states planning legislation in 2016. Allowing FPA and Prescriptive Authority (PA) enables NPs to become more efficient and effective patient care team members. However, physician resistance to FPA and PA presents barriers to implementation.
          Evidence-based Practice: How Nurse Leaders can Facilitate Innovation      Cache   Translate Page      
imageEvidence-based nursing practice (EBNP) is the wave of the future. Increasingly, EBNP is being identified as a key to quality and excellence in nursing services. Incorporating evidence into practice is necessary to deliver scientifically sound patient care. In addition, understanding the importance of evidence is crucial for meeting the excellence requirements of Magnet designation. Despite the growing popularity of EBNP and its documented significant benefits, the literature demonstrates that only 15% of the nursing workforce consistently practices within an EBNP framework. If EBNP adoption is to increase in the profession, it will require the active efforts of nurse leaders to pursue an aggressive innovation diffusion strategy. The purpose of this article is to discuss the nurse leader's role in facilitating EBNP in nursing using a theoretical framework grounded in innovation diffusion theory. The article develops 4 areas of focus. First, the components of innovation diffusion theory are discussed. Second, a pertinent empirical review of the EBNP adoption literature is presented. Third, strategies for applying innovation diffusion theory to facilitate EBNP adoption are proposed. Lastly, the article ends with a leadership call to action.
          Office 365 & Azure AD Connect      Cache   Translate Page      
I’ve finally got around to installing and configuring Azure AD Connect, and I can see from my test OU that my users are being synced successfully to O365. A bit of my school’s network background:
* The FQDN is schoolname.internal
* Our domain is schoolname.county.sch.uk
* I’ve created a...
          Escalation Engineer - Microsoft - Bengaluru, Karnataka      Cache   Translate Page      
Leading engineering investigations to bring quicker issue resolution. Azure’s continued success depends on providing customers a world class support experience....
From Microsoft - Fri, 12 Oct 2018 15:24:05 GMT - View all Bengaluru, Karnataka jobs
          Fujitsu eases the cloud journey with PRIMEFLEX for Microsoft Azure Stack      Cache   Translate Page      
PRIMEFLEX for Microsoft Azure Stack is a solution suited to companies that want to take advantage of everything the public cloud offers but, for various reasons, also need to keep parts of their business-critical IT under their own roof.
          Hadoop Admin - Syntel Inc - Madison, WI      Cache   Translate Page      
Azure blob storage, Azure ML Studio, AzCopy, OMS and Databricks; networking - VMs, VNets, subnets, Azure SQL Data Warehouse. We are looking for a Hadoop Admin....
From Indeed - Wed, 10 Oct 2018 20:42:08 GMT - View all Madison, WI jobs
          Principal Software Engineering Manager - Microsoft - Redmond, WA      Cache   Translate Page      
Experience in Azure based analytics, storage, and reporting – Azure data lake, Azure Datawarehouse, HD Insight, Azure Data Factory....
From Microsoft - Sat, 27 Oct 2018 03:43:14 GMT - View all Redmond, WA jobs
          Principal Software Development Engineer - Microsoft - Redmond, WA      Cache   Translate Page      
The Azure SQL Datawarehouse Service team is looking for a highly qualified, experienced individual to help us shape and implement our next generation data...
From Microsoft - Sat, 13 Oct 2018 03:12:19 GMT - View all Redmond, WA jobs
          Sr. Software Engineer - Microsoft - Redmond, WA      Cache   Translate Page      
Experience in Azure based analytics, storage, and reporting – Azure data lake, Azure Datawarehouse, HD Insight, Azure Data Factory....
From Microsoft - Tue, 09 Oct 2018 15:27:08 GMT - View all Redmond, WA jobs
          Senior SWE Manager - Microsoft - Issaquah, WA      Cache   Translate Page      
Experience in Azure based analytics, storage, and reporting – Azure data lake, Azure Datawarehouse, HD Insight, Azure Data Factory....
From Microsoft - Fri, 19 Oct 2018 05:03:40 GMT - View all Issaquah, WA jobs
          Azure Big Data Engineer - Accenture - Milwaukee, WI      Cache   Translate Page      
Minimum 6 monthsof designing, building and operationalizing large scaleenterprise data solutionsand applications using one or more of AZUREdata and analytics...
From Indeed - Thu, 25 Oct 2018 11:15:29 GMT - View all Milwaukee, WI jobs
          J2EE Developer with Azure - Rawson BPO - Madrid, Spain      Cache   Translate Page      
At Rawson BPO we are recruiting a microservices developer on Azure Cloud to join a major IT project. Knowledge of Azure, SpringBoot and Java. Central Madrid - Calle Serrano. STABLE project - PERMANENT contract. Salary negotiable based on the experience each candidate brings.
          SQL Developer      Cache   Translate Page      
MN-Eagan, Description: The SQL Developer’s role is to develop and maintain databases that provide the backend for web applications across the organization, while ensuring high levels of data availability. The SQL Developer will work closely with cloud application developers to build fast and efficient solutions, and evaluate and advise on technology components for database management systems utilizing Azure Cloud S
          SF2789 Geometric Fuschia Azure - Ferragamo Rx      Cache   Translate Page      
  • Frame Material: Plastic
  • Lens Material: Plastic
  • Lens Width: 52mm
  • Bridge: 15mm
  • Arm: 140mm
  • Made in Italy
  • 100% UV Protection
Ferragamo Eyeglasses, style SF2789 is a square acetate full-eye frame. The SF27..

Price: $89.99


          Crude Oil Technical Analysis: Black Gold continues its descent as bearish EIA data sends WTI to $61.00 a barrel      Cache   Translate Page      
  • Crude oil is trading in a strong bear trend below the 50, 100 and 200-period simple moving average on the 4-hour chart.
  • In reaction to the EIA data, oil lost almost $1 in a few minutes. Oil inventories rose by 5.78M barrels versus the 2.43M expected in the week to November 2, which was seen as very bearish by the market. 
  • Oil is now continuing its descent, reaching the 61.00 figure. While some consolidation up towards 62.00 can take place, the momentum remains clearly to the downside. 

Crude oil WTI 4-hour chart

Main Trend:              Bearish

Resistance 1:           61.81 April 6 low
Resistance 2:           63.00 figure
Resistance 3:           63.59 June 18 low
Resistance 4:           64.00 figure

Support 1:             61.00 figure 
Support 2:             59.95 March 8 low
Support 3:             58.07 February 9 low
Support 4:             55.82 December 7, 2017 low



 


          United States EIA Crude Oil Stocks change registered at 5.78M above expectations (2.43M) in November 2      Cache   Translate Page      

          EUR/USD Technical Analysis: Euro finds support at the 1.1470 level as new bulls join the up move      Cache   Translate Page      
  • EUR/USD is trading in a bear trend below the 200-period simple moving average on the 4-hour chart.
  • EUR/USD is riding the bullish momentum. New York bulls showed up near 1.1470 and are helping to support the market. The next big hurdle to overcome is the 1.1500 figure. 
  • A break of the level should open the gates to the 1.1600 figure in the coming sessions. 

EUR/USD 4-hour chart

Main trend:             Bearish
Short-term trend:    Bullish

Resistance 1:   1.1500 figure and October 2 swing low 
Resistance 2:   1.1530 August 23 swing low (key level)
Resistance 3:   1.1600 figure

Support 1:   1.1456 November 5 high
Support 2:   1.1430 October 9 low
Support 3:   1.1400 figure
Support 4:   1.1350 figure
Support 5:   1.1300 current 2018 low


Additional key levels at a glance:

EUR/USD

Overview:
    Last Price: 1.1478
    Daily change: 60 pips
    Daily change: 0.525%
    Daily Open: 1.1418
Trends:
    Daily SMA20: 1.1453
    Daily SMA50: 1.1557
    Daily SMA100: 1.1586
    Daily SMA200: 1.1857
Levels:
    Daily High: 1.1438
    Daily Low: 1.1392
    Weekly High: 1.1456
    Weekly Low: 1.1302
    Monthly High: 1.1625
    Monthly Low: 1.1302
    Daily Fibonacci 38.2%: 1.142
    Daily Fibonacci 61.8%: 1.1409
    Daily Pivot Point S1: 1.1394
    Daily Pivot Point S2: 1.1369
    Daily Pivot Point S3: 1.1347
    Daily Pivot Point R1: 1.144
    Daily Pivot Point R2: 1.1462
    Daily Pivot Point R3: 1.1487

 


          Canada Ivey Purchasing Managers Index s.a came in at 61.8, above forecasts (50.9) in October      Cache   Translate Page      

          Canada Ivey Purchasing Managers Index rose from previous 56.5 to 64.6 in October      Cache   Translate Page      

          Turkey Treasury Cash Balance increased to -1.47B in October from previous -8.18B      Cache   Translate Page      

          Gold Technical Analysis: Yellow Metal falling fast from daily highs and breaking below $1,230.00/oz      Cache   Translate Page      
  • Gold is trading in a bull trend above its 200-period simple moving average (SMA) on the 4-hour chart.
  • Gold is retreating from daily highs and testing the 50 and 100 SMAs as it breaks below $1,230.00 a troy ounce. The RSI, MACD and Stochastic are decelerating, suggesting further losses ahead.
  • The next objective for bears is likely located near 1,220.90 (July 18 low).

Gold 4-hour chart


Main trend:                 Bullish


Resistance 1:            1,237.60 July 3 swing low
Resistance 2:            1,250.00 figure
Resistance 3:            1,265.90 July high
Resistance 4:            1,300.00 figure

Support 1:            1,220.90 July 18 low
Support 2:            1,211.17 July 19 low 
Support 3:            1,204.10, August 3 swing low (key level)
Support 4:            1,182.90 August 24 low
Support 5:            1,172.82 current 2018 low

 

Additional key levels at a glance:

XAU/USD

Overview:
    Last Price: 1228.6
    Daily change: 200 pips
    Daily change: 0.161%
    Daily Open: 1226.62
Trends:
    Daily SMA20: 1227.86
    Daily SMA50: 1211.76
    Daily SMA100: 1207.15
    Daily SMA200: 1245.57
Levels:
    Daily High: 1235.94
    Daily Low: 1223.2
    Weekly High: 1237.6
    Weekly Low: 1211.8
    Monthly High: 1243.43
    Monthly Low: 1182.54
    Daily Fibonacci 38.2%: 1228.07
    Daily Fibonacci 61.8%: 1231.07
    Daily Pivot Point S1: 1221.23
    Daily Pivot Point S2: 1215.85
    Daily Pivot Point S3: 1208.49
    Daily Pivot Point R1: 1233.97
    Daily Pivot Point R2: 1241.33
    Daily Pivot Point R3: 1246.71
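
The "Daily Fibonacci 38.2%/61.8%" rows in these tables are consistent with retracements measured up from the daily low, i.e. level = low + ratio * (high - low). A minimal sketch follows, with the anchor convention inferred from the published numbers (gold's low 1223.20 and high 1235.94 reproduce the 1228.07 and 1231.07 above):

```python
# Fibonacci retracement levels as these tables appear to compute them:
# measured up from the daily low. The anchor convention is an inference
# from the published figures, not a statement from the source.

def fib_retracements(high: float, low: float, ratios=(0.382, 0.618)) -> dict:
    rng = high - low
    return {f"{ratio:.1%}": low + ratio * rng for ratio in ratios}

print(fib_retracements(high=1235.94, low=1223.20))
# {'38.2%': 1228.067..., '61.8%': 1231.073...}
```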

 


          Sr Principal Infra Engg - Mphasis - Bengaluru, Karnataka      Cache   Translate Page      
Primary Comp &: Cloud Computing - Azure Architect | Skill Percentage: 100 | Skill Version: 1 | Proficiency: Advanced (>5 & <=9 yrs) | EDUCATION: ITO NICHE SKILLS -...
From Mphasis - Tue, 06 Nov 2018 12:28:56 GMT - View all Bengaluru, Karnataka jobs
          Processing Github Webhooks with Azure Functions      Cache   Translate Page      
I recently had a need to react to a push event on a GitHub repository and do some data processing on what was pushed. GitHub exposes webhooks for a wide range of events, so I looked into using Azure Functions and webhooks to respond to the GitHub push event. This turned out to be very...
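
The post does not include its code, but the receiving end of a GitHub push webhook typically looks like the following: an HTTP-triggered function that verifies GitHub's documented X-Hub-Signature-256 HMAC header before trusting the payload. A minimal sketch using the Azure Functions Python v2 programming model; the route, app-setting name and the "touched files" processing step are illustrative assumptions.

```python
# Sketch of a GitHub push-webhook receiver as an HTTP-triggered Azure Function.
# The X-Hub-Signature-256 header and HMAC-SHA256 scheme are GitHub's documented
# contract; the route, env-var name and processing step are illustrative.

import hashlib
import hmac
import json
import os

import azure.functions as func

app = func.FunctionApp()

@app.route(route="github-webhook", auth_level=func.AuthLevel.ANONYMOUS)
def github_webhook(req: func.HttpRequest) -> func.HttpResponse:
    secret = os.environ["GITHUB_WEBHOOK_SECRET"].encode()  # shared webhook secret
    body = req.get_body()

    # GitHub sends "sha256=<hexdigest>" computed over the raw request body.
    expected = "sha256=" + hmac.new(secret, body, hashlib.sha256).hexdigest()
    received = req.headers.get("X-Hub-Signature-256", "")
    if not hmac.compare_digest(expected, received):
        return func.HttpResponse("bad signature", status_code=401)

    if req.headers.get("X-GitHub-Event") != "push":
        return func.HttpResponse("ignored", status_code=202)

    payload = json.loads(body)
    touched = [f for c in payload.get("commits", [])
               for f in c.get("added", []) + c.get("modified", [])]
    # ... do the actual data processing on `touched` here ...
    return func.HttpResponse(f"processed {len(touched)} files", status_code=200)
```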
          Source Control for Data Science – using Azure DevOps / VSTS with Jupyter Notebooks      Cache   Translate Page      
So many of you will know about https://mybinder.org/. Binder is an awesome tool that allows you to turn a Git repo into a collection of interactive Jupyter notebooks, and it allows you to open those notebooks in an executable environment, making your code immediately reproducible by anyone, anywhere. Jupyter Notebooks in the cloud: Another great interactive...
          Azure Data Architecture Guide – Blog #3: Advanced analytics and deep learning      Cache   Translate Page      
We'll continue to explore the Azure Data Architecture Guide with our third blog entry in this series. The previous entries for this blog series are:
- Azure Data Architecture Guide – Blog #1: Introduction
- Azure Data Architecture Guide – Blog #2: On-demand big data analytics
Like the previous post, we'll work from a technology implementation seen directly in...
          Sr Content Developer, Azure Data Engineering - Microsoft - Redmond, WA      Cache   Translate Page      
Technical training, and/or instructional design knowledge or experience highly valued. Work with related Microsoft Technical and Business Groups to curate...
From Microsoft - Tue, 06 Nov 2018 12:37:40 GMT - View all Redmond, WA jobs
          Sr Content Developer, Azure Data Scientist - Microsoft - Redmond, WA      Cache   Translate Page      
Technical training, and/or instructional design knowledge or experience highly valued. Work with related Microsoft Technical and Business Groups to curate...
From Microsoft - Tue, 06 Nov 2018 12:37:40 GMT - View all Redmond, WA jobs
          Azure CVP Charlotte Yarkoni: We all have the power to fly—find your wings with lifelong learning      Cache   Translate Page      

The post Azure CVP Charlotte Yarkoni: We all have the power to fly—find your wings with lifelong learning appeared first on Stories.


          November 2012 Chicago IT Architects Group Meeting Recap      Cache   Translate Page      

Originally posted on: http://blog.geekypedia.net/archive/2012/11/21/november-2012-chicago-it-architects-group-meeting-recap.aspx

So the year is coming to an end.  A hearty few came out two days before Thanksgiving to discuss adopting agile in the enterprise.  While Norm Murrin claimed to be nervous about talking in front of a group, you wouldn’t have known it from his presentation.  He made a topic that has always been hard to relate to feel very personal.  This led to some great discussion.  I came out of it looking for ways to investigate agile further.  His presentation can be found here.

This was our last meeting for the year.  We are looking forward to next year and are starting to line up some speakers and topics.  At this point we have an Azure presentation coming in February and are ironing out talks for January and March.  If you would like to join us and have topics you would like to see presented, contact me through this blog.  Either leave a comment here or use the contact page.  I would love to hear from you.

Have a great holiday season and we will see you next year.


          World-Leading Internet Scientific and Technological Achievement Released: Azure Sphere - a Microcontroller-Based IoT Security Solution      Cache   Translate Page      

People's Daily Online, Wuzhen, November 7 (reporters Yan Shuai and Zhao Guangxia) - The release event for the World-Leading Internet Scientific and Technological Achievements of the 5th World Internet Conference was held this afternoon at the Wuzhen Internet International Convention and Exhibition Center. Harry Shum presented Azure Sphere, a microcontroller-based IoT security solution, on site:

Azure Sphere is a solution for building highly secure, internet-connected microcontroller devices. It comprises three components - Azure Sphere certified microcontrollers, the Azure Sphere operating system and the Azure Sphere Security Service - which work together to protect and empower billions of devices. The Azure Sphere certified microcontrollers provide connectivity and a dependable hardware root of trust, the Azure Sphere operating system provides a trustworthy IoT platform, and the Azure Sphere Security Service protects every Azure Sphere device.


          第五届世界互联网大会发布15项世界互联网领先科技成果      Cache   Translate Page      

人民网乌镇11月7日电(记者 燕帅 赵光霞)第五届世界互联网大会世界互联网领先科技成果发布活动今日下午在乌镇互联网国际会展中心举行,发布活动由中国工程院院士、中国互联网协会理事长邬贺铨主持,活动现场发布了15项世界互联网领先科技成果:

1. WeChat Mini Programs business model innovation

2. Huawei Ascend 310 AI processor

3. Ant Financial's independently controllable, financial-grade commercial blockchain platform

4. Efficient interface interoperability technology for breaking down information silos, and the Yanyun DaaS system

5. Amazon SageMaker

6. 360 Security Brain: a distributed intelligent network security defense system

7. Intelligent supply chain technology service platform

8. Apollo open autonomous driving platform

9. Arm China AI Platform Zhouyi

10. Tesla intelligent after-sales service

11. supET industrial internet platform

12. The world's first fully integrated 5G NR mmWave and sub-6 GHz RF modules

13. CPU hardware security dynamic monitoring and control technology

14. Azure Sphere: a microcontroller-based IoT security solution

15. Xiaomi's artificial intelligence open platform for the smart home 


          ACR Appropriateness Criteria® Local-regional Recurrence (LR) and Salvage Surgery: Breast Cancer      Cache   Translate Page      
Despite the success of both breast conserving surgery and mastectomy, some women will experience a local-regional recurrence (LRR) of their breast cancer. Predictors for LRR after breast-conserving therapy or mastectomy have been identified, including patient, tumor, and treatment-related factors. The role of surgery, radiation, and chemotherapy as treatment has evolved over time and many patients now have the potential for salvage after LRR. This review of LRR of breast cancer and management recommendations, including the use of common clinical scenarios, represents a compilation of evidence-based data and expert opinion of the American College of Radiology Appropriateness Criteria Expert Panel on local-regional recurrence. The American College of Radiology Appropriateness Criteria are evidence-based guidelines for specific clinical conditions that are reviewed every 2 years by a multidisciplinary expert panel. The guideline development and review include an extensive analysis of current medical literature from peer-reviewed journals and the application of a well-established consensus methodology (modified Delphi) to rate the appropriateness of imaging and treatment procedures by the panel. In those instances where evidence is lacking or not definitive, expert opinion may be used to recommend imaging or treatment.
          Targeting Tumor Metabolism With Statins During Treatment for Advanced-stage Pancreatic Cancer      Cache   Translate Page      
Introduction: A growing body of preclinical data suggests that statins may exert potent antitumor effects, yet the interactions of these medications with standard therapies and clinical outcomes in this population are less clear. We assessed the impact of statin use on outcomes in patients with advanced-stage pancreatic adenocarcinoma undergoing various treatments. Materials and Methods: After institutional review board approval, we conducted a retrospective-cohort study consisting of 303 newly diagnosed advanced-stage pancreatic adenocarcinoma patients to determine the impact of statin use on outcomes. Univariate and multivariable Cox proportional hazard regression models were utilized to estimate hazard ratios (HRs). Time-to-event was estimated using Kaplan-Meier survival analysis for overall survival, distant metastasis, and locoregional failure. Baseline and active statin usage were assessed and, to mitigate risk of immortal time bias, a subanalysis excluding patients with under 6 months of follow-up was conducted. Results: Both prior (P=0.021) and active (P=0.030) statin usage correlated with improved survival in this cohort. Surgery, chemoradiation, and statin use improved 2-year survival rates (84.1% vs. 55.0%; P<0.001). On multivariable analysis, statin exposure was associated with overall survival (HR, 0.662; P=0.027) and trended to significance for freedom from distant metastasis (HR, 0.577; P=0.060). Comorbid conditions were not significantly associated with outcomes. Conclusions: Statin use was associated with improved overall survival in advanced-stage pancreatic adenocarcinoma patients. This data supports previous findings in early-stage pancreatic adenocarcinoma and other cancer sites. To our knowledge this is the first report to examine the efficacy of statin use as a supplementary treatment option in advanced-stage pancreatic adenocarcinoma patients.
          Treatment of Adult Rhabdomyosarcoma      Cache   Translate Page      
Objectives: Rhabdomyosarcoma is an exceedingly rare tumor in adults, and standard chemotherapy used for children is much less effective in adults. This study examines short-term outcomes using doxorubicin, ifosfamide, and vincristine for adult rhabdomyosarcoma. Methods: Pathology records were searched for adults (age, >18) with rhabdomyosarcoma treated at our musculoskeletal tumor center. Treatment involved surgical resection, radiation therapy, and chemotherapy with doxorubicin, ifosfamide, and vincristine. Eleven met inclusion criteria. Mean age was 49 (range: 19–72). Tumors sites included upper extremity (4 patients), lower extremity (6), and cervix (1). Subtypes were pleomorphic (7), alveolar (1), embryonal (1), and mixed alveolar/embryonal (2). Results: Of the 7 patients with nonmetastatic disease, 6 had no evidence of disease posttreatment, but 1 died of myelodysplastic syndrome after 51 months. Three patients who received neoadjuvant chemotherapy had 100% tumor necrosis. One patient with positive margins scheduled for adjuvant chemotherapy had local recurrence and metastasis within 2 weeks and died 5 months later. Of the 4 patients with metastatic disease on presentation, 1 had complete response, 2 had partial response with later progression and death at 8 and 24 months, and 1 had immediate progression and died at 12 months. Mean overall survival was 24 months with 6 of 11 (55%) alive at last follow-up. Mean disease-free survival was 17 months for all patients and 23 months for the 7 patients who had remission of all disease. Conclusions: When combined with surgery and radiation therapy, chemotherapy using doxorubicin, ifosfamide, and vincristine yielded 55% overall and 64% disease-free survival at 2 years.
          IT Infrastructure Engineer - Vire Tech - Lahore      Cache   Translate Page      
Design, implementation and administration of Windows 2008r2/2012r2/2016, MS-Azure, Microsoft Exchange, and Active Directory....
From Vire Tech - Wed, 31 Oct 2018 19:54:32 GMT - View all Lahore jobs
          My tweets      Cache   Translate Page      

          Acronis Data Cloud 7.8: protection for Office 365, and backup comes to Google Cloud and Azure      Cache   Translate Page      
Acronis Data Cloud 7.8 enables cloud-to-cloud backup for Exchange Online, OneDrive for Business, and SharePoint Online. With Acronis Notary Cloud comes a blockchain-based service for file authentication, electronic signatures, and data verification.
          Azure CTO Mark Russinovich's top ten public cloud security risks      Cache   Translate Page      
          Microsoft admits running out of IP addresses for Azure      Cache   Translate Page      
          Huawei and Microsoft join forces on "cloud computing"      Cache   Translate Page      

Huawei and Microsoft have launched an all-flash Azure Stack solution that helps businesses accelerate hybrid cloud services, improve data reliability, and enhance the end-user experience.
          Azure Event Hubs for Apache Kafka | Azure Friday      Cache   Translate Page      

Shubha Vijayasarathy joins Scott Hanselman to discuss Azure Event Hubs, which makes data ingestion simple, secure, and scalable. As a distributed streaming platform, Event Hubs enables you to stream your data from any source—storing and processing millions of events per second—so you can build dynamic data pipelines and respond to business challenges in real time.

With Azure Event Hubs for Apache Kafka, we're bringing together two powerful distributed streaming platforms, so you can access the breadth of Kafka ecosystem applications without having to manage servers or networks. Event Hubs for Kafka provides a Kafka endpoint so that any Kafka client running Kafka 1.0 or newer protocols can publish/subscribe events to/from Event Hubs with a simple configuration change.
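To make that "simple configuration change" concrete, here is a minimal consumer sketch using the open-source confluent-kafka Python client against an Event Hubs Kafka endpoint. The namespace, event hub, and consumer group names below are hypothetical placeholders; only the connection settings differ from a plain Kafka setup.

    # Minimal sketch: consume from Event Hubs via its Kafka endpoint.
    # "mynamespace" and "myeventhub" are hypothetical placeholders.
    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "mynamespace.servicebus.windows.net:9093",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "$ConnectionString",  # literal value expected by Event Hubs
        "sasl.password": "<your Event Hubs connection string>",
        "group.id": "example-consumer-group",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["myeventhub"])  # an event hub appears as a Kafka topic

    while True:
        msg = consumer.poll(1.0)        # wait up to 1 s for the next event
        if msg is None:
            continue
        if msg.error():
            print(msg.error())
            continue
        print(msg.key(), msg.value())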


          SAP HANA infrastructure automation with Terraform and Ansible | Azure Friday      Cache   Translate Page      

Page Bowers and Donovan Brown discuss how SAP customers are moving to Azure to take advantage of SAP-certified HANA virtual machines such as Azure M-series. Learn how you can use Terraform and Ansible to speed up SAP HANA deployments on Azure in 30 minutes as opposed to hours or days.



          Sr Principal Infra Engg - Mphasis - Bengaluru, Karnataka      Cache   Translate Page      
Primary competency: Cloud Computing - Azure Architect (skill percentage: 100; skill version: 1; proficiency: Advanced, >5 and <=9 yrs). EDUCATION: ITO NICHE SKILLS -...
From Mphasis - Tue, 06 Nov 2018 12:28:56 GMT - View all Bengaluru, Karnataka jobs
          Apartment for sale in Mazurén, Bogotá      Cache   Translate Page      
300000000
Comfortable and welcoming apartment in a quiet area, two blocks from the Mazurén shopping center, with easy access to public transport and surrounded by a variety of shops and schools, allowing families an excellent quality of life; the...
3 bedrooms, 2 bathrooms, well connected
Tue, 06 Nov 2018 15:31:38 +0100
          Offer - Speakeasy Therapy Services, LLC - USA      Cache   Translate Page      
Speak Easy provides therapy services to those who cannot speak properly. We serve patients with our Las Vegas speech therapy, Las Vegas pediatric speech therapy, and oral myofunctional therapy. Our therapy includes both voice and speech/language therapy readily available for both adults and kids. The entire therapy process may include the correction or improvement of voice, language, accent, speech, and even dysphagia. We also offer feeding therapy and pediatric rehabilitation.Address: 7425 W Azure Dr, Ste 140, Las Vegas, NV 89016, United StatesWebsite URL: http://www.speakeasytherapylv.org/Business Email: speach_therapy@yopmail.com
          J2EE Developer with Azure - Rawson BPO - Madrid, Spain      Cache   Translate Page      
At Rawson BPO we are recruiting a microservices developer for Azure Cloud to join a major IT project. Knowledge of Azure, Spring Boot, and Java. Central Madrid - Calle Serrano. STABLE project - PERMANENT contract. Salary negotiable depending on the experience each candidate brings.
          Moovit to provide public transit data for Microsoft Azure Maps      Cache   Translate Page      
Moovit announced that it will integrate its public transit information into Azure Maps to help developers build richer applications for the billions of people who commute around the world. As part of the partnership, Moovit will run its public transit data and API services on Microsoft Azure and will gradually migrate the
          Natural Amazonite Multiple Color Graduated Faceted Round Beads, 15.5-Inch Strand G01224 by MapleRanch      Cache   Translate Page      

7.99 USD

Sold per 15.5-inch strand.

This semi-opaque blue-green variety of feldspar is named after the Amazon River.

Amazonite balances feminine and masculine energy. It promotes kindness and practicality. It is an excellent stone for artists and for men.

Pale azure blue amazonite is known as the lucky "Hope Stone". It will bring luck to all your hopes and dreams. Amazonite is a blue-green to pale green stone in the feldspar group. It comes mainly from the United States and Australia.

Amazonite helps balance the emotions and gives physical stamina.

Please refer to
http://en.wikipedia.org/wiki/Amazonite
for more information about this beautiful stone.

Please feel free to contact us for additional product information or quantity order quotes.


          Talent management best practices: How exemplary health care organizations create value in a down economy      Cache   Translate Page      
Background: Difficult economic conditions and powerful workforce trends pose significant challenges to managing talent in health care organizations. Although robust research evidence supports the many benefits of maintaining a strong commitment to talent management practices despite these challenges, many organizations compound the problem by resorting to workforce reductions and limiting or eliminating investments in talent management. Purpose: This study examines how nationwide health care systems address these challenges through best practice talent management systems. Addressing important gaps in talent management theory and practice, this study develops a best practice model of talent management that is grounded in the contextual challenges facing health care practitioners. Methodology: Utilizing a qualitative case study that examined 15 nationwide health care systems, data were collected through semistructured interviews with 30 executives and document analysis of talent management program materials submitted by each organization. Findings: Exemplary health care organizations employ a multiphased talent management system composed of six sequential phases and associated success factors that drive effective implementation. Based on these findings, a model of talent management best practices in health care organizations is presented. Practice Implications: Health care practitioners may utilize the best practice model to assess and enhance their respective talent management systems by establishing the business case for talent management, defining, identifying, and developing high-potential leaders, carefully communicating high-potential designations, and evaluating talent management outcomes.
          The Impact of Ethical Climate on Job Satisfaction of Nurses      Cache   Translate Page      
This article examines the impact of ethical climate types (shared perception of how ethical issues should be addressed and what is ethically correct behavior) on various facets of job satisfaction of nurses in a large nonprofit private hospital. The results of the study indicate that hospitals may be able to enhance job satisfaction of nurses by influencing the organization's ethical climate.
          Disseminated histoplasmosis in five immunosuppressed patients: clinical, diagnostic, and therapeutic perspectives      Cache   Translate Page      
Disseminated histoplasmosis is a relatively uncommon manifestation of a disease that primarily affects immunocompromised hosts. Five immunosuppressed patients (four with AIDS and one with a liver transplant) presented with fever, pancytopenia, markedly elevated lactate dehydrogenase (LDH), and nodular pulmonary infiltrates (one miliary pattern). One patient had concomitant diffuse papular and purpuric skin lesions. All five originated from areas of Histoplasma endemicity (Puerto Rico, El Salvador, Brazil, and the Dominican Republic). While histoplasmosis was suspected clinically and epidemiologically, diagnosis was primarily achieved by visualization of phagocytosed Histoplasma yeast cells in peripheral blood smears, bronchoalveolar lavage, and in biopsy specimens. All four AIDS patients showed elevated urine Histoplasma antigen and LDH levels, whereas the liver transplant recipient had a false negative urine Histoplasma antigen and a normal LDH. With the exception of one AIDS patient (in whom diagnosis was delayed), all responded to induction therapy with amphotericin B followed by itraconazole. Disseminated histoplasmosis should be suspected in immunosuppressed individuals who originate from areas of endemicity and present with pancytopenia, fevers, nodular infiltrates, and elevated LDH.
          Plesiomonas shigelloides: an emerging pathogen with unusual properties      Cache   Translate Page      
Plesiomonas shigelloides is a Gram-negative, motile and oxidase-positive bacterium that is widely distributed in nature. It is also a significant pathogen causing mainly intestinal diseases in humans. Diarrhoea is the leading symptom of most of these infections and may occur as a watery, invasive and chronic form. Food- and water-borne outbreaks of intestinal infections due to Plesiomonas have been reported. P. shigelloides also causes a variety of extraintestinal infections with high mortality rates; sepsis and meningitis represent the most common extraintestinal forms of disease. In spite of its close phylogenetic relationship to other Enterobacteriaceae, P. shigelloides is biochemically distant from other species of this family. One biovar, but more than 100 serovars have been described. P. shigelloides is thermo-, alkali-, acido- and halotolerant and has been discussed as a ‘natural’ vaccine against shigellosis. In the laboratory, Plesiomonas appears inconspicuous on the surface of several agar plates, and some enteric media are known to inhibit its growth. Plesiomonas shows unusual antibiotic susceptibility patterns, its susceptibility to some agents is highly dependent on inoculum size. The formation of extensive cell filamentation at high bacterial densities in the presence of certain β-lactam antibiotics is distinctive.
          Fatal multiple organ failure in an adolescent due to community-acquired methicillin-susceptible Staphylococcus aureus ST121/agrIV lineage: case report and review      Cache   Translate Page      
Adolescent deaths due to Staphylococcus aureus infection are rare. We describe the case of a 17-year-old boy who died from septic multiple organ failure caused by community-acquired methicillin-susceptible S. aureus. The patient's first clinical symptom was a skin and soft tissue infection, with multiple organ failure developing 5 days afterwards. The causative organism S. aureus was further determined to carry Panton-Valentine leukocidin genes, and molecularly characterized as staphylococcal protein A (spa) type t159, multilocus sequence type (ST)121, and accessory gene regular (agr) type IV. ST121/agrIV S. aureus is one of the minor lineages prevailing in China, and this is the first fatal case reported due to ST121/agrIV lineage of community-acquired methicillin-susceptible S. aureus in the country.
          Molecular diagnostic methods for the detection of Neisseria gonorrhoeae and Chlamydia trachomatis      Cache   Translate Page      
imageAmplification methods for detection of Chlamydia trachomatis and Neisseria gonorrhoeae in genitourinary specimens have changed the way in which routine Clinical Microbiology laboratories fulfill requests for recovery of these two common agents of sexually transmitted disease. PCR, strand displacement amplification, transcription mediated amplification and signal amplification (Hybrid Capture) are available as commercial products that can be used to assay endocervical, urethral, and urine samples from female and male patients for screening in high-risk asymptomatic patients as well as for diagnosis in the symptomatic patient. Although the methods differ, the overall results are increased sensitivity and rapid detection as compared to conventional means of detection of C. trachomatis and N. gonorrhoeae. There are some limitations to the use of amplification, for example, in cases of sexual abuse, in requests for a test-of-cure within 2–3 weeks of initial detection, and in non-genitourinary specimens. Use of the assays is beginning to change the epidemiology of these sexually transmitted diseases, especially in the increased numbers of C. trachomatis that have been reported to the Centers for Disease Control in the US over the past several years.
          Azure Sales and Marketing - Ingram Micro Cloud - Bellevue, WA      Cache   Translate Page      
Are you an innovative, self-starting Marketing guru who loves helping IT providers design and...? If so, join the Ingram Micro Cloud team - where rainmakers thrive.
From Ingram Micro Cloud - Fri, 28 Sep 2018 07:14:09 GMT - View all Bellevue, WA jobs
          Sql server dba      Cache   Translate Page      
We have an urgent opening with one of our esteemed clients in Pune. We are looking for applicants with 4 to 10 years of experience as a SQL Server DBA, with a strong background in Azure/AWS, database architecture, and Oracle. Immediate to 30-day joiners only. Interested applicants can apply by sharing the information below: Total experience - Relevant experience - Current company - Notice period - Current CTC...
          Product developer      Cache   Translate Page      
Job Description: 3-5 years of development experience; Python programming; SQL programming; JavaScript development; a web development framework such as Flask or Django; experience developing and troubleshooting applications in a cloud environment, preferably GCP (a background in AWS or Azure is also fine). Salary: Not disclosed by recruiter. Desired Applicant Profile: Please refer to...
          Allround IT Specialist      Cache   Translate Page      
Allround IT Engineer - 6 Months - The Hague. Oracle, SQL, Network, Firewalls, Wifi, Windows, Linux, Azure, SaaS. Amoria Bond is looking for an experienced allround IT engineer for one of our clients in The Hague. You will be responsible for the complete application landscape and infrastructure...
          .NET Developer      Cache   Translate Page      
For a fintech start-up I am looking, effective immediately, for a .NET developer who knows the technologies below. It is also important that you can contribute to architectural decisions. Technologies: - .NET Core - C# - AWS (or Azure) - PostgreSQL - Docker - TFS/GIT - CI/CD - Entity Framework. The client can move quickly...
          Cloud computing: case studies on Microsoft Azure      Cache   Translate Page      
Data processing is the watchword, and the challenge, for companies today. Whether for administration, for marketing, or to present customers with a better and more individual brand experience: behind all of it lies the rapid capture, preparation, and provision of data sets.
          Automating SAP deployments in Microsoft Azure using Terraform and Ansible      Cache   Translate Page      

Deploying complex SAP landscapes into a public cloud is not an easy task. While SAP basis teams tend to be very familiar with the traditional tasks of installing and configuring SAP systems on-premise, additional domain knowledge is often required to design, build, and test cloud deployments.

There are several options to take the guesswork out of tedious and error-prone SAP deployment projects into a public cloud:

  • One way to get started is the SAP Cloud Appliance Library (CAL), a repository of numerous SAP solutions that can be directly deployed into a public cloud. However, apart from its cost, CAL only contains pre-configured virtual machine (VM) images, so configuration changes are hard or impossible.
  • A free alternative has been to use SAP Quickstart Templates offered by most public cloud providers. Typically written in a shell script or a proprietary language, these templates offer some customization options for pre-defined SAP scenarios. For example, Azure’s ARM templates offer one-click deployments of SAP HANA and other solutions directly in Azure Portal.

While both solutions are great starting points, they usually lack configuration options and flexibility required to build up an actual, production-ready SAP landscape.

Based on feedback from actual customers who move their SAP landscapes into the cloud, the truth is that existing Quickstart Templates rarely go beyond “playground” systems or proof-of-concepts; they are too rigid and offer too little flexibility to map real-life business and technical requirements.

This is why we, the SAP on Microsoft Azure Engineering team, decided to go in the opposite direction: instead of offering “one-size-fits-all” templates for limited SAP scenarios that can hardly be adapted (let alone extended), we broke down SAP deployments in Azure to the most granular level and offer “building blocks” for a truly customizable, yet easy-to-use experience.

A new approach to automating SAP deployments in the cloud

In this new, modular approach to automating even more complex SAP deployments in Azure, we developed a coherent collection of:

  • Terraform modules, which deploy the infrastructure components (such as VMs, network, and storage) in Azure and then call the:
  • Ansible playbook, which calls different:
  • Ansible roles to install and configure the OS and SAP applications on the deployed infrastructure in Azure.

[Figure: Flow diagram of the Terraform/Ansible SAP automation templates.]
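
This flow can be driven end to end from any scripting environment. The following Python sketch is not part of the repository; it simply illustrates the sequence, assuming a working directory ("deploy") containing the Terraform configuration, an inventory file, and a site.yml playbook, all of which are hypothetical names here.

    # Illustrative wrapper: Terraform provisions the Azure infrastructure,
    # then Ansible configures the OS and installs SAP on the new hosts.
    import subprocess

    def run(cmd, cwd="deploy"):
        """Echo and run a command, failing fast on a non-zero exit code."""
        print("+", " ".join(cmd))
        subprocess.run(cmd, cwd=cwd, check=True)

    # 1. Terraform modules deploy VMs, network, and storage in Azure.
    run(["terraform", "init"])
    run(["terraform", "apply", "-auto-approve"])

    # 2. The Ansible playbook calls roles that install and configure
    #    the OS and SAP applications on the deployed infrastructure.
    run(["ansible-playbook", "-i", "inventory", "site.yml"])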

An important design consideration was to keep all components as open and flexible as possible; although nearly every parameter on both the Azure and SAP sides can be customized, most are optional. In other words, you can be spinning up your first SAP deployment in Azure within 10 minutes by using one of our boilerplate configuration templates, but you can also use our modules and roles to build up a much more complex landscape.


[Figure: A sample deployment of an SAP HANA high-availability pair.]

For your convenience, Terraform and Ansible are pre-installed in your Azure Cloud Shell, so the templates can be run directly from there with minimal configuration. Alternatively, you can, of course, use them from your local machine or any VM as well.

While the repository is published and maintained by Microsoft Azure, the project is community-driven and we welcome any contributions and feedback.

Starting with SAP HANA, but a lot more to come

When we started building our Terraform and Ansible templates a few months ago, we decided to start out our engineering process with HANA. SAP’s flagship in-memory database is the underlying platform and de-facto standard of most modern SAP enterprise applications, including S/4HANA and BW/4HANA. If you’ve ever built an SAP HANA high-availability cluster from scratch, you’ll appreciate that we’ve taken the guesswork out of this complex task and aligned our templates to the public cloud reference architectures certified by SAP.

Currently, our Terraform/Ansible templates support the following two options (more application-specific scenarios are currently being worked on):

HANA single-node instance

  • Single-node HANA instance.

[Figure: Single-node HANA instance.]

HANA high-availability pair

  • Single-node HANA instance, two-tier replication (primary/secondary) via HSR.
  • Pacemaker high-availability cluster, fully configured with SBD and SAP/Azure resource agents.

[Figure: HANA high-availability pair.]

Since our key focus was to offer the greatest amount of flexibility possible, virtually every aspect of the SAP HANA landscape can be customized (see the sample configuration sketch after this list), including:

  • Sizing (choose any supported Azure VM SKU).
  • High-availability (in the high-availability pair scenario, choose to use availability sets or availability zones).
  • Bastion host (optionally, choose from a Windows and/or Linux “jump box” including HANA Studio).
  • Version (currently, HANA 1.0 SPS12 and HANA 2.0 SPS2 or higher are supported).
  • XSA applications (optionally, enable XSA application server and choose from a set of supported applications like HANA Cockpit or SHINE).
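
To make the parameterization tangible, the sketch below writes a Terraform variable file from Python. Every variable name is illustrative rather than the repository's actual schema; Terraform will, however, consume any JSON file passed via -var-file.

    # Hypothetical configuration covering the customization points above.
    import json

    config = {
        "vm_size": "Standard_M64s",                # sizing: any supported Azure VM SKU
        "use_availability_zones": True,            # HA pair via zones instead of availability sets
        "bastion_os": "windows",                   # optional jump box, e.g. with HANA Studio
        "hana_version": "2.0 SPS3",                # HANA 1.0 SPS12 or 2.0 SPS2 and higher
        "xsa_applications": ["cockpit", "shine"],  # optional XSA application server content
    }

    with open("hana-demo.tfvars.json", "w") as f:
        json.dump(config, f, indent=2)

    # Afterwards: terraform apply -var-file=hana-demo.tfvars.json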

[Figure: XSA SHINE demo content for HANA.]

It’s worth noting that all scenarios come with “fill-in-the-blanks” boilerplate configuration templates and step-by-step instructions to help you get started.

Getting started is easy

Got a few minutes? In our popular Azure Friday series, our team member Page Bowers walks through an SAP HANA live deployment using our Terraform/Ansible templates.

Want to jump right in? Visit our GitHub repository and follow the “Getting Started” guide – you’ll be building up your first SAP landscapes in the Azure cloud in no time!


          Announcing the general availability of Azure Event Hubs for Apache Kafka®      Cache   Translate Page      

In today’s business environment, with the rapidly increasing volume of data and the growing pressure to respond to events in real-time, organizations need data-driven strategies to gain valuable insights faster and increase their competitive advantage. To meet these big data challenges, you need a massively scalable distributed streaming platform that supports multiple producers and consumers, connecting data streams across your organization. Apache Kafka and Azure Event Hubs provide such distributed platforms.

How is Azure Event Hubs different from Apache Kafka?

Apache Kafka and Azure Event Hubs are both designed to handle large-scale, real-time stream ingestion. Conceptually, both are distributed, partitioned, and replicated commit log services. Both use partitioned consumer models with a client-side cursor concept that provides horizontal scalability for demanding workloads.

Apache Kafka is an open-source streaming platform which is installed and run as software. Event Hubs is a fully managed service in the cloud. While Kafka has a rapidly growing, broad ecosystem and has a strong presence both on-premises and in the cloud, Event Hubs is a cloud-native, serverless solution that gives you the freedom of not having to manage servers or networks, or worry about configuring brokers.

Announcing Azure Event Hubs for Apache Kafka

We are excited to announce the general availability of Azure Event Hubs for Apache Kafka. With Azure Event Hubs for Apache Kafka, you get the best of both worlds—the ecosystem and tools of Kafka, along with Azure’s security and global scale.

This powerful new capability enables you to start streaming events from applications using the Kafka protocol directly into Event Hubs, simply by changing a connection string. Enable your existing Kafka applications, frameworks, and tools to talk to Event Hubs and benefit from the ease of a platform-as-a-service solution; you don’t need to run Zookeeper, or manage and configure your clusters.
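
For example, a producer that previously pointed at a Kafka broker needs only new connection settings to publish into Event Hubs instead. A minimal sketch with the open-source confluent-kafka Python client follows; the namespace and event hub names are hypothetical, and the connection string is the one shown for your Event Hubs namespace in the Azure portal.

    # Minimal sketch: publish events to Event Hubs via its Kafka endpoint.
    # Everything except the connection settings is ordinary Kafka code.
    from confluent_kafka import Producer

    producer = Producer({
        "bootstrap.servers": "mynamespace.servicebus.windows.net:9093",  # hypothetical namespace
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "PLAIN",
        "sasl.username": "$ConnectionString",  # literal value expected by Event Hubs
        "sasl.password": "Endpoint=sb://mynamespace.servicebus.windows.net/;"
                         "SharedAccessKeyName=<policy>;SharedAccessKey=<key>",
    })

    producer.produce("myeventhub", key=b"device-1", value=b'{"reading": 42}')
    producer.flush()  # block until the event is acknowledged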

Event Hubs for Kafka also allows you to easily unlock the capabilities of the Kafka ecosystem. Use Kafka Connect or MirrorMaker to talk to Event Hubs without changing a line of code. Find the sample tutorials on our GitHub.

This integration not only allows you to talk to Azure Event Hubs without changing your Kafka applications, you can also leverage the powerful and unique features of Event Hubs. For example, seamlessly send data to Blob storage or Data Lake Storage for long-term retention or micro-batch processing with Event Hubs Capture. Easily scale from streaming megabytes of data to terabytes while keeping control over when and how much to scale with Auto-Inflate. Event Hubs also supports Geo Disaster-Recovery. Event Hubs is deeply-integrated with other Azure services like Azure Databricks, Azure Stream Analytics, and Azure Functions so you can unlock further analytics and processing.

Event Hubs for Kafka supports Apache Kafka 1.0 and later through the Apache Kafka protocol, which we have mapped to our native AMQP 1.0 protocol. In addition to providing compatibility with Apache Kafka, this protocol translation allows other AMQP 1.0 based applications to communicate with Kafka applications. JMS-based applications can use Apache Qpid™ to send data to Kafka-based consumers.

Open, interoperable, and fully managed: Azure Event Hubs for Apache Kafka.

Next steps

Get up and running in just a few clicks and integrate Event Hubs with other Azure services to unlock further analytics.

Enjoyed this blog? Follow us as we update the features list. Leave us your feedback, questions, or comments below.

Happy streaming!


          Azure SQL Data Warehouse introduces new productivity and security capabilities      Cache   Translate Page      

SQL Data Warehouse continues to provide a best-in-class price-to-performance offering, leading others in TPC-H and TPC-DS benchmarks based on independent testing. As a result, we are seeing customers, including more than 50 percent of Fortune 1000 enterprises such as Anheuser-Busch InBev, Thomson Reuters, and ThyssenKrupp, build new analytics solutions on Azure.

With the launch of SQL Data Warehouse Gen2 in April 2018, customers have benefited tremendously from query performance and concurrency enhancements. To support our customers’ exponentially growing data volumes and the resulting analytics workloads, today we are sharing new SQL Data Warehouse features: enhanced workload management, row-level security, and improved operational experiences.


[Figure: Azure SQL Data Warehouse.]

Enhanced workload management

SQL Data Warehouse will offer workload management capabilities that optimize query execution to ensure that high-value work gets priority access to system resources. With features such as workload importance, customers can use a single SQL Data Warehouse database to run multiple workloads more efficiently, taking away the complexity of maintaining a separate data warehouse for each solution. With this new capability, SQL Data Warehouse enables better control, utilization, and optimization of deployed resources. Workload importance will be available to all SQL Data Warehouse customers later this year at no additional cost.
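
As a rough sketch of how workload importance can be expected to surface: the T-SQL below is an assumption modeled on the workload classifier syntax Microsoft has described, and the server, database, user, and classifier names are all hypothetical.

    # Assumed syntax: classify one account's queries as high importance.
    import pyodbc

    conn = pyodbc.connect(
        "Driver={ODBC Driver 17 for SQL Server};"
        "Server=myserver.database.windows.net;Database=mydw;"  # hypothetical names
        "Uid=dwadmin;Pwd=<password>;Encrypt=yes;"
    )
    cursor = conn.cursor()

    # Queries from the executive-dashboard account get priority access
    # to system resources over ad-hoc work from other users.
    cursor.execute("""
        CREATE WORKLOAD CLASSIFIER ExecDashboards
        WITH (WORKLOAD_GROUP = 'xlargerc',
              MEMBERNAME     = 'DashboardUser',
              IMPORTANCE     = HIGH);
    """)
    conn.commit()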

I