
Sitecore Plus Stylelabs: Three Takeaways #DAM #wcm

Today comes news that longtime Web Content & Experience Management vendor Sitecore will acquire upstart Digital Asset Management vendor Stylelabs.  RSG has covered both vendors practically since their inception in our WCM and DAM evaluation research streams.  Here are three quick takeaways.

1. Architectural Mismatch

Both vendors will stress that they share common roots in Microsoft technologies, but you have to go pretty far up the tree (or down the stack...) to find where they intersect. 

Sitecore is an unusually proprietary (if very powerful) .NET-based platform built for the on-premises world, struggling a bit to cloudify.  This is less of an issue for Sitecore's mostly higher-end licensees, who typically want a single-tenant instance they can heavily customize in an IaaS environment anyway.

Stylelabs, in contrast, was built to run natively on Azure as a SaaS offering.  For better or worse, the vendor maximizes available Azure services in lieu of building expansive capabilities itself.  Unlike Sitecore, Stylelabs favors tweaking by configuration rather than customization.  It's less extensible, but also less developer-intensive, than Sitecore.

2. Welcome News for Stylelabs Customers

Because of the substantial technical differences, Sitecore is likely to leave Stylelabs separate and intact, though licensees of the latter should beware Sitecore salespeople trying to cross-sell a much more complicated WCM environment.

The better news is that, like many DAM vendors, Stylelabs suffers from a somewhat weak institutional base and immature ecosystem.  Sitecore could help them beef up both, with reasonably well-respected (if finicky) tech support operations, and deep experience building global customer and developer ecosystems.

3. Not Great News for Sitecore Licensees

Sitecore has always been DAM-agnostic and steered customers to a handful of favored partners.  In many cases, these other vendors may prove a better match than Stylelabs, particularly for licensees focused mainly on supporting images for websites, where Stylelabs will prove overkill.

Prospective Sitecore licensees will want to be careful about not getting forced to license Stylelabs — recall that Sitecore has a history of overbundling disparate services — and instead make sure they find the best fit for DAM going forward, including potentially simpler solutions that might also present a tighter architectural match.

Conclusion

It's possible that Sitecore has acquired Stylelabs to learn how to transition a legacy codebase to the cloud, much the same way that Microsoft bought Yammer for the latter's SaaS experience, and not the actual microblogging technology, which Redmond never really respected.  Also, Sitecore wants to compete with Adobe, and Adobe sells a DAM, albeit a not very powerful one.

Sitecore could also see an opportunity where a more object-oriented DAM like Stylelabs could compete against content marketing vendors in the fast-emerging space of upstream, omnichannel content collaboration.  It's revealing that the Sitecore press release calls Stylelabs a "Content Marketing Vendor." Pro tip: Stylelabs is not a content marketing vendor, yet...

More likely, I think, in a DAM marketplace craving more mature vendors, Stylelabs figured out the only way to truly grow up was to attach to an existing adult, and Sitecore proved the most willing adoptive parent.

So, as a Stylelabs licensee, this is a good thing; for Sitecore customers, it's a mixed bag.

RSG licensees with "plus" subscriptions to our DAM or WCM streams are welcome to schedule an advisory call (login required) to explore in more detail.

 


Microsoft Azure Stack (70-533) Study Guide - 70-533 Exam Dumps
By Charlotteava    In Education    2 hours ago
You cannot prepare well by collecting information from different sources on your own. You need to choose a valid source and make an expertly directed effort. We have hired qualified experts at RealExamCollection.com to help students with their IT exam preparation. They have crafted Microsoft 70-533 braindumps solely to assist you in your studies while preparing for your exam. You will get all the necessary help on this platform. A series of questions and answers is included with the intention of imparting precise knowledge to IT students. There is no chance of failure if you follow the directions of our experts. The 70-533 exam dumps material is a complete package of services that helps to improve your preparation and boost your performance. By downloading this valid guidebook you get a guarantee of success. https://www.realexamcollection.com/microsoft/70-533-dumps.html
Tags: microsoft, azure, stack, 70-533, dumps, exam, braindumps, study, guide

Hybrid - Cloud Infrastructure (IaaS) Engineer/Analyst - Underwriters Laboratories - Laramie, WY
Azure experience (OMS, Application Insights, Automation, ARM templates, etc.). Contribute to a Safer, More Secure, and More Sustainable World....
From Underwriters Laboratories - Wed, 11 Jul 2018 22:56:33 GMT - View all Laramie, WY jobs
Hybrid - Cloud Infrastructure (IaaS) Engineer/Analyst - UL LLC - Laramie, WY
Azure experience (OMS, Application Insights, Automation, ARM templates, etc.). Contribute to a Safer, More Secure, and More Sustainable World....
From UL LLC - Wed, 11 Jul 2018 21:17:04 GMT - View all Laramie, WY jobs
Microsoft Invests In Southeast Asia's Grab
Redmond-based Microsoft is making an investment in Southeast Asian on-demand transportation company Grab, the two companies said late Monday evening, in a deal where Grab will adopt Microsoft Azure as its "preferred cloud" platform. The size of the investment was not announced.
Serverless cloud computing: Don’t go overboard

There are lots of big cloud shows coming up, and the core themes will be containers, devops integration, and more serverless computing services, such as databases, middleware, and dev tools.

Why the focus on serverless computing? It’s a helpful concept, where you don’t have to think about the number of resources you need to attach to a public cloud service, such as storage and compute. You just use the service, and the back-end server instances are managed for you: “magically” provisioned, used, and deprovisioned.

The serverless cloud computing concept is now white-hot in the cloud computing world, and the cloud providers are looking to cash in. Who can blame them? At the same time, you can take things to a silly level. I suspect there’ll be a few serverless concepts that jump the shark the day they are announced.



Microsoft is preparing its own streaming service – Project xCloud
Most leading companies believe in the success of streaming services and cloud gaming. Projects such as GeForce Now from NVIDIA and PlayStation Now from Sony have been available for a long time, and now Google has announced a similar service, Project Stream, while Microsoft has presented its Project xCloud.

This was reported by Gamespot.

The new development, called Project xCloud, will make it possible to run Xbox One games on PCs and other consoles. Closed testing of the new service is currently underway, and the technology will become available to the mass market next year.

For the most comfortable play you will need the wireless controller from the original Xbox One, connected via Bluetooth. If no gamepad is available, touch controls are said to be supported.

The Project xCloud technology is based on the Microsoft Azure cloud platform and uses special servers that are close in hardware and software characteristics to the Xbox One console. Incidentally, Microsoft Azure data centers are located in more than 140 countries, so a ready-made base for a successful rollout of Project xCloud already exists.

Project xCloud presentation – watch the video

 

At the moment, the minimum internet connection speed is stated at 10 Mbit/s. In addition, one of the requirements of the initial specification is that Project xCloud must work on existing 4G networks.



big data analytics
Need an experienced person in NiFi, Hive and Azure with some work experience with Hadoop. Must be available to work daily, max 2 hours a day. Some experience in Kafka and Spark is also good to have. (Budget: $15 - $25 USD, Jobs: Azure, Hadoop, Hive)
Azure CLOUD SPECIALIST - Stefanini - Genk
We are looking for Azure specialists who in the short term can help set up Azure environments and, at the same time, can assist with a final design and prepare...
From Indeed - Thu, 09 Aug 2018 13:45:34 GMT - View all jobs in Genk
Netcool Consultant - Remote
NY-New York. Job Description: Remote project. IMPORTANT: A 6-month right-to-hire will go into the work order; OK to use subvendors. RESPONSIBILITIES: Onsite lead for development team. REQUIRED SKILLS: ASP.Net MVC 5; Azure DevOps, CI/CD; GIT repo; EF: Code First Migration; JavaScript/jQuery; Azure Application Insights/WebApps/Stream Analytics; SQL Server
Microsoft announces Windows Virtual Desktop on Azure!
Microsoft announces Windows Virtual Desktop on Azure! During Ignite 2018 last week, Microsoft revealed Windows Virtual Desktop, the best virtualized Windows and Office experience, delivered on Azure. Windows Virtual Desktop is the only cloud-based service that offers a multi-user Windows 10 experience, optimized […]
Consultant - Azure - Microsoft - Phoenix, AZ
You will engage with senior-level technical and business decision makers, focusing on empowering our customers to successfully pursue their digital...
From Microsoft - Tue, 25 Sep 2018 04:45:00 GMT - View all Phoenix, AZ jobs
Arizona Welcomes a New Community of Desert Dwellings
The long-awaited Azure Paradise Valley will officially open on October 20.
Consultant - IT Systems Engineer - Valencia IIP Advisors - Ottawa, ON
The Microsoft O365 and Azure Engineer position is a critical role within Valencia for the Service Integration team, providing Tier 2/3 support and Consulting to...
From Indeed - Tue, 09 Oct 2018 20:43:12 GMT - View all Ottawa, ON jobs
New Zealand Electronic Card Retail Sales (YoY): 5.7% (September) vs previous 6.3%

New Zealand Electronic Card Retail Sales (MoM) came in at 1.1%, above forecasts (0.6%) in September

S&P500 Technical Analysis: Stocks consolidating above 2,877.00 support
  • The S&P500 is trading in a bull trend. 
  • The S&P500 broke below the bull trendline and the 200-period simple moving average on the 4-hour chart but is being supported above 2,877.00 (January swing high).
  • As long as the market holds above the support, bulls will try to create a new bull leg towards the 2,900.00 figure and 2,917.00 (August 29 high). 

S&P500 4-hour chart

Spot rate:                  2,889.00
Relative change:      -0.04%     
High:                         2,891.50
Low:                          2,863.00

Main trend:               Bullish

Resistance 1:           2,900.00 figure
Resistance 2:           2,917.00 August 29 high
Resistance 3:           2,939.50 all-time high
Resistance 4:           2,950.00, 161.8% Fibonacci extension (Aug-Sept high/low)
Resistance 5:           3,000.00 round figure

Support 1:                2,877.00 January swing high
Support 2:                2,863.75 August 7 high
Support 3:                2,853.00 August 9 low


US Dollar Index Technical Analysis: DXY soon on life support if bulls don’t break above 96.00 figure
  • The US Dollar Index is trading in a bull trend above the 50, 100 and 200-period simple moving averages, although the SMAs are starting to flatten, which is a sign of bullish weakness. 
  • DXY bulls are having a hard time conclusively breaking above the 95.65-96.00 zone. The long tails on top of the last bars are another sign of bullish exhaustion. The Stochastic indicator is already in overbought territory while the RSI is slowly weakening. 
  • All in all, bulls will most likely need to step in and bring the market well above 95.65-96.00, or else bears will take the lead and try to drive it down towards the 95.00 figure. 

DXY daily chart

Spot rate:                 95.74
Relative change:     -0.07%
High:                        95.68
Low:                         95.64

Trend:                     Bullish

Resistance 1:         95.65 July 19 high (key level)
Resistance 2:         96.41 August 20 high
Resistance 3:         97.00 current 2018 high


Support 1:               95.52 August 6 high
Support 2:               95.24 July 13 high
Support 3:               95.00 figure
Support 4:               94.91 July 27 high 
 


Crude Oil WTI Technical Analysis: Black Gold grinds higher, must surpass $75.19 a barrel for further advances
  • Crude oil is trading in a bull trend as the market is evolving above the 50, 100 and 200-period simple moving average.
  • Crude oil is grinding higher but should ideally break 75.19 (October 5 high) and $76.00 a barrel to prevent the head-and-shoulders pattern from coming into play. The RSI is above 50 while the Stochastic indicator is already in overbought territory. 
  • Failure to break above 75.19 can lead to a rotation back down to 74.00.   

Crude oil WTI 4-hour chart

Rate:                         74.84
Relative change:       0.88%     
High:                         75.25
Low:                          74.03

Main Trend:               Bullish

Resistance 1:           75.19 October 5 high
Resistance 2:           75.88 intraday swing high
Resistance 3:           76.00 figure
Resistance 4:           77.00 figure
Resistance 5:           77.83 November 21, 2014 high
Resistance 6:           80.00 round figure

Support 1:                74.00 figure 
Support 2:                73.00 figure
Support 3:                72.00 figure
Support 4:                71.45 September 26 low
Support 5:                70.53 May 24 low


AUD/USD Technical Analysis: Aussie bulls back in the game looking at 0.7200 figure
  • AUD/USD is trading in a bear trend as it is evolving below its 50, 100 and 200-period simple moving average on the 4-hour chart. 
  • AUD/USD found support near 0.7050 after setting a lower low below the September low. The RSI and Stochastic indicators are trading above the 50 line while the MACD is turning bullish. While the main trend is bearish, there is room for a bullish leg to the upside.
  • Upside targets can be located near 0.7144 (September 5 low) and the 0.7200 figure. A daily close below 0.7041 (October low) would likely invalidate the current bullish bias.

AUD/USD 4-hour chart

Spot rate:                 0.7097
Relative change:      0.29%     
High:                        0.7103
Low:                         0.7054

Main trend:              Bearish
Short-term trend:     Bullish

Resistance 1:          0.7085 September 11 low
Resistance 2:          0.7144 September 5 low
Resistance 3:          0.7200 figure, August 15 low

Support 1:               0.7041 October low
Support 2:               0.7000 figure
Support 3:               0.6830 January 15, 2016 low 


GBP/USD Technical Analysis: Cable shorts getting squeezed as bulls break above 1.3100 and now target 1.3200 figure
  • GBP/USD is trading in a bull trend as it is evolving above its 50, 100 and 200-period simple moving averages.
  • GBP/USD broke above a bull flag (blue lines) and has now cleared the 1.3100 figure. The picture remains bullish as long as the market trades above the 1.2957-1.3000 zone. 
  • Bulls are now closing in on the 1.3200 target, and 1.3300 will become the next challenge for buyers.

GBP/USD 4-hour chart

Spot rate:                         1.3138
Relative change:              0.38%     
High:                                1.3149
Low:                                 1.3033

Main trend:                      Bullish

Resistance 1:                  1.3150 September 21 low
Resistance 2:                  1.3200 figure
Resistance 3:                  1.3300 figure

Support 1:                      1.3100 figure
Support 2:                      1.3050 August 30 swing high, key level
Support 3:                      1.3028 October 8 low
Support 4:                      1.3000 figure 
Support 5:                      1.2957 July 19 swing low  
Support 6:                      1.2900 figure 


United States 52-Week Bill auction climbed from previous 2.465% to 2.58%

United States 4-Week Bill Auction up to 2.135% from previous 2.105%

USD/CHF Technical Analysis: A pullback below 0.9950 may be imminent
  • USD/CHF is trading in a bull trend as the 50 and 100-period simple moving averages (SMAs) are pointing upward. The 100 SMA crossed above the 200 SMA, a bullish clue known as a golden cross. 
  • USD/CHF is finding resistance below the 0.9950 level as the MACD is bearish. 
  • In the absence of a breakout above 0.9950, USD/CHF is poised to trade sideways to down towards 0.9891 (October 4 low) and 0.9868 (July 31 low).

USD/CHF 4-hour chart

Spot rate:                       0.9923
Relative change:            0.04%     
High:                              0.9955
Low:                               0.9921

Main trend:                    Bullish

Resistance 1:                0.9950 figure
Resistance 2:                1.0000 parity level
Resistance 3:                1.0068 July 13 high

Support 1:                     0.9891 October 4 low
Support 2:                     0.9868 July 31 low
Support 3:                     0.9820 August 25 low
Support 4:                     0.9807 August 22 low 
Support 5:                     0.9788 June 7 swing low (key level)
Support 6:                     0.9768 September 4 swing high


Technical Specialist - Brillio - Redmond, WA
Strong experience in the Azure ecosystem such as HDInsight, Azure Data Factory, Azure Data Lake and SQL DW. The job description is as follows:....
From Brillio - Tue, 25 Sep 2018 23:59:03 GMT - View all Redmond, WA jobs
MS BI Developer - Advantine Technologies - Redmond, WA
Experience in the Azure ecosystem such as HDInsight, Azure Data Factory, Azure Data Lake and SQL DW is optional, but not mandatory. Qualifications....
From Advantine Technologies - Wed, 19 Sep 2018 17:21:18 GMT - View all Redmond, WA jobs
The cross-sectional association between chemerin and bone health in peri/pre and postmenopausal women: results from the EPIC-Potsdam study
Objective: Recent in vitro data suggested that the novel adipokine chemerin may influence bone health. However, only limited evidence of the relationship between chemerin and bone health in humans is available. Therefore, this study aimed to investigate the association between chemerin and broadband ultrasound attenuation (BUA) in peri/premenopausal and postmenopausal women. Methods: Data from the German population-based European Prospective Investigation into Cancer and Nutrition-Potsdam cohort comprising 404 peri/premenopausal and 279 postmenopausal women were analyzed. Multivariable-adjusted analysis of covariance including age, body mass index, waist circumference, smoking status, education, physical activity, alcohol consumption, and hormone use was used to investigate potential relationships between the adipokine and BUA levels in peri/premenopausal and postmenopausal women, respectively. Results: The concentrations of chemerin were lower in peri/premenopausal women (median 118.0 ng/mL, interquartile range [IQR] 99.2-135.0), compared with postmenopausal women (median 140.0 ng/mL, IQR 121.0-167.0). In peri/premenopausal women chemerin was inversely associated with BUA levels; after multivariable adjustment, a 10% increase in the chemerin concentration was significantly associated with 0.83 dB/MHz lower BUA levels (P = 0.0006). In postmenopausal women chemerin was not related to BUA levels (P = 0.8). Conclusion: The present study provides evidence for an inverse association between chemerin and BUA in peri/premenopausal women. Therefore, the study suggests that high chemerin concentrations may minimize peak bone mass and thereby may promote age-related bone loss. Further studies are needed to investigate the role of chemerin in bone homeostasis in peri/premenopausal and postmenopausal women.
Epicardial Surgical Ligation of the Left Atrial Appendage Is Safe, Reproducible, and Effective by Transesophageal Echocardiographic Follow-up
Objective The left atrial appendage (LAA) is the source of 90% of thrombi in patients with atrial fibrillation. Our double LAA ligation (LLAA) technique was shown to be 96% successful in a small study. However, the outcomes of these patients have yet to be compared with a set of nonligated patients. Methods From 2005 to 2012, a total of 808 patients received LAA ligation using our double ligation technique, with both a polydioxanone (PDS) II endosnare and a running 4-0 Prolene pledgeted suture. The 30-day outcomes of these patients were compared with those of nonligated patients. Fifty-six of the ligated patients underwent postoperative transesophageal echocardiography (TEE). An echocardiographer reviewed the follow-up TEEs for LAA remnant and/or residual flow into the LAA using color Doppler imaging. The patients with LAA flow and/or remnant depth of 1 cm or greater were deemed to have an unsuccessful exclusion. Results The ligated group had a trend of less postoperative atrial fibrillation (19.4% vs 22.9%, P = 0.07) and an overall significantly lower in-hospital mortality (0.7% vs 3.0%, P < 0.001) and lower 30-day mortality (0.7% vs 3.4%, P < 0.0001). The LAA was successfully excluded in 53 (94.7%) of the 56 patients with TEE. Conclusions Double LAA ligation correlates with lower rates of in-hospital and 30-day mortality. This advantage comes without an increase in perioperative complications. This technique can easily be performed off or on pump, is very reproducible, and comes at a very low cost compared with LAA occlusion devices. Stroke has a multifactorial etiology; successful LLAA removes one potential source of thrombi perioperatively and in the long term.
Comment on "I am a deserter. Aren't you?", authored by "The collapse of right-wing media?" - Kultura Liberalna
[…] whatever they want; they have the internet at home – really! – and they watch all kinds of television too," says Robert Mazurek, a print and radio journalist who hosts morning political interviews on the radio … In a conversation with Łukasz Pawłowski he also claims that if the influence of the media on voting decisions […]
Comment on SSMS 18.0 public preview released by Jayendran Arumugam
Hi Dinakar, Always Encrypted is not working in this new version, SSMS 18 Preview 4. I've logged a bug at https://feedback.azure.com/forums/908035-sql-server/suggestions/35664979-always-encryption-with-azurekeyvault-not-working-i Could you please look into this issue? Thanks, Jay
Lot 29 and 29A – Land – Azurém
100000
Plot of land for building a V3-type house, with a total land area of 225.5 m2, a building footprint of 81 m2, a gross construction area of 162 m2, and a dependent area of 0.0 m2. Located in the parish of Azurém (Guimarães/Braga), it is 2 km...
2252 m2 44 EUR/m²
Thu, 11 Jan 2018 13:19:51 -0500
Solution Architect - Data & Analytics - Neudesic LLC - Seattle, WA
Azure(SQL Database, DocumentDB, SQL Data Warehouse, Table Storage, Redis Cache, Database Migration Wizard, HDInsight, Data Factory, Stream Analytics, Data Lake...
From Neudesic LLC - Mon, 02 Jul 2018 10:04:49 GMT - View all Seattle, WA jobs
What are Durable Functions?

Oh no! Not more jargon! What exactly does the term Durable Functions mean? Durable Functions have to do with serverless architectures. They're an extension of Azure Functions that allows you to write stateful executions in a serverless environment (a minimal sketch follows the list below).

Think of it this way. There are a few big benefits that people tend to focus on when they talk about Serverless Functions:

  • They’re cheap
  • They scale with your needs (not necessarily, but that’s the default for many services)
  • They allow you
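
To make "stateful executions" concrete, here is a minimal sketch adapted from the function-chaining pattern in the Durable Functions documentation (it assumes the Microsoft.Azure.WebJobs.Extensions.DurableTask package; all names are illustrative):

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class HelloSequence
{
    // Orchestrator: calls activities in order and keeps their results as state,
    // even if the underlying infrastructure recycles between calls.
    [FunctionName("HelloSequence")]
    public static async Task<List<string>> Run(
        [OrchestrationTrigger] DurableOrchestrationContext context)
    {
        var outputs = new List<string>();
        outputs.Add(await context.CallActivityAsync<string>("SayHello", "Tokyo"));
        outputs.Add(await context.CallActivityAsync<string>("SayHello", "Seattle"));
        outputs.Add(await context.CallActivityAsync<string>("SayHello", "London"));
        return outputs;
    }

    // Activity: the stateless unit of work the orchestrator coordinates.
    [FunctionName("SayHello")]
    public static string SayHello([ActivityTrigger] string name) => $"Hello {name}!";
}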

The post What are Durable Functions? appeared first on CSS-Tricks.


Project xCloud: Microsoft's Game Streaming service

Game streaming is becoming ever more prominent, with Microsoft announcing Project xCloud, which promises console-grade gaming on devices such as smartphones. To address problems such as latency and graphics quality, the company's Azure cloud comes into play. It has already been tested in a closed beta phase, connecting a traditional Xbox controller […]

The post Project xCloud: Microsoft's Game Streaming service appeared first on Busted.gr.


Project xCloud will not lead to the abandonment of traditional game consoles
Yesterday Microsoft unveiled its new service, Project xCloud, which will allow Xbox games to be played on any device, down to Android smartphones. The service will be powered by the capacity of Microsoft Azure cloud servers.

Nevertheless, Microsoft does not plan to leave the market for traditional game consoles. A separate version of the console for running cloud games on a TV, which has been the subject of rumors, may well be released, but the familiar Xbox consoles are not going anywhere. This was stated by Kareem Choudhry, who is responsible for the development and rollout of the Project xCloud service, in an interview with Wired:
Playing in the cloud will not replace the familiar gaming experience. We will be able to keep playing on consoles just as we do now. The cloud will not replace familiar discs and digital versions of games.

November 2012 Chicago IT Architects Group Meeting Recap

Originally posted on: http://tostringtheory.com/archive/2012/11/21/november-2012-chicago-it-architects-group-meeting-recap.aspx

So the year is coming to an end.  A hearty few came out two days before Thanksgiving to discuss adopting agile in the enterprise.  While Norm Murrin claimed to be nervous about talking in front of a group, you wouldn't have known it from his presentation.  He really made a topic that has always been hard to relate to very personal.  This led to some great discussion.  I came away looking for ways to investigate agile further.  His presentation can be found here.

This was our last meeting of the year.  We are looking forward to next year and are starting to line up speakers and topics.  At this point we have an Azure presentation coming in February and are ironing out talks for January and March.  If you would like to join us and have topics you would like to see presented, contact me through this blog.  Either leave a comment here or use the contact page.  I would love to hear from you.

Have a great holiday season and we will see you next year.


Accelerate Azure infrastructure modernization deployments
Do you need assistance to get started on your cloud infrastructure and management technical journey? Look no further! With the two technical webinars listed below, you can expand your technical knowledge of Azure services with guidance from Microsoft experts. You’ll walk away with the technical aptitude needed to have high-value conversations with customers. Introduction to...
Time To Reboot

Originally posted on: http://blog.loethen.net/archive/2018/04/05/time-to-reboot.aspx


It has been nearly six months since I last blogged.  Thankfully this means I have been busy working on client projects.  It is a new year and I am just back from spring break, so I think it is time to start digging into technical topics again.  This post will actually help to do some testing for an Azure Logic App POC that I am working on.  Watch for a future post on this and other Azure topics. Until then …


Azure Functions Visual Studio 2017 Development

Originally posted on: http://blog.loethen.net/archive/2017/08/10/azure-functions-visual-studio-2017-development.aspx


The development tools and processes for Azure Functions are ever changing.  We started out only being able to create a function through the portal, which I did a series on.  We then got a template in VS2015, but it really didn't work very well.  We have since been able to create functions as Web Application libraries, and now we are close to the release of a VS2017 template.

This post will walk through the basics of using the VS2017 Preview with the Visual Studio Tools For Azure Functions which you can download here.

Create New Project

To create the initial solution open up the New Project dialog and find the Azure Function project type, name your project and click OK.

[Screenshot: New Project dialog with the Azure Function project type]

Create New Function

To add a function to your project, right-click the project and select New Item.  In the New Item dialog select Azure Function and provide a name for the class and click Add. 

[Screenshot: Add New Item dialog with Azure Function selected]

The next dialog which will appear is the New Azure Function dialog.  Here you will select the function trigger type and its parameters.  In the example below a timer trigger has been selected and a Cron schedule definition is automatically defined to execute every 5 minutes.

Also in this dialog you can set the name of the function.  When you compile, a folder will be created with that name in your bin directory, which will be used later for deployment.

[Screenshot: New Azure Function dialog with a timer trigger and Cron schedule]

Add Bindings

With each generation of Azure Function development the way you initially define bindings changes (even if they stay the same behind the scenes).  Initially you had to use the portal Integrate page.  This had its advantages.  It would visually prompt you for the type of binding and the parameters for that binding.

With the Visual Studio template you have to add attributes to the Run method of your function class.  This requires that you know what the attribute names are, what parameters are available, and their proper values.  You can find a list of the main binding attributes here.

At compile time the attributes will be used to generate a function.json file with your trigger and bindings definition.
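
For a sense of what those attributes look like, here is a minimal sketch of a timer-triggered function with a queue output binding (the function and queue names are illustrative):

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class HeartbeatFunction
{
    [FunctionName("HeartbeatFunction")]
    public static void Run(
        [TimerTrigger("0 */5 * * * *")] TimerInfo myTimer,   // trigger: fires every 5 minutes
        [Queue("heartbeat-items")] out string queueMessage,  // output binding: illustrative queue name
        TraceWriter log)
    {
        log.Info($"Heartbeat fired at {DateTime.UtcNow:o}");
        queueMessage = DateTime.UtcNow.ToString("o");
    }
}

At build time these attributes produce the same trigger and binding entries you would otherwise have defined on the portal's Integrate page.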

Add NuGet Packages

If you are building functions in the portal you have to create a project.json file that defines the packages you want to include.  This requires that you know the format of the file.  Thankfully, with the Visual Studio template you can use the normal NuGet Package Manager.

Deploying

There are a couple of ways to deploy your solution.  In the end a Function App is a specialized App Service.  This means you have the same deployment options: Visual Studio, PowerShell, or VSTS continuous deployment.  The main difference is that you don't have a web.config file and have to manage your app settings and connection strings through the portal.  These can be reached by following the Application Settings link under the Configured Features section of the Function App Overview page.

[Screenshot: Application Settings link on the Function App Overview page]

Summary

While creating Azure Functions still isn't a WYSIWYG turn-key process, the latest incarnation gives us an ALM-capable solution.  I believe this is the development approach that will stabilize for the foreseeable future, and anyone who is creating Functions should invest in learning it.


Query Application Insights REST API To Create Custom Notifications

Originally posted on: http://blog.loethen.net/archive/2017/08/04/query-application-insights-rest-api-to-create-custom-notifications.aspx


Application Insights is one of those tools that has been around for a number of years now but is finally being understood as more companies move to Azure as a cloud solution.  It has become an amazing tool for monitoring the performance of your application, but it can also work as a general logging platform, as I have posted before.

Now that you are capturing all this information, how can you leverage it?  Going to the Azure portal whenever you want an answer is time consuming.  It would be great if you could automate this process.  Of course there are a number of metrics that you can create alerts for directly via the portal, but what if you want a non-standard metric or want to do something besides just send an alert?

Fortunately Microsoft has a REST API in beta for Application Insights.  It allows you to check standard metrics as well as run custom queries as you do in the Analytics portal.  Let’s explore how to use this API.

In this post I will show how to create a demo that implements an Azure Function which calls the Application Insights REST API and then sends the results out using SendGrid.  I created it with the VS2017 Preview and the new Azure Functions templates.

Generate Custom Events

First we need some data to work with.  The simplest way is to leverage the TrackEvent and TrackException methods of the Application Insights API.  In order to do this you first need to set up a TelemetryClient.  The code below is part of my class-level variables.

        private static string appInsightsKey = System.Environment.GetEnvironmentVariable("AppInsightKey", EnvironmentVariableTarget.Process);
        private static TelemetryClient telemetry = new TelemetryClient();
        private static string key = TelemetryConfiguration.Active.InstrumentationKey = appInsightsKey; //System.Environment.GetEnvironmentVariable("AN:InsightKey", EnvironmentVariableTarget.Process);

After that it is simple to call the TrackEvent method on the TelemetryClient object to log an activity in your code (be aware it may take 5 minutes for an event to show up in Application Insights).

            telemetry.TrackEvent($"This is a POC event");

Create a VS2017 Function Application

I will have another post on the details in the future, but if you have Visual Studio 2017 Preview 15.3.0 installed you will be able to create an Azure Functions project.

[Screenshot: New Project dialog with the Azure Functions project type]

Right-click the project, select the New Item context menu option, and select Azure Function as shown below.

[Screenshot: Add New Item dialog]

On the New Azure Function dialog select TimerTrigger and leave the remaining options as default.

[Screenshot: New Azure Function dialog with TimerTrigger selected]

Call Application Insights REST API

Once there are events in the customEvents collection we can write a query and execute it against the Application Insights REST API.  To accomplish this the example uses a simple HttpClient call.  The API page for Application Insights can be found here and contains the URLs and formats for each call type.  We will be using the Query API scenario, which will be set up with a couple of variables.

        private const string URL = "https://api.applicationinsights.io/beta/apps/{0}/query?query={1}";
        private const string query = "customEvents | where timestamp >= ago(20m) and name contains \"This is a POC event\" | count";

The call to the service is a common pattern using the HttpClient as shown below.  Add this to the Run method of your new function.

            HttpClient client = new HttpClient();
            client.DefaultRequestHeaders.Accept.Add(
                new MediaTypeWithQualityHeaderValue("application/json"));
            client.DefaultRequestHeaders.Add("x-api-key", appInsightsApiKey);
            var req = string.Format(URL, appInsightsId, query);
            HttpResponseMessage response = client.GetAsync(req).Result;

Process Results

After we have a result we can deserialize the JSON using JSON.NET and send it to our support team via SendGrid.  You will have to add the NuGet package Microsoft.Azure.WebJobs.Extensions.SendGrid.

Modify the signature of your function’s Run method to match the code sample shown here.  In this example “message” is defined as an output variable for the Azure Function which is defined as a binding by using the SendGrid attribute. 

        public static void Run([TimerTrigger("0 */15 * * * *")]TimerInfo myTimer, TraceWriter log, [SendGrid(ApiKey = "SendGridApiKey")]out Mail message)

We will also need a structure to deserialize the returned JSON message into. If you look at the message itself it can appear rather daunting but it breaks down into the following class structure.  Create a new class file and replace the default class with this code.

    public class Column
    {
        public string ColumnName { get; set; }
        public string DataType { get; set; }
        public string ColumnType { get; set; }
    }

    public class Table
    {
        public string TableName { get; set; }
        public List<Column> Columns { get; set; }
        public List<List<object>> Rows { get; set; }
    }

    public class RootObject
    {
        public List<Table> Tables { get; set; }
    }

The last code example below performs the deserialization and creates the SendGrid email message.  Insert this into the Run method after the HttpClient call we previously added.

                string result = response.Content.ReadAsStringAsync().Result;
                log.Info(result);

                RootObject aiResult = JsonConvert.DeserializeObject<RootObject>(result);

                string countString = aiResult.Tables[0].Rows[0][0].ToString();

                string recipientEmail = System.Environment.GetEnvironmentVariable($"recipient", EnvironmentVariableTarget.Process);
                string senderEmail = System.Environment.GetEnvironmentVariable($"sender", EnvironmentVariableTarget.Process);

                var messageContent = new Content("text/html", $"There were {countString} POC records found");

                message = new Mail(new Email(senderEmail), "App Insights POC", new Email(recipientEmail), messageContent);

Publish your solution to an Azure Function App by downloading the Function App's profile and using the VS2017 project's publish options.  You will also need to define the application settings referred to in the code so that they are appropriate for your environment.  At that point you will be able to observe the results of your efforts.

Summary

This post demonstrates how a small amount of code can give you the ability to leverage Application Insights for more than just out-of-the-box statistics alerts.  This approach is flexible enough to be used to report on types of errors and to monitor whether subsystems are remaining available.  Combining the features within Azure's cloud offerings gives you capabilities that would cost much more in development time and resources if they were done on premises. 

My only real problem with this approach is that I would prefer to access values in the result by name rather than by index, because indexes make the code less readable and more brittle to changes.

Try these examples out and see what other scenarios they apply to in your business.


Logging To Application Insights In Azure Functions

Originally posted on: http://blog.loethen.net/archive/2017/02/16/logging-to-application-insights-in-azure-functions.aspx

In my last post I covered logging in Azure Functions using TraceWriter and log4net.  Both of these work, but Application Insights rolls all your monitoring into one solution, from metrics to tracking messages.  I have also heard a rumor that in the near future this will be an integrated part of Azure Functions.  Given these factors it seems wise to start giving it a closer look.

So how do you take advantage of it right now?  If you go to GitHub there is a sample written by Christopher Anderson, but let me boil this down.  First we need to create an Application Insights instance and grab the instrumentation key.

When I created my Application Insights instance I chose the General application type and the same resource group as my function app.

[Screenshot: creating the Application Insights instance]

Once the instance has been allocated you will need to go into the properties blade.  There you will find a GUID for the Instrumentation Key.  Save this off so that we can use it later.

You then need to add the Microsoft.ApplicationInsights NuGet package by creating a project.json file in your function.  Insert the following code in the new file and save it.  If you have your log window open you will see the package being loaded.

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.ApplicationInsights": "2.1.0"
      }
    }
  }
}

In the sample code readme it says that you need to add a specific app setting, but the most important part is that your code reads from the appropriate setting.  Take the Instrumentation Key that you saved earlier and place it in the app settings.  In my case I used one called InsightKey.  

Next set up your TelemetryClient object like the code here by creating global static variables that can be used throughout your application.  After that we are ready to start tracking our function. 

 private static TelemetryClient telemetry = new TelemetryClient();   
 private static string key = TelemetryConfiguration.Active.InstrumentationKey = System.Environment.GetEnvironmentVariable("InsightKey", EnvironmentVariableTarget.Process);  

To track an event or an exception simply call the appropriate method.  I prefer to encapsulate them in their own methods where I can standardize the usage.  I have added the function name, method name, and context ID from the function execution to make it easier to search for and associate entries.

private static void TrackEvent(string desc, string methodName)
{
    telemetry.TrackEvent($"{FunctionName} - {methodName} - {contextId}: {desc}");
}

private static void TrackException(Exception ex, string desc, string methodName)
{
    Dictionary<string, string> properties = new Dictionary<string, string>() { { "Function", FunctionName }, { "Method", methodName }, { "Description", desc }, { "ContextId", contextId } };
    telemetry.TrackException(ex, properties);
}

Analytics

This isn't an instant-answer type of event store.  There is at least a few minutes' delay between your application logging an event or exception and that entry becoming visible in the Analytics board.

Once you are logging and sending metrics to Application Insights you need to read the results.  From your Application Insights main blade click on the Analytics button at the top of the overview.  It will open a new page that resembles what you see below.

[Screenshot: Analytics home page]

Click the new tab button at the top, next to the Home Page tab.  This will open a query window.  The query language has a structure similar to SQL, but that is about as far as the similarity goes.

The table objects are listed in the left navigation with the fields listed as you expand each table.  Fortunately IntelliSense works pretty well in this tool.  You have what would normally be considered aggregate functions that make life easier.  As you can see below, you can use the contains syntax, which acts like a SQL LIKE comparison.  There are also date range functions like the ago function used below.  I found that these two features can find most things you are looking for.
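
For a rough illustration (the event name is hypothetical), a query combining the contains and ago features might look like this:

customEvents
| where timestamp >= ago(1h) and name contains "POC event"
| summarize count() by bin(timestamp, 5m)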

[Screenshot: Analytics query window with a customEvents query]

Summary

This post didn't cover a lot of the native functionality in Application Insights, but hopefully it gives you a starting point for instrumenting your Azure Functions.  The flexibility of this tool, along with the probability of it being built into Functions in the future, makes it a very attractive option.  Spend some time experimenting with it and I think you will find it pays dividends.


Implementing Logging In Azure Functions

Originally posted on: http://blog.loethen.net/archive/2017/02/13/implementing-logging-in-azure-functions.aspx


Logging is essential to the support of any piece of code.  In this post I will cover two approaches to logging in Azure Functions: TraceWriter and log4net.

TraceWriter

The TraceWriter that is available out of the box with Azure Functions is a good starting point.  Unfortunately it is short-lived: a maximum of 1,000 messages are kept, and at most they are held in file form for two days.  That being said, I would not skip using the TraceWriter.

Your function will have a TraceWriter object passed to it in the parameters of the Run method.  You can use the Debug, Error, Fatal, Info and Warn methods to write different types of messages to the log as shown below.

log.Info($"Queue item received: {myQueueItem}");

Once it is in the log you need to be able to find the messages.  The easiest way to find the log files is through Kudu.  You have to drill down from LogFiles –> Application –> Functions –> Function –> <your_function_name>.  At this location you will find a series of .log files if your function has been triggered recently.

[Screenshot: Kudu log file listing]

The other way to look at your logs is through Table Storage via the Microsoft Azure Storage Explorer.  After attaching to your account open the storage account associated with your Function App.  Depending on how you organized your resource groups you can find the storage account by looking at the list of resources in the group that the function belongs to.

Once you drill down to that account look for the tables named AzureWebJobHostLogsyyyymm as you see below.

[Screenshot: AzureWebJobHostLogsyyyymm tables in Storage Explorer]

Opening these tables will allow you to see the different types of log entries saved by the TraceWriter.  If you filter to the partition key "I" you will see the entries your functions posted.  You can further filter by name and date range to identify specific log entries.

[Screenshot: TraceWriter log entries in table storage]

log4net

If the default TraceWriter isn't robust enough you can implement logging via a framework like log4net.  Unfortunately, because of the architecture of Azure Functions, this isn't as easy as it would be with a normal desktop or web application.  The main stumbling block is the inability to create the custom configuration sections these libraries rely on.  In this section I'll outline a process for getting log4net to work inside your function.

The first thing that we need is the log4net library.  Add the log4net NuGet package by placing the following code in the project.json file.

{
  "frameworks": {
    "net46":{
      "dependencies": {
        "log4net": "2.0.5"
      }
    }
   }
}

To get around the lack of custom configuration sections we will bind a blob file containing your log4net configuration.  Simply take the log4net section of your configuration and save it to a text file.  Upload that to a storage container and bind it to your function using the full storage path.

[Screenshot: blob binding using the full storage path]

Add the references to the log4net library and configure the logger.  Once you have that simply call the appropriate method on the logger and off you go.  A basic sample of the code for configuring and using the logger is listed below.  In this case I am actually using a SQL Server appender.

using System;
using System.Xml;
using log4net;

public static void Run(string input, TraceWriter log, string inputBlob)
{
    log.Info($"Log4NetPoc manually triggered function called with input: {input}");
    log.Info($"{inputBlob}");

    XmlDocument doc = new XmlDocument();
    doc.LoadXml(inputBlob);
    XmlElement element = doc.DocumentElement;

    log4net.Config.XmlConfigurator.Configure(element);

    ILog logger = LogManager.GetLogger("AzureLogger");

    logger.Debug($"Test log message from Azure Function", new Exception("This is a dummy exception"));
   
}

Summary

By no means does this post cover every aspect of these two logging approaches or all possible logging approaches for Azure Functions.  In future posts I will also cover Application Insights.  In any case it is always important to have logging for your application.  Find the tool that works for your team and implement it.


Building Azure Functions: Part 3 – Coding Concerns

Originally posted on: http://blog.loethen.net/archive/2017/02/02/building-azure-functions-part-3-ndash-coding-concerns.aspx


In this third part of my series on Azure Function development I will cover a number of development concepts and concerns.  These are just some of the basics.  You can look for more posts coming in the future that will cover specific topics in more detail.

General Development

One of the first things you will have to get used to is developing in a very stateless manner.  Any other .NET application type has a class at its base.  Functions, on the other hand, are just what they say: a method that runs within its own context.  Because of this you don't have anything resembling a global or class-level variable.  This means that if you need something like a logger in every method, you have to pass it in.

[Update 2016-02-13] The above information is not completely correct.  You can implement function global variables by defining them as private static.

You may find that it makes sense to create classes within your function, either as DTOs or to make the code more manageable.  Start by adding a .csx file in the files view pane of your function.  The same coding techniques and standards apply as in your Run.csx file; otherwise, develop the class as you would any other .NET class.  A small sketch follows the screenshot below.

[Screenshot: adding a .csx class file to the function]
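
As a small sketch (the class, file, and property names are all illustrative), a DTO defined in its own .csx file can be pulled into Run.csx with the #load directive:

// Order.csx - a plain class used as a DTO
public class Order
{
    public string Id { get; set; }
    public decimal Total { get; set; }
}

// Run.csx - reference JSON.NET, load the class file, then use the type normally
#r "Newtonsoft.Json"
#load "Order.csx"

using Newtonsoft.Json;

public static void Run(string myQueueItem, TraceWriter log)
{
    Order order = JsonConvert.DeserializeObject<Order>(myQueueItem);
    log.Info($"Received order {order.Id} for {order.Total}");
}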

In the previous post I showed how to create App Settings.  If you took the time to create them you are going to want to be able to retrieve them.  The GetEnvironmentVariable method of the Environment class gives you the same capability as using AppSettings from ConfigurationManager in traditional .NET applications.

System.Environment.GetEnvironmentVariable("YourSettingKey")

A critical coding practice for functions that use perishable resources such as queues is to make sure that if you catch and log an exception, you rethrow it so that your function fails.  This will cause the queue message to remain on the queue instead of being dequeued.
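
A minimal sketch of that pattern for a queue-triggered function (ProcessMessage is a hypothetical helper):

public static void Run(string myQueueItem, TraceWriter log)
{
    try
    {
        ProcessMessage(myQueueItem); // hypothetical business logic
    }
    catch (Exception ex)
    {
        log.Error($"Failed to process message: {ex.Message}", ex);
        throw; // rethrow so the execution fails and the message stays on the queue for retry
    }
}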

Debugging

[Screenshot: function log output]

It can be hard to read the log when the function is running at full speed, since instances run in parallel but report to the same log.  I would suggest adding the process ID to your TraceWriter logging messages so that you can correlate them.

Even more powerful is the ability to remote debug functions from Visual Studio.  To do this, open your Server Explorer and connect to your Azure subscription.  From there you can drill down to the Function App in App Services and then to the run.csx file in the individual function.  Once you have opened the code file and placed your breakpoints, right-click the function and select Attach Debugger.  From there it acts like any other Visual Studio debugging session.

[Screenshot: Server Explorer with the Attach Debugger option]

Race Conditions

I wanted to place special attention on this subject.  As with any highly parallel/asynchronous processing environment, you will have to make sure that you take into account any race conditions that may occur.  If at all possible, keep the type of functionality that you create to non-related pieces of data.  If it is critical that items in a queue, blob container or table storage are processed in order, then Azure Functions are probably not the right tool for your solution.

Summary

Azure Functions are one of the most powerful units of code available.  Hopefully this series gives you a starting point for your adventure into serverless applications and you can discover how they can benefit your business.


Building Azure Functions: Part 2 – Settings And References

Originally posted on: http://blog.loethen.net/archive/2017/02/01/building-azure-functions-part-2ndashsettings-and-references.aspx


This is the second post in a series on building Azure Functions.  In this post I’ll continue by describing how to add settings to your function and reference different assemblies to give you more capabilities.

Settings

[Screenshot: Function App Settings page]

Functions do not have configuration files, so you must add app settings and connection strings through the settings page.  The settings are maintained at the Function App level, not per individual function.  While this allows you to share common configuration values, it means that if your functions need different values for the same configuration settings, each function will have to live in a separate Function App.

To get to them, go to the Function App Settings link at the lower left of your Function App's main page and then click the Configure App Settings button, which will bring you to the blade shown below.  At that point it works the same as any .NET configuration file.

[Screenshot: Configure App Settings blade]

At some point I would like to see the capability of importing and exporting settings, since maintaining them individually by hand leads to human error and less reliable application lifecycle management.

Another drawback to the Azure Functions development environment is that, at the time of this post, you don't have the ability to leverage custom configuration sections.  The main place I have found this to cause heartburn is with logging libraries such as log4net, where the most common scenario is to use a custom configuration section to define adapters and loggers.

Referencing Assemblies And Nuget

No .NET application is very useful if you can't reference all of the .NET Framework as well as third-party and your own custom assemblies.  There is no add-references menu for Azure Functions, and there are multiple ways to add references.  Let's take a look at each.

There are a number of .NET assemblies that are automatically referenced for your Function application.  There is a second group of assemblies that are available but need to be specifically referenced.  For a partial list, consult the Azure Function documentation here.  You can also load your own custom assemblies or bring in NuGet packages. 

In order to load NuGet packages you need to create a project.json file.  Do this by clicking the View Files link in the upper right corner of the editor blade and then the Add link below the file list pane. 

project.json files require the same information that is contained in a packages.config file, but it is formatted as JSON, as shown in the example below.  Once you save this file and reference the assembly in your Run.csx file, Azure will load the designated packages.

[Screenshot: example project.json file]

If you have custom libraries that you want to leverage you will need to add a bin folder to your function.  The easiest way I have found to do this is to open the App Service Editor from the Function App Settings page.  This will open up what is essentially Visual Studio Code in a browser.  Navigate the file tree to your function under wwwroot.  Right click your function name and select New Folder.  The folder must be named “bin”.  You can then right click the bin folder and upload your custom assemblies.

Once you have an assembly available you need to reference it using the "#r" directive as shown below.  You will notice that native assemblies and NuGet-loaded libraries do not need the dll extension specified, but it must be added for custom assemblies.

#r "System.Xml"
#r "System.Xml.Linq"
#r "System.Data.Entity"
#r "My.Custom.Data.dll"
#r "My.Custom.Domain.dll"
#r "Newtonsoft.Json"
#r "Microsoft.Azure.Documents.Client"
#r "Microsoft.WindowsAzure.Storage"

Now we are ready to declare our normal using statements and get down to the real business of functions.
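For example, the directives above might be paired with using statements like these (the custom namespace is an assumption matching the custom assemblies referenced earlier):

using System.Xml.Linq;
using Newtonsoft.Json;
using Microsoft.WindowsAzure.Storage;
using My.Custom.Data;  // hypothetical namespace from the custom assembly above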

Summary

After this post we have our trigger, bindings, settings and dependent assemblies.  This still isn’t enough for a useful function.  In the next post I will cover coding and debugging concerns to complete the story.


          Building Azure Functions: Part 1–Creating and Binding      Cache   Translate Page      

Originally posted on: http://blog.loethen.net/archive/2017/01/31/building-azure-functions-part-1ndashcreating-and-binding.aspx


The latest buzzword is serverless applications.  Azure Functions are Microsoft’s offering in this space.  As with most products that are new to the cloud, Azure Functions are still evolving and can therefore be challenging to develop.  Documentation is still being worked on at the time I am writing this, so here are some things I have learned while implementing them.

There is a lot to cover here so I am going to break this topic into a few posts:

  1. Creating and Binding
  2. Settings and References
  3. Coding Concerns

Creating A New Function

The first thing you are going to need to do is create a Function App.  This is an App Service product that serves as a container for your individual functions.  The easiest way I’ve found to start is to go to the main add (+) button on the Azure Portal and then do a search for Function App.


Click on Function App and then the Create button when the Function App blade comes up.  Fill in your app name, remembering that this is a container and not your actual function.  As with other Azure features you need to supply a subscription, resource group and location.  Additionally, for a Function App you need to supply a hosting plan and storage account.  If you want to take full advantage of Function App scaling and pricing, leave the default Consumption Plan.  This way you only pay for what you use.  If you choose App Service Plan, you will pay for it whether your function is actually processing or not.


Once you click Create the Function App will start to deploy.  At this point you can create your first function in the Function App.  Once you find your Function App in the list of App Services, it will open the main blade.  It offers a quick start page, but I quickly found that it didn’t give me the options I needed beyond a simple “Hello World” function.  Instead, press the New Function link at the left.  You will be offered a list of trigger-based templates, which I will cover in the next section.


Triggers


Triggers define the event source that will cause your function to be executed.  While there are many different triggers, with more being added every day, the most common ones are included under the core scenarios.  In my experience the most useful are timer-, queue-, and blob-triggered functions.

Queues and blobs require that a connection to a storage account be defined.  Fortunately this is created with a couple of clicks and can be shared between triggers and bindings, as well as between functions.  Once you have that, you simply enter the name of the queue or blob container and you are off to the races.

When it comes to timer-dependent functions, the main topic you will have to become familiar with is cron scheduling expressions.  If you come from a Unix background or have been working with more recent timer-based WebJobs, this won’t be anything new.  Otherwise, the simplest way to remember it is that each time increment is defined by a division statement.

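The original post illustrated the format with an image; in short, Azure Functions timer triggers use a six-field CRON expression of the form {second} {minute} {hour} {day} {month} {day-of-week}.  A couple of illustrative schedules:

"schedule": "0 */5 * * * *"    // every five minutes
"schedule": "0 0 9 * * 1-5"    // 9:00 AM every weekday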

In the case of queue triggers the parameter that is automatically added to the Run method signature will be the contents of the queue message as a string.  Similarly most trigger types have a parameter that passes values from the triggering event.
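As a minimal sketch of what the queue template generates (the parameter name simply has to match the name declared in the trigger binding):

public static void Run(string myQueueItem, TraceWriter log)
{
    // myQueueItem holds the raw text of the dequeued message
    log.Info($"Processing queue message: {myQueueItem}");
}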

Input and Output Bindings


Some of the function templates include an output binding.  If none of these fit your needs, or you just prefer to have full control, you can add a binding via the Integration tab.  The input and output binding definitions end up in the same function.json file as the trigger bindings.

The one gripe I have with these bindings is that they connect to a specific entity at the beginning of your function.  I would find it preferable to bind to the parent container of whatever source you are binding to and have a set of standard commands available for normal CRUD operations.

Let’s say that you want to load an external configuration file from blob storage when your function starts.  The path specifies the container and the blob name.  The default format shows a variable “name” as the blob name.  This needs to be a variable that is available and populated when the function starts, or an exception will be thrown.  As for your storage account, specify it by clicking the “new” link next to the dropdown and picking the storage account from those you have available.  If you specified a storage account while defining your trigger and it is the same as your binding, it can be reused.


The convenient thing about blob bindings is that they are bound as strings and so for most scenarios you don’t have to do anything else to leverage them in your function.  You will have to add a string parameter to the function’s Run method that matches the name in the blob parameter name text box.
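Putting that together, the function.json entry for such a blob input binding would look roughly like the sketch below; the names are illustrative, and {name} must be populated by the trigger as noted above:

{
  "name": "configFile",
  "type": "blob",
  "direction": "in",
  "path": "config/{name}",
  "connection": "MyStorageConnection"
}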

Summary

That should give you a starting point for getting the shell of your Azure Function created.  In the next two posts I will add settings, assembly references and some tips for coding your function.


          Cloud Battles: Azure vs AWS–The Video      Cache   Translate Page      

Originally posted on: http://blog.loethen.net/archive/2016/06/29/cloud-battles-azure-vs-awsndashthe-video.aspx

Earlier this month Norm Murrin and I gave a talk at the Chicago Coder Conference.  We learned a lot about how the offerings of each company compare during our preparation.  In the end we came to the conclusion that there is no clear winner, except those of us who are leveraging the resources.  Check out this video posted by the conference to get the blow-by-blow details.


          Application Integration: Azure Functions Vs WebJobs      Cache   Translate Page      

Originally posted on: http://blog.loethen.net/archive/2016/06/02/application-integration-azure-functions-vs-webjobs.aspx


[Updated]

UI development gets all the attention, but application integration is where the real work is done.  When it comes to application integration in the Azure ecosystem, you had better learn how Functions and WebJobs are developed and under what conditions you should use each.  In this post I will try to answer those questions.

For me it is important that a solution is reasonably maintainable, deployable through environments and can be easily managed under source control.

Both products are built on the same code base and share the same base API.  From that perspective they are closely matched.  Functions do have the advantage of handling webhooks, as opposed to just the timer and storage events available to WebJobs.

There is another difference that I haven’t been able to prove yet, but I’ve seen mentioned in a couple of places.  It seems that Functions may take time to warm up since they aren’t always instantiated.  Since WebJobs are always running, they would not incur this startup cost.  If immediate processing is important, then WebJobs may be the more appropriate option for you.

When it comes to actual development I prefer to have the resources of Visual Studio to write and manage source code as well as package my deliverables for deployment.  As of this writing I have not been able to find a Visual Studio project type for Functions.  This means you edit the code through a web browser.  This in-portal editor does allow you to integrate with Git or VSTS for source control.  I would expect that at some point in the future we will get a Functions project type.

Both WebJobs and Functions can be written using C#/VB.NET and Node.js.  From the language availability perspective they are even.

Summary

So what is the real separating line between using one or the other?  From what I have experienced so far, if you need webhooks then Functions are the right choice.  If you don’t need webhooks and maintainability is your priority, then WebJobs are the way to go.  I’m sure there are more reasons, but these are the most obvious in the early days of Functions.  As the products evolve I’ll post updates.

[Update]

Christopher Anderson (@crandycodes) from the Azure team replied via Twitter with the following:

You hit on some key points like lack of tooling/VS integration. We plan on addressing those before GA.
I think the major point missing is the dynamic scale functionality, pay per use. Functions scale automatically and don't cost a VM.
Also, if you run Functions in dedicated with always on, there is no cold start issues, but you pay per VM at that point.
WebJobs vs Functions is really: "Do I want to manage my own custom service?" Yes: WebJobs, No: Functions. Otherwise, similar power.


          Filas from our kennel      Cache   Translate Page      
Females: Abaluca Of Fazenda dos Amigos da Vida, Abia Astelle Of Fazenda dos Amigos da Vida, Avalanche Of Fazenda dos Amigos da Vida, Dolores Ze Sakaliho dvora, Dulcinea Ze Sakaliho dvora, Endira Adriana Ze Sakaliho dvora, Azure (aka Heinzi), Lora Straznicky raj, Ursella and Uma Of Fazenda dos Amigos da Vida, Uriah Of Fazenda dos Amigos da Vida, Baldusa Of Fazenda dos Amigos da Vida
          Azure      Cache   Translate Page      

kiara.marino111 posted a photo:

Azure


          Comment on Use Hybrid Connections to Incrementally Migrate Applications to the Cloud by Marek      Cache   Translate Page      
In our usage scenario we are using a web app with hybrid connections to provide reverse proxy functionality, accessing web servers on remote devices behind firewalls. The issues we are having are:

  • Total lack of a programmatic way to provision new hybrid connections
  • Limitation of max 200 devices even in the most expensive plan
  • Slow connections to geographical regions further away, like Asia

Will you be addressing the above issues in the near future? Can you propose any alternative solution, within or outside Azure, that could provide a similar usage scenario without these issues?
          Comment on Use Hybrid Connections to Incrementally Migrate Applications to the Cloud by Andrew B Hall - MSFT      Cache   Translate Page      
Thank you for the extremely well thought out and written comments here. I'll follow up with the Hybrid Connection team and pass along the feedback and questions around guidance. A few things to note:

  1. Agreed, for many organizations developers will not have direct privileges to set this up themselves.
  2. You are correct, there are limits on the number of Hybrid Connections, which increase with higher plans (see https://docs.microsoft.com/en-us/azure/app-service/app-service-hybrid-connections#hybrid-connections-and-app-service-plans). The Basic plan has a limit of 5 hybrid connections, but deployment slots require at least a Standard App Service Plan (https://azure.microsoft.com/en-us/pricing/details/app-service/plans/), which supports up to 25 hybrid connections (still may not be enough, but is significantly more than 5).
  3. Hybrid Connections are ultimately associated with a Relay resource, so you can share a Hybrid Connection between different app services if they have access to the Relay. I will pass along the feedback about guidance.
  4. Thanks for the feedback, I will pass this along.
  5. Fair point, checking the status does require having the appropriate level of access.
          Introducing ‘Suggest a Feature’ in Developer Community      Cache   Translate Page      

Customer feedback is a critical input to help us improve Visual Studio. Up until two years ago, the Visual Studio customer feedback system left room for improvement – customers could use the “send a smile” feature in Visual Studio, but this would result in only coarse-grained feedback such as “I like this” or “I don’t like this.” The feedback we got through this UI went into a database our team accessed, but there was no easy way for customers to see the feedback other customers were giving so they could say, “I have that problem too!” More than that, the back-end system that gathered feedback was separate from the engineering systems we use for tracking bugs and features, crash reports, hang reports, and telemetry. Without a unified system on the backend, identifying the most impactful issues could be an error-prone job.

In Visual Studio 2017, we introduced a new system for reporting issues. From within Visual Studio, users can Report a Problem; the issue is tracked in a public system at https://developercommunity.visualstudio.com, and the public issue is integrated directly into our engineering system. Thus, a bug on Developer Community is a bug in our Azure DevOps work item database, and votes from the community would give items additional weight.

Since introducing this system, we have received over 70,000 issues from more than 24,000 customers. We have resolved over 13,000 of them, and in over 5,000 instances customers have unblocked themselves quickly using the solutions and workarounds contributed by Microsoft and the larger community of developers.

Until now, we’ve focused the system on issue tracking, but that’s left a gap in our understanding of what customers want from us and developers need: the ability to ask for features. Today, we’re announcing that the same system that we use for reporting issues will work for feature requests in one convenient place.  If you have an idea or a request for a feature, you can now use the new Suggest a Feature button (shown below) on Developer Community and make your suggestions.

You can also browse suggestions from other developers and Vote for your favorite features to help us understand the impact to the community.

If you are wondering how feature suggestions are considered for inclusion on our product roadmap, please see our suggestions guide that clarifies the states used to track progress and what each state represents in the journey.

As part of this work, we will transition from UserVoice as the primary means for our customers to request features. The reason we’re moving from UserVoice is very similar to the reason we moved away from the old “send a smile” system: we want customer feature requests to be directly integrated into our core engineering system so we have complete line of sight into the feedback customers are giving us.

Moving off UserVoice is a big job: it tracks 32,000 feature requests and more than 24,000 points of feedback. In unifying our system under Developer Community, this massive number of requests will benefit from a strong system to help us analyze and track them.

Over the coming months, we’ll be migrating many existing UserVoice items, starting with a few hundred, although items with relatively low vote counts and items that are exceptionally old may not move over. Please see these frequently asked questions for more details on how UserVoice votes and comments are being handled.  I recognize that this may cause some frustration, and I apologize in advance.  I’d simply ask you to sign into Visual Studio or go on the web and submit the item again.

Thank you!

Suggest a Feature on Developer Community is aimed at providing a single convenient place for all your Visual Studio feedback, improving your engagement with product teams, and giving you a greater impact on the Visual Studio family of products.  We are looking forward to hearing your suggestions.  To reiterate in closing, please visit Developer Community and check out the new experience.  We also encourage you to learn more about suggestions to get the best out of them.  Thank you for the valuable feedback you provide in Visual Studio and for your participation in Developer Community!

John Montgomery, Director of Program Management for Visual Studio
@JohnMont

John is responsible for product design and customer success for all of Visual Studio, C++, C#, VB, JavaScript, and .NET. John has been at Microsoft for 17 years, working in developer technologies the whole time.


          Improved governance experience with Ethereum Proof-of-Authority 1.2      Cache   Translate Page      

Since launching Ethereum Proof-of-Authority we've received great feedback and have learned more about the ways our customers have leveraged this solution to roll out their Blockchain applications. We’ve rolled out a number of features that improve user experience, configuration, and deployment reliability.

Governance DApp

This update comes with a new governance experience that makes consortium management more intuitive.

The Governance DApp is used for admin management and validator delegation. Each admin can select a set of validators which will propose blocks within PoA consensus. Admins also have the power to vote either to add or remove other admins. This form of on-chain governance helps decentralize the power of network operation and provides a familiar mechanism to maintaining a healthy network over time.


Please note that this new UI will not be compatible with previously deployed Proof-of-Authority (PoA) networks.

WebSocket support

We’ve added WebSocket support to make it easy to subscribe to events directly or connect to external tools and applications such as BlockScout, an open-source block explorer. You can locate the WebSocket endpoint as part of the deployment output or post-deployment email.


BlockScout block explorer

We have also put together a new deployment guide with instructions on how to set up BlockScout with a new Proof-of-Authority deployment. BlockScout gives you a transparent view into the blockchain. You can easily search by transaction, user address, contract address, and block number.


Just-In-Time (JIT) VM Access and Azure Backup Support

With production readiness in mind, we’ve enabled support for JIT VM access and Azure Backup Support. JIT VM Access allows you to reduce the potential for attacks by tightly controlling how members within your organization procure access to the VM. Azure Backup provides the ability to create scheduled backups of your VM hard drives. This presents an easy way to handle disaster recovery and prevent loss of critical on-chain data.

VM SKU selection

We’ve performed extensive performance testing on the network and have tuned the VM selection to provide clearer options and documentation, to make it more intuitive when selecting the right VM SKU. Explore the tool that we’ve used for performance benchmarking.

More configuration options

Before deployment, you can now specify the starting block gas limit and block reseal time. The block gas limit influences the size of each block, while the block reseal time controls how frequently blocks are generated when there are no transactions. A high block reseal time will decrease the disk consumption rate but will affect block finality in networks that have sparse transaction throughput.


Improved reliability

The ARM template will perform additional validation after each deployment to ensure that the network has started up correctly. Additionally, Azure Monitor deployment reliability has been improved by deploying the Azure Monitor components in series.

Give us feedback

You can quickly share your feedback with the team by clicking on the smiley face icon in the Governance DApp. If you face any issues along the way, reach out on our support forum to get unblocked.


          Driving identity security in banking using biometric identification      Cache   Translate Page      

Combining biometric identification with artificial intelligence (AI) enables banks to take a new approach to verifying the digital identity of their prospects and customers. Biometrics is the process by which a person’s unique physical and personal traits are detected and recorded by an electronic device or system as a means of confirming identity. Biometric identifiers are unique to individuals, so they are more reliable in confirming identity than token- and knowledge-based methods, such as identity cards and passwords. Biometric identifiers are often categorized as physiological identifiers related to a person’s physicality; they include fingerprint recognition, hand geometry, odor/scent, iris scans, DNA, palmprint, and facial recognition.


But how do you ensure the effectiveness of identifying a customer when they are not physically in the presence of the bank employee? As the world of banking continues to go digital, our identity is becoming the key to accessing these services. Regulators require banks to verify that users are who they say they are, not bad actors like fraudsters or known money launderers. And verifying identities online without seeing the person face to face is one of the biggest challenges online and mobile services face today.

It’s problematic because identity documents were created to be verified in person. For example, you can shine an infrared light, you can feel the texture, or you can see if a photo has been stuck on. But with remote verification, you’re just dealing with an image. Without the physical artifact, you only have the human eye to rely on and this makes the task of verification much harder to do quickly, or accurately.

A complicating factor is the online user experience. Users expect a fast, frictionless process in all that they do. If they have to wait, or if the process is too fiddly, they’ll go elsewhere. In the banking sector, almost half of all people who start opening an online bank account drop off due to a bad user experience.

This is where identity assurance is needed for online and mobile onboarding processes. Identity assurance is the ability for the bank to determine, with a high level of certainty, that an electronically provided credential representing a person can be trusted to serve as a proxy for that individual and not someone else. Assurance levels (ALs) are levels of trust associated with a credential as measured by the supporting technology, processes, procedures, policies, and operational practices.

To facilitate the assurance part of the customer onboarding process, the bank must have an innovative identity verification technology (IVT) to ensure that customers provide information associated with the identity of a real person. Physical identity documents like passports and driver's permits are checked for authenticity and compared against government or service databases. Fraudsters continually strive to defeat bank processes to perform account takeovers, system infiltrations, and unauthorized transactions. Fraud detection is tough! In many cases it’s easy to alter the content, images, and verification digits of common identification documents.

To combat these methods, Microsoft has many partners leveraging our Azure Cognitive Services Vision API platform and Azure Machine Learning. One such partner, Onfido, provides a multi-factor identity verification service that helps accurately verify online users, using a cloud-based risk assessment platform that leverages artificial intelligence to automate and scale traditionally human-based fraud expertise and derive identity assurance. The Onfido service validates physical identity documents (document validation), verifies biometric inputs (biometric identity verification), and analyzes information an end user provides about themselves (ID validation). These techniques give companies measurable assurance that the person is who they say they are.

Biometric identity verification

Onfido verifies identities through two-factor verification:

  • Something you have, such as a government issued ID (driver’s license, passport or ID card).
    • Document validation answers this question. Is it authentic?
  • Something you are, such as your facial biometrics.
    • Feature and attribute validation answers these questions. Is there a match, and are they alive?
  • Biometric identity is quite robust: an identity document is the most legally binding proof of identity, while remaining user friendly. The face is the easiest biometric to capture using mobile devices.
    • In the figure below, the first step is to verify that the document is genuine. Onfido has several different algorithms to test for different fraud techniques.
    • The second step is to match the photo on the document with the selfie taken by the customer. Rather than take a static selfie, customers can also choose a video option which asks the user to perform randomised movements, such as turning their head, and voice commands. This prevents deceitful practices, impersonation, and spoofing attempts.


Figure 2

Harnessing the power of Microsoft Azure's Cognitive Services, Onfido helps clients adhere to their due diligence requirements via an effective, compliant and robust digital verification experience.

Together, Microsoft and Onfido deliver an easy onboarding experience for users through a scalable and automated process. The solution addresses compliance needs and reduces fraud costs associated with identity theft. This helps our clients to build trust and integrity within their community.

Want to learn more about combating online and mobile fraud? First, read the Detecting Online and Mobile Fraud with AI use case providing actionable recommendations and solutions. This will provide information on solutions and guidance for you to get started, as well as information on many other partners that also provide identity verification solutions.

Make sure you also check out more Azure partners on the Azure Marketplace. Then, engage with the author on this topic by reaching out to me via LinkedIn and Twitter.


          Making HIPAA and HITRUST compliance easier      Cache   Translate Page      

Many healthcare organizations are starting to adopt artificial intelligence (AI) systems to gain deeper insight into operations, patient care, diagnostic imaging, cost savings and so on. However, it can sometimes be daunting to even know where to get started. Many times, you need a clear lighted path to start your journey and embrace AI and machine learning (ML) capabilities rapidly.


One method is using an Azure Healthcare AI blueprint. It’s a shortcut to using Microsoft Azure at low cost and without deep knowledge of cloud computing. Blueprints include resources such as example code, test data, security, and compliance support. The largest advantage of using a blueprint is explicit advice and clear instructions on keeping your solution in compliance. We’re trying to eliminate the mystery, so you don’t have to research it yourself.

Three core areas where the blueprint can help with compliance are cloud provider and client responsibilities, security threats, and regulatory compliance. These three areas can get overlooked at the beginning of any technology project, yet they are important parts of creating healthcare systems. Applying formal discipline to these areas is made easier by using the blueprint to create an AI/ML experiment installation.

Helpful artifacts

The blueprint includes a script to create an AI/ML system, complete with a sample experiment. It also includes several documents to help system implementers keep their installations secure and compliant. These include worksheets, whitepapers, and spreadsheets that will help you ensure system compliance with healthcare regulations and certifications. The artifacts are easily re-purposed for other healthcare-based systems implemented on Azure.

Clarifying responsibilities

When creating any system on a cloud platform, there are two possible owners for any part of the solution: the cloud provider and the customer. It is important to know who is responsible for specific actions, services, and other operational details. Without a clear understanding of this delineation, customers or vendors may find themselves in a difficult situation if an issue arises, like a service outage or security breach. Therefore, it is in everyone’s interest to be clear about the responsibilities of design and operations.

Preventing misunderstandings and setting clear expectations of responsibilities is the goal of the Shared Responsibilities for Cloud Computing document. If you are trying to meet HITRUST certification standards, the HITRUST Customer Responsibilities Matrix spreadsheet identifies exactly what Microsoft and the customer are respectively responsible for managing.

Planning for security threats

Before creating complex systems, it is always advisable to perform a threat assessment, and a best practice is to create a threat model. It helps you to visualize the system and find the points of vulnerability in the proposed architecture. This leads to conversations about where the system may be improved and hardened against attacks.

Microsoft provides a Threat Modeling Tool that enables architects to identify and mitigate potential security issues early, when they are relatively easy and cost-effective to resolve. The blueprint includes a model to be used with the tool. This comprehensive threat model provides insights into the potential risks of the architecture and how they may be mitigated.

A standard approach to security threat analysis involves identifying the surface area of your system, creating a model of that surface area, identifying potential threats, mitigating them, validating each mitigation, and updating the threat model as you proceed. The following diagram highlights the major phases of this process.

The figure below shows four stages: diagram, identify, mitigate, and validate.


Figure 1: Security cycle

This process flow provides an iterative and collaborative approach to threat analysis that ultimately helps create a more robust and secure system architecture.

Regulatory compliance

Healthcare systems need to meet regulatory compliance standards. At installation, the blueprint complies with HIPAA and HITRUST requirements. Whitepapers are included to help you understand how to continue to meet these requirements. Let’s examine the whitepapers and other provided artifacts to see how they might help.

HITRUST certification

The Common Security Framework (CSF) from HITRUST is a security standard for healthcare systems. The HITRUST compliance review whitepaper was published to aid in ensuring the healthcare blueprint meets CSF regulations. The whitepaper states:

“This whitepaper constitutes a review of the Blueprint architecture and functionality with respect to HITRUST-certified customer environments, examining how specifically it can satisfy HITRUST CSF security requirements.”

The whitepaper helps organizations plan their cloud implementation and understand how to meet HITRUST CSF compliance.

HIPAA compliance built into the blueprint

Compliance with HIPAA standards is fundamental to any healthcare organization. The blueprint was created with HIPAA in mind, and includes a whitepaper covering the topic in detail.

The HIPAA compliance review whitepaper is similar to the HITRUST whitepaper in its intent: to help organizations reach regulatory compliance. This document guides readers through the architecture, a shared responsibility model, and deployment considerations for your solution. Protecting healthcare information (PHI), a fundamental practice in well-designed system architectures, is also covered in the whitepaper.

Recommended next steps

Use the supporting collateral below to prepare for your installation of the blueprint. The artifacts demonstrate how responsibilities, compliance, and security are established and how you can maintain them going forward.

Prepare for installation and ongoing maintenance with the following documents.

Collaboration

What other artifacts or considerations do you think would be helpful when putting healthcare systems into production? Your comments and recommendations are welcome below. I regularly post on technology in healthcare topics. Reach out and connect with me on LinkedIn or Twitter.


          Project xCloud, Microsoft's new game streaming platform      Cache   Translate Page      
Author: GAMEmag – Videogames. Project xCloud, first announced at E3 2018, will be based on Microsoft Azure streaming technology and will also make Xbox One and Windows 10 games available on tablets and smartphones. Their graphics will be rendered remotely and will therefore not require particularly powerful local hardware […]
          Systems Engineer – Data Analytics (Azure) - AlixPartners - Detroit, MI      Cache   Translate Page      
AlixPartners is a proud Silver award-winning Veteran Friendly Employer. AlixPartners is a results-driven global consulting firm that specializes in helping...
From AlixPartners - Tue, 11 Sep 2018 06:09:11 GMT - View all Detroit, MI jobs
          Microsoft has announced Project xCloud      Cache   Translate Page      

Today Microsoft announced Project xCloud in detail, a new cloud technology project for streaming video games. It aims to let users access the best gaming experiences from any device, at any time. To make this happen, Microsoft has combined the capabilities offered by Microsoft Azure and Microsoft Research to create a …

The article "Microsoft has announced Project xCloud" appeared first on IlVideogioco.com.


          #GooglePlusRefugees      Cache   Translate Page      
The Federation


Hello, Friends!
If, like me, you are saddened by the shutdown of Google+ for "consumers" (see below), may I interest you in some alternatives?


https://the-federation.info/

The Federation is a tracker of sorts for different federated social networks; from the list above you can see that there are a few you can use.

Remember to say "hi" with the "hashtag" #GooglePlusRefugee or #GPlusRefugee

          Comment on Install Sitecore 9 in Azure with an MSDN Subscription – Cheap*! by Matt Connolly      Cache   Translate Page      
Yes, XM will also use Application Insights and you will run into the same problem using an MSDN subscription.
          Comment on Install Sitecore 9 in Azure with an MSDN Subscription – Cheap*! by Jagan      Cache   Translate Page      
Hi Matt, greetings. Excellent article. I'm planning to deploy XM scaled using an MSDN subscription. Will I face a similar spending limit issue?
          Comment on "PO failed in the case of the mayor of Legionowo", by 60vito      Cache   Translate Page      
To Jacek NH: You like to generalize, so what do you say about the antics of Zbonikowski and Pięta, or today's remark by Mazurek, who called Tusk a "perekińczyk", that is, a traitor? Every now and then another pedophile priest comes to light. May I write "priests are just like that"? Then I'll describe Mazurek's boorishness as "PiS people are just like that". Both of those statements would be closer to the truth than your opinion of PO members, among whom sexism is not a widespread phenomenon.
          Comment on Navigation Update for Azure DevOps by Devin M.      Cache   Translate Page      
I switched back to the old layout solely for the backlogs view. In the existing version you can list all of the past, current, and future sprints along the left. Then it's super easy as a PM to drag work items around to other iteration paths. Drag a story and all the tasks underneath it come with it. In the new display I never saw this option and could only move from one sprint to another through a tiny list in the upper right. I use this functionality every day and it will be frustrating to lose it.
          Comment on Navigation Update for Azure DevOps by Aaron Bjork      Cache   Translate Page      
Yes, we're looking at some designs to improve this right now.
          Comment on Navigation Update for Azure DevOps by Aaron Bjork      Cache   Translate Page      
It's still there. Click the View options command in the menu (top right of the page). From there, you can turn on the Work Details pane, which has the capacity bars you're referring to. This pane is now available on all the sprint pages - Taskboard, Sprint Backlog, Capacity.
          Comment on Navigation Update for Azure DevOps by Ashraf M.      Cache   Translate Page      
One major issue about VSTS or AzDo is the fact that if you enter a bug and save, it just disappears from the screen. There is no quick view of "last work" done. This was available in TFS. There is so much room to make this tool much more user-friendly. Is there any plan for adding this?
          Comment on Navigation Update for Azure DevOps by Matthew Mitrik (MS)      Cache   Translate Page      
@Bo, @Chris - Adding on to Aaron's comment, the https://dev.azure.com/your_org_here/_pulls page is a refreshed version of the former My Pull Requests experience. It has the exact same capabilities to show all of your PRs, including those assigned to teams you're a member of, and it shows all of the same metadata about recent updates and votes. It's changed a lot since we initially enabled it in the new navigation, so check it out if you haven't looked at it recently. The second experience - showing the list of PRs in a panel - is a slimmed-down version of that page, including only the PRs you created or have been directly assigned. Coming soon to the panel is a "view more" link that will help you navigate to the _pulls page if you need to get to a PR that's not shown in the panel.
          TCS Named Azure Expert Managed Service Provider By Microsoft      Cache   Translate Page      


          Kubernetes on Google, Azure and AWS Compared      Cache   Translate Page      
          Software Developer - Jobline Resources Pte Ltd - Paya Lebar      Cache   Translate Page      
Familiar with cloud environments – AWS, Azure, Google Cloud – and with the principles of, and execution in, DevOps. The development environment consists of AWS Cloud and...
From Jobline Resources Pte Ltd - Wed, 05 Sep 2018 09:35:06 GMT - View all Paya Lebar jobs
          DevOps Engineer - Ethos BC Global - Singapore      Cache   Translate Page      
Public clouds such as AWS, Google Cloud or Azure. Financial and emerging markets....
From Ethos BC Asia - Tue, 09 Oct 2018 16:25:17 GMT - View all Singapore jobs
          Man used a $100,000 meteorite as a doorstop      Cache   Translate Page      
GRAND RAPIDS, Michigan, USA (AP) — A Michigan man recently learned that the rock he was using to prop his front door open is a meteorite valued at $100,000. The Smithsonian Museum and Central Michigan University said the more than 10-kilogram (nearly 23-pound) chunk of iron and nickel is the sixth-largest meteorite found in Michigan. David Mazurek said he took the rock to the university after reading reports in January that pieces of meteorites were selling for thousands of dollars. "I said to myself, 'Hold on. I wonder how much mine is worth,'" Mazurek said.
          Software Engineer II - Microsoft - Redmond, WA      Cache   Translate Page      
Azure storage already runs at Exascale (storing Exabytes of data) and we will be scaling our designs over the next decade to support Zettascale (storing...
From Microsoft - Thu, 04 Oct 2018 19:23:52 GMT - View all Redmond, WA jobs
          Principal Software Engineer - Microsoft - Redmond, WA      Cache   Translate Page      
Azure storage already runs at Exascale (storing Exabytes of data) and we will be scaling our designs over the next decade to support Zettascale (storing...
From Microsoft - Thu, 04 Oct 2018 19:23:52 GMT - View all Redmond, WA jobs
          Principal Program Manager - Microsoft - Redmond, WA      Cache   Translate Page      
Job Description: The Azure Big Data Team is looking for a Principal Program Manager to drive Azure and Office Compliance in the Big Data Analytics Services ...
From Microsoft - Sat, 28 Jul 2018 02:13:20 GMT - View all Redmond, WA jobs
           Microsoft Announces ‘Secret’ Cloud Capability, Closes In on Amazon       Cache   Translate Page      

Microsoft officials announced Tuesday that the company had achieved the required security levels to host secret U.S. military and intelligence data on its cloud computing network, Azure, and claimed they were on track to host "top secret" information soon. The developments put the computer giant in closer competition with cloud rival Amazon to handle the government's most delicate and important information, and perhaps to vie for the Pentagon's coveted, nearly $10 billion cloud contract known as JEDI.


          Azure Developer - Technologie Delan - Montréal, QC      Cache   Translate Page      
Very good working knowledge of Azure Machine Learning / HD Insight / Spark / Databricks or any other cloud environment;...
From Technologie Delan - Tue, 09 Oct 2018 17:09:06 GMT - View all Montréal, QC jobs
          Azure Developer - Technologie Delan - Montréal, QC      Cache   Translate Page      
The Azure Developer will be responsible for taking part in the design, deployment, and continuous improvement of the company's applications PLUS...
From Technologie Delan - Tue, 09 Oct 2018 17:09:05 GMT - View all Montréal, QC jobs
          Senior MSBI Consultant (M/F) – Montreal – Permanent – Up to 100k CAD - Elitesoft - Montréal, QC      Cache   Translate Page      
Canada USA US Montréal Québec Azure Machine Learning Azure HD insights MSBI Microsoft BI Business Intelligence SSIS SSRS SSAS Power BI Cubes OLAP... $60,000 - $100,000 a year
From Indeed - Fri, 28 Sep 2018 14:24:37 GMT - View all Montréal, QC jobs
          Big Data Developer - EXIA - Montréal, QC      Cache   Translate Page      
Experience with Microsoft Azure / HD Insight (Hortonworks); you are passionate about Big Data and familiar with the Pig and Hive ecosystems...
From EXIA - Tue, 18 Sep 2018 11:29:35 GMT - View all Montréal, QC jobs
          Azure Education Governance Series      Cache   Translate Page      
One of the challenges education customers face when starting to adopt Azure is understanding how they can effectively use services and still address the need for governance.  In our experience, customers that address governance early on in their public cloud journey have a higher probability of success while reducing compliance/security issues that typically arise from...
          Join us: Women Leading Government Cybersecurity – Oct. 24 in Washington, DC      Cache   Translate Page      
In support of National Cybersecurity Awareness Month, sponsored by the Department of Homeland Security, we invite you to RSVP and join us for a special edition Microsoft Azure Government Meetup, Women Leading Government Cybersecurity, Wednesday, Oct. 24 from 6 – 8:30 p.m., at 1776 in Washington, DC. All are welcome to this free event, which...
          Microsoft expands cloud service in push for $10 billion Pentagon contract      Cache   Translate Page      
Microsoft Corp said on Tuesday its expanded Azure cloud service to help government clients save data on their own servers would be available by the end of the first quarter of 2019, as it battles with Amazon.com for a $10 billion Pentagon contract.

          Alert Logic extends security to cover any container across multiple platforms      Cache   Translate Page      

Alert Logic’s update to the Network Intrusion Detection System (NIDS) for containers adds container log management and extends capabilities beyond Amazon Web Services (AWS) to Microsoft Azure, on-premises and hosted environments. Organizations gain a picture of their risk through visibility into any workload in any container, as well as the ability to collect, aggregate and search container log data for security and compliance. According to 451 Research Principal Analyst Jay Lyman, containers can enable faster …

The post Alert Logic extends security to cover any container across multiple platforms appeared first on Help Net Security.


          #bulgaria - alluredazure      Cache   Translate Page      
Another beautiful building in Sofia. Disappointed to find out, when we got there, that the baths were no longer in use! 🤦🏻‍♀️ . . . . . . . . . . . . #ohwell #stillpretty #sofia #bulgaria #museums #goldenhour #thermalbaths #beautifulbuildings #architecturephotography #sunsets #lastdaysofsummer #beautifulcity #fountain #placestovisit #european #cities #summerglow #travelgrams #blueskies #ig_europe #ig_travel #wonderful_places #cityscape #cityphotography #ig_sunset #ig_captures #travelideas
          Senior Hadoop Big Data Engineer      Cache   Translate Page      
Contract Senior Hadoop Big Data Engineer - Rate TBC - Dubai - Long-term contract. Key skills: Hadoop, Cloudera, Azure, Couchbase, Sqoop, Kafka, MDM, Data Lakes. Are you looking to gain commercial exposure to an international client? An opportunity to work for one of the most prestigious organisations in the world, based in Dubai, has arisen.
          Ubuntu does OpenStack      Cache   Translate Page      

Ubuntu does OpenStack

OpenStack, the open source cloud of choice for many businesses, has seen broad adoption across a large number of industries, from telco to finance, healthcare and more. It’s become something of a safe haven for highly regulated industries and for those looking to have a robust, secure cloud that is open source and enables them to innovate without breaking the bank.

For those of you that don’t know, Ubuntu does OpenStack.

In fact, Ubuntu is the #1 platform for OpenStack and the #1 platform for public cloud operations on AWS, Azure, and Google Cloud too, meaning that we know our stuff when it comes to building and operating clouds.

Which is great news, because Canonical, the company behind Ubuntu, helps to deliver OpenStack on rails, with consulting, training, enterprise support and managed operations that help your business focus on what matters most: your applications, not the infrastructure.

Canonical has a pretty compelling story here, and not in a marketing sense where we’ve manufactured an ‘aren’t we great’ story.

You see, Canonical and Ubuntu have been a part of the OpenStack platform from the start: a founding member, the most widely used distribution, used across the largest operators, the Intel reference platform for telco OpenStack, and much more.

It’s not just about the heritage value of saying ‘we were there from the beginning’, because there is no value in that unless you can back it up with consistently delivering a valuable product to customers.

With Canonical, customers are able to get the most efficient infrastructure as a service on-premises, according to 451 Research. There is also the added benefit of Ubuntu being the most popular OS for both public cloud and OpenStack, making it a perfect fit for multi-cloud operations.

Then, there is the added bonus of a fully managed OpenStack. Whether it is a shortage of skilled personnel, time constraints, or any other reason for not wanting to build and manage your own deployment, Canonical can do that as well with BootStack, and we’re happy to hand back the keys once you’re ready to take over.

So, if you’re in the market for an OpenStack cloud, just remember Ubuntu does OpenStack.

Start your journey today


          Software Developer      Cache   Translate Page      
AZ-Phoenix, Title: Software Developer Location: Phoenix, AZ Duration: 12 Months Job Description: Position description: The candidate shall have a thorough knowledge of MS SQL database, .Net platforms, and tools relating to the delivery of web applications. Candidate must have experience with website creation or content management systems, setup and management in the Amazon or Azure Cloud. A broad knowledge of
          TCS Named Azure Expert Managed Service Provider by Microsoft      Cache   Translate Page      
...indices such as the Dow Jones Sustainability Index (DJSI), MSCI Global Sustainability Index and the FTSE4Good Emerging Index. For more information, visit us at www.tcs.com. To stay up-to-date on TCS news in North America, follow @TCS_NA. For ...

          Microsoft releases October patches fixing 51 security issues      Cache   Translate Page      
Microsoft released its October security updates on Tuesday, fixing 51 issues ranging from simple spoofing attacks to remote code execution, across products including .NET Core, Azure, Device Guard, Internet Explorer, Microsoft Edge, Microsoft Exchange Server, Microsoft Graphics Component, Microsoft JET Database Engine, Microsoft Office, Microsoft Office SharePoint, Microsoft Scripting Engine, Microsoft Windows, Microsoft Windows DNS, Microsoft XML Core Services, SQL Server, Windows - Linux, Windows Hyper-V, Windows Kernel, Windows Media Player, and Windows Shell.
          Telecommute Technical Solution Lead in Seattle      Cache   Translate Page      
An IT and service company is searching for a person to fill their position for a Telecommute Technical Solution Lead in Seattle. Candidates will be responsible for the following:

  • Owning the technical vision for the software solution you lead
  • Gathering customer requirements, architecting solutions, and assigning work to teams
  • Providing client consulting that includes the maintenance and continued enhancement of the back end of our clients' web sites

Skills and requirements include:

  • Ability to utilize professional development opportunities like conferences and continuing training
  • 2+ years of cloud-based deployments, preferably Azure or AWS
  • 5+ years of implementing software architectures of your design
  • 2+ years of Team Lead or Lead Developer experience
  • 2+ years of experience with marketing and web Content Management Systems
  • 5+ years of experience with object-oriented design, software patterns, debugging and refactoring
  • 8+ years of .NET/ASP.NET development experience using C#, including Microsoft MVC, WCF, Web API, and WebForms
          May 2010 Chicago Architects Group Meeting      Cache   Translate Page      

Originally posted on: http://tostringtheory.com/archive/2010/04/21/may-2010-chicago-architects-group-meeting.aspx

CAG

The Chicago Architects Group will be holding its next meeting on May 18th.  Please come and join us and get involved in our architect community.

Register

Presenter: Scott Seely 
Topic: Azure For Architects    
 
Location: TechNexus
200 S. Wacker Dr., Suite 1500
Room A/B
Chicago, IL 60606
Time: 5:30 - Doors open at 5:00


          Query Application Insights REST API To Create Custom Notifications      Cache   Translate Page      

Originally posted on: http://tostringtheory.com/archive/2017/08/04/query-application-insights-rest-api-to-create-custom-notifications.aspx


Application Insights is one of those tools that has been around for a number of years now, but is finally getting understood as more companies move to Azure as a cloud solution.  It has become an amazing tool for monitoring the performance of your application, but it can also work as a general logging platform as I have posted before.

Now that you are capturing all this information, how can you leverage it?  Going to the Azure portal whenever you want an answer is time consuming.  It would be great if you could automate this process.  Of course there are a number of metrics that you can create alerts for directly via the portal, but what if you want a non-standard metric or want to do something besides just send an alert?

Fortunately Microsoft has a REST API in beta for Application Insights.  It allows you to check standard metrics as well as run custom queries as you do in the Analytics portal.  Let’s explore how to use this API.

In this post I will show how to create a demo that implements an Azure Function which calls the Application Insights REST API and then sends the results out using SendGrid.  I created the examples with the VS2017 Preview and the new Azure Functions templates.

Generate Custom Events

First we need some data to work with.  The simplest way is to leverage the TrackEvent and TrackException methods of the Application Insights API.  In order to do this you first need to set up a TelemetryClient.  The code below is part of my class-level variables.

        // Pull the instrumentation key from app settings and point the TelemetryClient at it
        private static string appInsightsKey = System.Environment.GetEnvironmentVariable("AppInsightKey", EnvironmentVariableTarget.Process);
        private static TelemetryClient telemetry = new TelemetryClient();
        private static string key = TelemetryConfiguration.Active.InstrumentationKey = appInsightsKey;

After that it is simple to call the TrackEvent method on the TelemetryClient object to log an activity in your code (be aware it may take 5 minutes for an event to show up in Application Insights).

            telemetry.TrackEvent($"This is a POC event");

Create a VS2017 Function Application

I will have another post on the details in the future, but if you have Visual Studio 2017 Preview 15.3.0 installed you will be able to create an Azure Functions project.


Right click the project, select the New Item context menu option, and then choose Azure Function.


On the New Azure Function dialog select TimerTrigger and leave the remaining options as default.


Call Application Insights REST API

Once there are events in the customEvents collection we can write a query and execute it against the Application Insights REST API.  To accomplish this the example uses a simple HttpClient call.  The API page for Application Insights can be found here and contains the URLs and formats for each call type.  We will be using the Query API scenario, which is set up with a couple of variables.

        // Query endpoint: {0} is the Application Insights application ID, {1} is the query
        private const string URL = "https://api.applicationinsights.io/beta/apps/{0}/query?query={1}";
        // Count the POC events recorded during the last 20 minutes
        private const string query = "customEvents | where timestamp >= ago(20m) and name contains \"This is a POC event\" | count";

The call to the service is a common pattern using the HttpClient as shown below.  Add this to the Run method of your new function.

            HttpClient client = new HttpClient();
            client.DefaultRequestHeaders.Accept.Add(
                new MediaTypeWithQualityHeaderValue("application/json"));
            // The API key is generated on the Application Insights API Access blade
            client.DefaultRequestHeaders.Add("x-api-key", appInsightsApiKey);
            var req = string.Format(URL, appInsightsId, query);
            HttpResponseMessage response = client.GetAsync(req).Result;

Process Results

After we have a result we can deserialize the JSON using JSON.NET and send it to our support team via SendGrid.  You will have to add the NuGet package Microsoft.Azure.WebJobs.Extensions.SendGrid.

Modify the signature of your function’s Run method to match the code sample shown here.  In this example “message” is the function’s output variable, bound to SendGrid through the SendGrid attribute.

        public static void Run([TimerTrigger("0 */15 * * * *")]TimerInfo myTimer, TraceWriter log, [SendGrid(ApiKey = "SendGridApiKey")]out Mail message)

We will also need a structure to deserialize the returned JSON message into. If you look at the message itself it can appear rather daunting but it breaks down into the following class structure.  Create a new class file and replace the default class with this code.

    public class Column
    {
        public string ColumnName { get; set; }
        public string DataType { get; set; }
        public string ColumnType { get; set; }
    }

    public class Table
    {
        public string TableName { get; set; }
        public List<Column> Columns { get; set; }
        public List<List<object>> Rows { get; set; }
    }

    public class RootObject
    {
        public List<Table> Tables { get; set; }
    }
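For reference, the JSON returned for the count query looks roughly like the following (values illustrative); this is the shape the classes above mirror.

    {
      "Tables": [
        {
          "TableName": "Table_0",
          "Columns": [
            { "ColumnName": "Count", "DataType": "Int64", "ColumnType": "long" }
          ],
          "Rows": [ [ 42 ] ]
        }
      ]
    }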

The last code example below performs the deserialization and creates the SendGrid email message.  Insert this into the Run method after the HttpClient call we previously added.

                string result = response.Content.ReadAsStringAsync().Result;
                log.Info(result);

                // Deserialize the query response into the class structure above
                RootObject aiResult = JsonConvert.DeserializeObject<RootObject>(result);

                // The count query returns a single row with a single column
                string countString = aiResult.Tables[0].Rows[0][0].ToString();

                string recipientEmail = System.Environment.GetEnvironmentVariable("recipient", EnvironmentVariableTarget.Process);
                string senderEmail = System.Environment.GetEnvironmentVariable("sender", EnvironmentVariableTarget.Process);

                var messageContent = new Content("text/html", $"There were {countString} POC records found");

                message = new Mail(new Email(senderEmail), "App Insights POC", new Email(recipientEmail), messageContent);

Publish your solution to an Azure Function App by downloading the Function App’s publish profile and using the VS2017 project’s publish options.  You will also need to define the application settings referred to in the code so that they are appropriate for your environment.  At that point you will be able to observe the results of your efforts.
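For local testing, those app settings can live in the Function App’s local.settings.json (in Azure, the same names go under Application settings).  Here is a sketch with placeholder values; the names for the Application Insights application id and API key are my own assumptions, since their declarations aren’t shown above.

    {
      "IsEncrypted": false,
      "Values": {
        "AzureWebJobsStorage": "<storage connection string>",
        "AppInsightKey": "<App Insights instrumentation key>",
        "AppInsightsAppId": "<App Insights application id>",
        "AppInsightsApiKey": "<App Insights API key>",
        "SendGridApiKey": "<SendGrid API key>",
        "recipient": "support@example.com",
        "sender": "alerts@example.com"
      }
    }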

Summary

This post demonstrates how a small amount of code can give you the ability to leverage Application Insights for more than just out-of-the-box statistics alerts.  The approach is flexible enough to be used for reporting on types of errors and for monitoring whether subsystems remain available.  Combining the features within Azure’s cloud offerings gives you capabilities that would cost much more in development time and resources if they were built on premises.

My only real problem with this approach is that I would prefer to access values in the result by name rather than by index, because indexing makes the code less readable and more brittle to changes.
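One way around that, sticking with the classes above, is a small helper that resolves the index from the column metadata the API already returns.  The extension method and the "Count" column name are my own illustration, not part of the original sample.

    using System;

    public static class TableExtensions
    {
        // Look up a row value by column name using the Columns metadata,
        // instead of hard-coding positional indexes
        public static object Value(this Table table, int row, string columnName)
        {
            int index = table.Columns.FindIndex(
                c => c.ColumnName.Equals(columnName, StringComparison.OrdinalIgnoreCase));
            if (index < 0)
                throw new ArgumentException($"Column '{columnName}' not found");
            return table.Rows[row][index];
        }
    }

    // Usage, replacing the positional lookup in the Run method:
    // string countString = aiResult.Tables[0].Value(0, "Count").ToString();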

Try these examples out and see what other scenarios they apply to in your business.


          MS announces 'Project xCloud,' an Xbox game streaming service      Cache   Translate Page      
Microsoft (MS) has announced 'Project xCloud,' a service for streaming and playing Xbox games on other devices. According to Kitguru, Project xCloud runs on MS Azure servers distributed across 54 regions worldwide. MS will initially focus on smartphone and tablet streaming and is preparing the ability to play games with a Bluetooth-enabled Xbox controller or via touch screen. MS.. Read the article

          ML.NET 0.6 released, Microsoft's cross-platform machine learning framework for .NET      Cache   Translate Page      

ML.NET 0.6 has been released. ML.NET is a cross-platform, open-source machine learning framework that aims to help .NET developers get up to speed with machine learning faster.

ML.NET lets .NET developers build their own models and inject custom ML into their applications. They need no expertise in developing or tuning machine learning models; everything can be done in .NET.

Highlights of the ML.NET 0.6 update:

  • A new API for building and using machine learning models

    This release carries the first iteration of the new ML.NET API, designed to make machine learning easier and more powerful. Details

  • The ability to score pre-trained ONNX models. Details

  • Improved model prediction performance

  • Other improvements:

    • improvements to ML.NET TensorFlow scoring

    • more consistency with the .NET type-system

    • having a model deployment suitable for serverless workloads like Azure Functions

For more details, see the release announcement:

https://blogs.msdn.microsoft.com/dotnet/2018/10/08/announcing-ml-net-0-6-machine-learning-net/


          DevOps Engineer (Azure) - 18209191 - CTG - Webster, NY      Cache   Translate Page      
Bachelor’s degree in Information Systems, Computer Science, or equivalent experience. W2 local candidates only!!!... $63 an hour
From CTG - Tue, 21 Aug 2018 22:12:27 GMT - View all Webster, NY jobs
          Enhanced Storage Management with Data Deduplication      Cache   Translate Page      

Date: Thursday, November 01, 2018

Time: 12:00 PM Eastern Daylight Time

Duration: 1 hour

Keeping up with today’s ever-growing storage demands is one of the IT administrator’s biggest challenges. According to IDC, data is doubling every year, and it will continue to do so for the foreseeable future. In this webcast, Michael Otey, senior contributing editor for IT Pro Today, will discuss some of today’s main storage challenges and the technologies that businesses use to address them.

We’ll also dig into deduplication which is one of the most powerful technologies for managing data growth. You’ll learn about the different types of deduplication solutions and the pros and cons of each approach.

Adrian Moir of Quest will show you how QoreStor 5.0’s software-defined secondary storage platform can help you solve these storage problems -- without expensive backup appliances. You’ll see how QoreStor’s advanced deduplication can reduce your backup and storage requirements by an average of 20:1, speeding up your backups as well as providing cloud integration.

Register Now!

If you have already registered, click here to access

Speakers:


Enhanced Storage Management with Data Deduplication
Michael Otey is a senior contributing editor for ITPro Today and is president of TECA, a technical writing, content creation, software development and consulting company in Portland, Oregon. Michael is a former SQL Server Microsoft MVP. He covers data center, SQL Server, Windows Server, virtualization, hardware, storage, Azure, the hybrid cloud, systems management, VMware vSphere, containers, and PowerShell.
Enhanced Storage Management with Data Deduplication
Adrian Moir, Sr. Consultant, Product Management for Quest Data Protection.

With a background in electrical and electronic engineering, Adrian has over 30 years’ experience in the IT industry across both corporate and channel businesses. In recent years before Quest Software (formerly Dell Software), he architected and delivered web-centric digital asset management solutions for the pre-media marketplace, and more recently, as the EMEA Pre-Sales Manager for the Dell Software data protection team and EMEA Technical Director for BakBone Software, he ensured delivery of presales activities throughout the region. Adrian currently works within the Data Protection product team at Quest, where he continues to drive the technology portfolio and work as a product evangelist.


          Episode 295: Shift+F10 and Done | TechSNAP 295      Cache   Translate Page      

A researcher accidentally roots Microsoft Azure’s Red Hat Update Infrastructure, a newly discovered router flaw is seen in the wild & Windows 10 gets hacked by holding down the Shift key.

Plus your questions, our answers & a great round up!

          Technical Account Manager - Snowflake Computing - Alwal, Hyderabad, Telangana      Cache   Translate Page      
Amazon AWS, Microsoft Azure, OpenStack). Snowflake’s mission is to enable every organization to be data-driven with instant elasticity, secure data sharing and...
From Snowflake Computing - Tue, 11 Sep 2018 22:11:33 GMT - View all Alwal, Hyderabad, Telangana jobs
          Sr. Solution Architect - Snowflake Computing - Alwal, Hyderabad, Telangana      Cache   Translate Page      
Amazon AWS, Microsoft Azure, OpenStack, etc.). Snowflake’s mission is to enable every organization to be data-driven with instant elasticity, secure data...
From Snowflake Computing - Tue, 11 Sep 2018 22:11:32 GMT - View all Alwal, Hyderabad, Telangana jobs
          Partner Solutions Architect - Snowflake Computing - Alwal, Hyderabad, Telangana      Cache   Translate Page      
Amazon AWS, Microsoft Azure, OpenStack, etc.). Snowflake’s mission is to enable every organization to be data-driven with instant elasticity, secure data...
From Snowflake Computing - Fri, 31 Aug 2018 04:11:36 GMT - View all Alwal, Hyderabad, Telangana jobs
          Technical Specilist - Brillio - Redmond, WA      Cache   Translate Page      
Strong experience in Azure ecosystem such as HDInsight, Azure Data Factory, Azure Data Lake and SQL DW. Job description would be as follows:....
From Brillio - Tue, 25 Sep 2018 23:59:03 GMT - View all Redmond, WA jobs
          MS BI Developer - Advantine Technologies - Redmond, WA      Cache   Translate Page      
Experience in Azure ecosystem such as HDInsight, Azure Data Factory, Azure Data Lake and SQL DW is OPTIONAL, but NOT mandatory Qualifications....
From Advantine Technologies - Wed, 19 Sep 2018 17:21:18 GMT - View all Redmond, WA jobs
          American Guide Series - Texas State Cemetery - Austin Texas      Cache   Translate Page      


The TEXAS STATE CEMETERY, Navasota St. between E. 7th and E. 11th Sts., extending to Comal St., is the burial place of many distinguished Texans. Stephen F. Austin's grave is surmounted by a bronze statue by Pompeo Coppini. It occupies the highest knoll on the grounds. The reclining figure of General Albert Sidney Johnston was done by Elisabet Ney. Coppini did the bronze figure of Johanna Troutman, a Georgian, who made the Lone Star flag of white silk with an azure star that was brought to Texas by the Georgia Battalion in December, 1835. The grave of W.A. (Big Foot) Wallace, Indian scout and Texas Ranger, is also in the cemetery.

- Texas: A Guide to the Lone Star State, 1940 -- pg. 175





The Texas State Cemetery (TSC) is a cemetery located on about 22 acres (8.9 ha) just east of downtown Austin, the capital of the U.S. state of Texas. Originally the burial place of Edward Burleson, Texas Revolutionary general and Vice-President of the Republic of Texas, it was expanded into a Confederate cemetery during the Civil War. Later it was expanded again to include the graves and cenotaphs of prominent Texans and their spouses.

The cemetery is divided into two sections. The smaller one contains around 900 graves of prominent Texans, while the larger has over 2,000 marked graves of Confederate veterans and widows. There is room for 7,500 interments; the cemetery is about half full, after including plots chosen by people who are eligible for burial.

- Texas State Cemetery Wikipedia Entry



          Azure Sales and Marketing Engineer - Ingram Micro Cloud - Bellevue, WA      Cache   Translate Page      
Ingram Micro Inc. If so, join the Ingram Micro Cloud team - where rainmakers thrive. Act as a marketing resource and mentor for Ingram Micro Cloud sales teams...
From Ingram Micro Cloud - Fri, 28 Sep 2018 07:14:09 GMT - View all Bellevue, WA jobs
          Senior RTL Engineer - CSI / Azure - Cloud Server Infrastructure - Microsoft - Redmond, WA      Cache   Translate Page      
2+ years' experience in logic design, RTL, ASICs or FPGAs. Seeking an RTL engineer with prior experience in AI acceleration or hardware algorithms, examples of which...
From Microsoft - Sun, 26 Aug 2018 09:42:43 GMT - View all Redmond, WA jobs
          Cloud Application Developer (Local to TX preferred)      Cache   Translate Page      
TX-Plano, Must To Have Skills: - They need to be familiar with cloud APIs such as OpenStack, Puppet, Chef, etc. - They also need to have some real-world experience in deploying and supporting an infrastructure in a cloud based on AWS, Azure, Google, CloudFoundry etc. - Development experience in C#, Java and Python. Required Skills: - General programming best practices. - Specific knowledge of C#, Java and H
          AWS SME/Architect      Cache   Translate Page      
TX-Plano, Plano, Texas Skills : • Looking For AWS SME/Architect • Other cloud knowledge including Cloud, Azure or VM ware Cloud will be an advantage. • Track record of implementing AWS services in a variety of distributed computing, enterprise environments Description : Location: Plano TX/ Marquette MI Job Title: AWS SME Duration: 06 to 12+ months (High possibilities of extension)
          Business Management Software For Your Gulf Business      Cache   Translate Page      

businessexpertsgulf posted a photo:

Business Management Software For Your Gulf Business

It is a part of Microsoft Business Solutions. It can be coupled extensively with other Microsoft applications like Microsoft Office, Skype, Outlook, and Azure. The best feature of Microsoft Dynamics is that it is scalable for small, medium and large businesses. That means your software solution will grow along with your Gulf Business.


          What are Durable Functions?      Cache   Translate Page      

Oh no! Not more jargon! What exactly does the term Durable Functions mean? Durable Functions have to do with serverless architectures. They're an extension of Azure Functions that lets you write stateful executions in a serverless environment.

Think of it this way. There are a few big benefits that people tend to focus on when they talk about Serverless Functions:

- They’re cheap
- They scale with your needs (not necessarily, but that’s the default for many services)
- They allow you to write event-driven code

Let’s talk about that last one for a minute. When you can write event-driven code, you can break your operational needs down into smaller functions that essentially say: when this request comes in, run this code. You don’t mess around with infrastructure; that’s taken care of for you. It’s a pretty compelling concept.

In this paradigm, you can break your workflow down into smaller, reusable pieces which, in turn, can make them easier to maintain. This also allows you to focus on your business logic because you’re boiling things down to the simplest code you need to run on your server.

So, here’s where Durable Functions come in. You can probably guess that you’re going to need more than one function to run as your application grows in size and has to maintain more state. And, in many cases, you’ll need to coordinate them and specify the order in which they should be run for them to be effective. It's worth mentioning at this point that Durable Functions are a pattern available only in Azure. Other services have variations on this theme. For example, the AWS version is called Step Functions. So, while we're talking about something specific to Azure, it applies more broadly as well.

Durable in action, some examples

Let’s say you’re selling airline tickets. You can imagine that as a person buys a ticket, we need to:

- check for the availability of the ticket
- make a request to get the seat map
- get their mileage points if they’re a loyalty member
- give them a mobile notification if the payment comes through and they have an app installed/have requested notifications

(There’s typically more, but we’re using this as a base example)

Sometimes these will all be run concurrently, sometimes not. For instance, let’s say they want to purchase the ticket with their mileage rewards. Then you’d have to first check the awards, and then the availability of the ticket. And then do some dark magic to make sure no customers, even data scientists, can actually understand the algorithm behind your rewards program.

Orchestrator functions

Whether you’re running these functions at the same moment, running them in order, or running them according to whether or not a condition is met, you probably want to use what’s called an orchestrator function. This is a special type of function that defines your workflows and, as you might expect, orchestrates the other functions. They automatically checkpoint their progress whenever a function awaits, which is extremely helpful for managing complex asynchronous code.

Without Durable Functions, you run into a problem of disorganization. Let’s say one function relies on another to fire. You could call the other function directly from the first, but whoever maintains the code would have to step into each individual function and keep in mind how it’s being called, while maintaining each one separately whenever changes are needed. It's pretty easy to get into something that resembles callback hell, and debugging can get really tricky.

Orchestrator functions, on the other hand, manage the state and timing of all the other functions. The orchestrator function is kicked off by an orchestration trigger and supports both inputs and outputs. You can see how this would be quite handy! You’re managing the state in a comprehensive way all in one place. Plus, the serverless functions themselves can keep their jobs limited to what they need to execute, allowing them to be more reusable and less brittle.
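The samples below are JavaScript, but Durable Functions also supports C#. For reference, here is a minimal sketch of an orchestrator using the 2018-era C# Durable Functions extension; the function and activity names are invented for illustration.

    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;

    public static class TicketOrchestration
    {
        // The orchestration trigger starts this function, and progress is
        // checkpointed automatically at every await
        [FunctionName("TicketOrchestrator")]
        public static async Task<bool> Run(
            [OrchestrationTrigger] DurableOrchestrationContext context)
        {
            // Orchestrations accept inputs...
            var request = context.GetInput<string>();

            // ...and coordinate activity functions, awaiting each result
            bool available = await context.CallActivityAsync<bool>("CheckAvailability", request);
            if (!available) return false;

            await context.CallActivityAsync("ReserveSeat", request);
            await context.CallActivityAsync("NotifyCustomer", request);

            // ...and return an output to whatever started the orchestration
            return true;
        }
    }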

Let’s go over some possible patterns. We’ll move beyond just chaining and talk about some other possibilities.

Pattern 1: Function chaining

This is the most straightforward implementation of all the patterns. It's literally one orchestrator controlling a few different steps. The orchestrator triggers a function, the function finishes, the orchestrator registers it, and then the next one fires, and so on. Here's a visualization of that in action:

See the Pen Durable Functions: Pattern #1- Chaining by Sarah Drasner (@sdras) on CodePen.

Here's a simple example of that pattern with a generator.

const df = require("durable-functions")
module.exports = df(function*(ctx) {
  const x = yield ctx.df.callActivityAsync('fn1')
  const y = yield ctx.df.callActivityAsync('fn2', x)
  const z = yield ctx.df.callActivityAsync('fn3', y)
  return yield ctx.df.callActivityAsync('fn3', z)
})

I love generators! If you're not familiar with them, check out this great talk by Bodil on the subject.

Pattern 2: Fan-out/fan-in

If you have to execute multiple functions in parallel and need to fire one more function based on the results, a fan-out/fan-in pattern might be your jam. We'll accumulate the results returned from the first group of functions and use them in the last function.

See the Pen Durable Functions: Pattern #2, Fan Out, Fan In by Sarah Drasner (@sdras) on CodePen.

const df = require('durable-functions')
module.exports = df(function*(ctx) {
  const tasks = []
  // items to process concurrently, added to an array
  const taskItems = yield ctx.df.callActivityAsync('fn1')
  taskItems.forEach(item => tasks.push(ctx.df.callActivityAsync('fn2', item)))
  yield ctx.df.task.all(tasks)
  // send results to last function for processing
  yield ctx.df.callActivityAsync('fn3', tasks)
})

Pattern 3: Async HTTP APIs

It's also pretty common that you'll need to make a request to an API for an unknown amount of time. Many factors, such as distance and the number of requests being processed, can make the duration unknowable. There are situations that require some of this work to be done first, asynchronously but in tandem, and then another function to be fired when the first few API calls are completed. Async/await is perfect for this task.

See the Pen Durable Functions: Pattern #3, Async HTTP APIs by Sarah Drasner (@sdras) on CodePen.

const df = require('durable-functions')
module.exports = df(async ctx => {
  const fn1 = ctx.df.callActivityAsync('fn1')
  const fn2 = ctx.df.callActivityAsync('fn2')
  // the responses come in and wait for both to be resolved
  await fn1
  await fn2
  // then this one is called
  await ctx.df.callActivityAsync('fn3')
})

You can check out more patterns here! (Minus animations.)

Getting started

If you'd like to play around with Durable Functions and learn more, there's a great tutorial here, with corresponding repos to fork and work with. I'm also working with a coworker on another post, coming soon, that will dive into one of these patterns!

Alternative patterns

Azure offers a pretty unique thing in Logic Apps, which lets you design workflows visually. I'm usually a code-only-no-WYSIWYG lady myself, but one of the compelling things about Logic Apps is that they have ready-made connectors for services like Twilio and SendGrid, so you don't have to write that slightly annoying, mostly boilerplate code. They can also integrate with your existing functions, so you can abstract away just the parts that connect to middle-tier systems and write the rest by hand, which can really help with productivity.


          'Automated machine learning' arrives in Azure ML       Cache   Translate Page      

Hello, this is Naoki Sato. Following the special 'Ignite 2018' extra edition that rounded up the keynote announcements, 'Weekly Azure' is covering the Azure updates from Ignite 2018 in four installments: infrastructure, app development, data, and AI/IoT. This installment covers AI/IoT.



          Azure Cosmos DB and Azure SQL Database gain enhancements       Cache   Translate Page      

Hello, this is Naoki Sato. Following the special 'Ignite 2018' extra edition that rounded up the keynote announcements, 'Weekly Azure' is covering the Azure updates from Ignite 2018 in four installments: infrastructure, app development, data, and AI/IoT. This installment covers data.



          Software Engineer II - Microsoft - Redmond, WA      Cache   Translate Page      
Azure storage already runs at Exascale (storing Exabytes of data) and we will be scaling our designs over the next decade to support Zettascale (storing...
From Microsoft - Thu, 04 Oct 2018 19:23:52 GMT - View all Redmond, WA jobs
          Principal Software Engineer - Microsoft - Redmond, WA      Cache   Translate Page      
Azure storage already runs at Exascale (storing Exabytes of data) and we will be scaling our designs over the next decade to support Zettascale (storing...
From Microsoft - Thu, 04 Oct 2018 19:23:52 GMT - View all Redmond, WA jobs
          Principal Program Manager - Microsoft - Redmond, WA      Cache   Translate Page      
Job Description: The Azure Big Data Team is looking for a Principal Program Manager to drive Azure and Office Compliance in the Big Data Analytics Services ...
From Microsoft - Sat, 28 Jul 2018 02:13:20 GMT - View all Redmond, WA jobs
          Code VBCON48007 Apartment for Sale in Bogotá, Mazurén      Cache   Translate Page      
550000000
Apartment for sale on a fourth floor with elevator, located in a very pretty part of Mazurén; it has many public parks, plenty of green areas, and excellent access roads near Avenida Boyacá and shopping centers such as MultiDrive...
1 bedroom, 1 bathroom, 85 m², 6,470,588 COP/m², elevator, gym, garage, fitted kitchen
Sat, 06 Oct 2018 09:31:09 -0400
          Microsoft Azure      Cache   Translate Page      

businessexpertsgulf posted a photo:

Microsoft Azure

Get ahead in the business with all creativity and innovation at its peak. Get in touch with the leading experts in the computing world, Business Experts Gulf. We at Business Experts Gulf provide our customers with smart solutions in the world of the database, computing, analytics, networking, storage and much more. Microsoft Azure is one such cloud service with impressive storage size.
To get Microsoft Azure Solutions in Abu Dhabi or Microsoft Azure Solutions in Dubai, give us a call at +97144214909.


          Azure Developer - Technologie Delan - Montréal, QC      Cache   Translate Page      
Very good working knowledge of Azure Learning Machines / HD insight / Spark / Databricks or any other cloud environment;...
From Technologie Delan - Tue, 09 Oct 2018 17:09:06 GMT - View all Montréal, QC jobs
          Azure Developer - Technologie Delan - Montréal, QC      Cache   Translate Page      
The Azure Developer will be responsible for taking part in the design, deployment, and continuous improvement of the company's applications PLUS...
From Technologie Delan - Tue, 09 Oct 2018 17:09:05 GMT - View all Montréal, QC jobs
          Senior MSBI Consultant (M/F) – Montréal – Permanent – Up to 100k CAD - Elitesoft - Montréal, QC      Cache   Translate Page      
Canada USA US Montréal Québec Azure Machine Learning Azure HD insights MSBI Microsoft BI Business Intelligence SSIS SSRS SSAS Power BI Cubes OLAP... $60,000 - $100,000 a year
From Indeed - Fri, 28 Sep 2018 14:24:37 GMT - View all Montréal, QC jobs
          Big Data Developer - EXIA - Montréal, QC      Cache   Translate Page      
Experience with Microsoft Azure / HD Insight (Hortonworks); you are passionate about Big Data and familiar with the Pig and Hive ecosystems...
From EXIA - Tue, 18 Sep 2018 11:29:35 GMT - View all Montréal, QC jobs
          [Original] What should programmers do if they've picked the wrong field?      Cache   Translate Page      

Which technical field should a programmer choose to earn the highest returns?

This article takes a close look at the five hottest fields of 2018, offering an in-depth analysis of industry conditions, salary levels, and concrete skill requirements, in the hope of giving some guidance to those of you worried about "picking the wrong line of work."

The seven-day National Day Golden Week is over in the blink of an eye, and what has receded along with the travel fever is the wave of home buying and property speculation.

Word on the street is that from October 15 China's central bank will lower the reserve requirement ratio for some financial institutions, releasing about 750 billion yuan of additional funds on top of the cut. For developers in the tech world, the most direct read on this big financial move is that the much-criticized housing prices should stabilize.

Since ancient times, a home has been the foundation on which people settle down. For the broad mass of developers, buying a home is an unavoidable topic, yet under lofty housing prices the calls to "escape Beijing, Shanghai, and Guangzhou" and "flee the first-tier cities" never stop. Matching the high prices are the labels developers cannot shake off: "well paid," "996," "high pressure," "unkempt."

So what does the real state of developers actually look like?

A great many developer survey reports are published every year, with subjects ranging from holistic portraits of developers to analyses of niche segments. Below, combining the latest technology trends, we present the most authentic picture of Chinese developers across five concrete fields: big data, cloud computing, AI, blockchain, and IoT.

"Rising with the tide": talent demand and pay for big data developers

 

In the era of big data, the value embedded in data is beyond doubt, and it shows up in government, enterprise, and scientific research alike. In fact it has been elevated to national strategy: China, the United States, the European Union and others have written big data into their plans, while tech giants such as Microsoft, Google, Baidu, and Amazon follow close behind, treating big data technology as a major stake in their future.

The hackers who line up, wave after wave, to steal information offer a glimpse of this too.

In this year alone there have been multiple large data-leak incidents: as many as 50 million users' information was leaked from Facebook and used to manipulate voters; WiFi Master Key was reported to have harvested the privacy of 900 million users for marketing and profiteering; QQ Browser and Baidu's mobile input method were suspected of covertly activating cameras and auto-recording audio; nearly 10 million records from AcFun were leaked publicly; 123 million records covering all hotels under the Huazhu group were leaked and openly offered for sale...

This shows how valuable data is; the importance of big data technology goes without saying.

CSDN's 2017 survey data shows that 78% of companies are carrying out big-data-related development and applications. For about 57% of companies, big data is still mostly applied to statistical analysis, reports, and data visualization; given that the industry as a whole is not yet mature and companies' needs are still vaguely defined, it is only natural that deeper applications have not yet spread. Still, compared with 2015 and 2016 the proportion has improved enormously.

Under these circumstances, talent demand and pay for big data developers have naturally risen with the tide.

Statistics from the Data Analysis Professional Committee of the China General Chamber of Commerce indicate that China's future shortfall of basic data-analysis talent will reach 14 million, and more than 60% of the positions that companies like BAT recruit for are big data roles. A LinkedIn report shows that across big data roles the supply index for data-analysis talent is a mere 0.05, making it extremely scarce, and data analysts also job-hop the fastest, every 19.8 months on average... Taking Beijing's 2017 pay levels for big data developers as an example, more than half earn over 30K yuan a month, with an average of 30,230 yuan.

For developers who want to plunge into big data, or who are already deep in the trenches, the best advice is to pick a precise entry point, say platform building, ETL, offline processing, or real-time data analysis, and then expand your knowledge into wider territory; that may make the road of data development a steadier one.

44% say database management is the highest-paid cloud computing skill

 

In the Gartner Hype Cycle published in 2017, cloud computing no longer sits among the "emerging technologies"; it has moved into the fast lane. When Amazon launched the first cloud computing service in March 2006 the outside world was unconvinced, but as cloud computing enters its second decade the global market has settled into steady growth, gradually moving away from plain "virtualization or network services" to become an independent, mature, and widely adopted IT infrastructure service.

Containers, microservices, DevOps, and other technologies keep pushing the transformation of the cloud, and the tech giants have raised it to strategic height one after another: Amazon, Google, Microsoft, Alibaba Cloud, Tencent Cloud and the rest are building data centers at a frantic pace while "converging" around customers. Instagram migrated from Amazon AWS to Facebook's own platform; Zynga moved from its own platform to AWS; Apple spread part of its business from AWS to Google Cloud to share out risk; Verizon abandoned Microsoft Office and returned to Google G Suite...

All these moves show that the boundaries of cloud computing are blurring by the day, and deep convergence at the business level looks like the general trend. Meanwhile, China's cloud market is also in a phase of rapid growth. CSDN's survey data shows 83% of companies are using cloud services, and fewer than one in ten pay little attention to the cloud. In concrete applications, virtual machines, network storage, and load balancing are fairly widespread, and Docker and OpenStack are the two mainstream frameworks for current cloud platform deployments.

From the cloud developer's point of view, as companies migrate their infrastructure into the public cloud, demand for cloud specialists keeps growing. Last year Rackspace published "The Cost of Cloud Expertise," a study conducted together with academics from the London School of Economics and with Vanson Bourne.

The survey found that nearly three quarters of IT decision makers (71%) believe their organizations have lost revenue for lack of cloud expertise, amounting to 5% of global cloud revenue. Because the talent gap is so large, the report notes, IT teams need to spend an extra five weeks to complete a hire.

So which cloud skills are most sought after? Rackspace's respondents singled out several that enterprises urgently need: database management, which 44% called the highest-paid cloud skill and 24% the hardest to hire for; cloud security, as the steady stream of breaches keeps pushing up demand for security specialists; service management, covering provisioning, monitoring, and orchestrating an organization's use of cloud tooling; migration project management, which 36% of respondents found extremely difficult to hire for; and automation, since as more organizations adopt DevOps, more of them use automation tools to handle the day-to-day configuration and management of cloud and in-house data center infrastructure. Talent in cloud-native application development, Microsoft Azure, testing, DevOps, and related areas is increasingly courted as well.

Even so, security remains the biggest reservation about cloud services. Among the internet-born cloud providers, giants like Alibaba and Tencent are investing heavily in security; whether the other players can keep up is still unknown.

AI software engineer and algorithm engineer are the most sought-after roles

 

According to McKinsey's 2017 report "Artificial Intelligence: The Next Digital Frontier," robotics and speech recognition, the two most popular investment areas, have already absorbed between $20 billion and $30 billion of capital from the global tech giants; over the past year the AI hype has burned especially hot.

Take the BAT-style internet companies: Baidu, the first tech company to proclaim itself "All in AI," has concentrated on the conversational AI system DuerOS and the Apollo autonomous-driving platform; Alibaba is laying out a complete AI ecosystem, investing furiously in AI startups while building up strength in intelligent cloud, AI chips, and other technologies; Tencent, a later starter, refuses to be outdone, founding its AI Lab, rounding up large numbers of AI experts, and actively turning speech recognition, face recognition, and other technologies into internal products...

Google, which has lately stirred up the domestic search market, also threaded AI through last month's developer conference in Shanghai, from Android to smart wearables, from TensorFlow to AR applications, here building the underlying ecosystem, there leading the technology trend.

CSDN's survey shows that although AI adoption in China is still on the low side, the growth potential is enormous; only 25% of developers say nobody around them has used it at all.

A questionnaire recently released by Liepin's Big Data Research Institute finds that education requirements for core AI functions have risen markedly; AI talent is concentrated in the three first-tier cities of Beijing, Shanghai, and Shenzhen; and by industry it sits mainly in the internet sector while gradually seeping into others. Among the top 10 core functions, AI software engineer and algorithm engineer lead by a wide margin as the tightest roles.

Furthermore, according to "Top AI Trends To Watch In 2018," recently released by the well-known US research firm CB Insights after a deep dive into the state of the AI industry, AI pay levels have clearly overtaken roles such as front-end, back-end, and mobile development.

And a PwC report finds that as AI expands into ever more specific domains, it will require domain expertise and skills that data scientists and AI specialists usually lack. For AI developers, a more all-round technical arsenal will be indispensable in the future.

Blockchain developers' enthusiasm still runs high

 

The blockchain market's wild ups and downs in recent years have also brought blockchain applications before the public eye.

According to a Morgan Stanley research report, "the speed of Bitcoin's price rise is roughly 15 times that of the Nasdaq Composite." Bitcoin's 2017-2018 price path looks a lot like the Nasdaq Composite's during the dot-com bubble of the late 1990s, only far faster; Morgan Stanley's analysts believe this "portends that Nasdaq's history is repeating itself."

Yet beneath all the cries of "bubble," developers' enthusiasm for learning blockchain technology remains high.

CodeMentor's "State of Blockchain Development" survey shows that although 46% of respondents said they had no plans to learn this new technology in the short term (the next three months), as many as nine in ten developers planned to start learning blockchain in the coming months.

On pay, BOSS Zhipin's data shows that in the first quarter of 2018 the average advertised salary for blockchain roles grew 31%, beating every other role. "But the blockchain talent pool is too small, and poaching is hard. Poaching one blockchain person takes 200% effort." Companies have tried everything to recruit, yet the vast majority of practitioners fall short: to become a blockchain technical elite you must understand not only computers and programming languages but also economics and game theory at depth. The severe talent shortage may itself be one big trigger for the blockchain market's bubble.

Beyond that, blockchain applications are still relatively few. CSDN's survey shows that people who are using or preparing to use blockchain to solve technical problems make up only 10% of respondents, while 20% do not understand blockchain at all. Lack of development experience, of technical materials, and of landed applications and scenarios are the main challenges facing blockchain development, cited by 56%, 54%, and 50% of respondents respectively.

For outstanding IoT talent, supply falls far short of demand

 

From smart homes to medical monitoring, from wearables to energy supply, IoT has become an inseparable part of our lives, and tech giants at home and abroad are competing to stake out positions in it.

Early this year Alibaba stated that IoT is a new main track for the group, after e-commerce, finance, logistics, and cloud computing, and set the goal of reaching 10 billion connected devices within five years; Baidu launched the Baidu Cloud Tiangong intelligent IoT platform; Huawei is pushing NB-IoT standardization and has released the LiteOS IoT operating system and end-to-end NB-IoT solutions; Tencent rolled out the "QQ IoT Smart Hardware Open Platform," opening core capabilities such as the QQ account system, relationship chains, and QQ messaging channels to partners in wearables, smart homes, smart vehicles, and traditional hardware, to realize interconnection between users and devices and among devices themselves...

Yet according to the Eclipse IoT Developer Survey 2018, the growth rate of enterprises developing IoT solutions is only 5.8%. Slow as that growth is, it also reveals that IoT companies are shaking off pure theory and putting more of it into practice.

Here the sheer difficulty of building IoT has to be mentioned. In IoT, networking, human-machine interaction, data, and security features are all severely fragmented, so it is not purely software development; it also requires skills such as embedded hardware. Against this backdrop IoT developers are naturally in hot demand: on one well-known domestic recruiting platform alone, the average salary for IoT engineers reaches 15K yuan a month, and there are more than 14,000 job postings across the site.

Moreover, as an emerging strategic industry championed by the state, IoT is taken seriously in every quarter and has become a hot field with broad employment prospects. Since 2011 universities all over the country have set up IoT majors, with courses such as Introduction to IoT Engineering, Embedded Systems and Microcontrollers, Wireless Sensor Networks and RFID, IoT Technology and Applications, Cloud Computing and IoT, IoT Security, IoT Architecture and Integrated Practice, Signals and Systems, and Modern Sensor Technology, along with many electives.

For IoT developers themselves, the advice is to find the right angle on IoT while studying and go deep; mastering the knowledge and hands-on project skills is the top priority.

What do our developers really look like?

 

Code changes the world, and the technical world developers create is bringing revolutionary change to our lives. The portraits of developers in the five fields above are only a snapshot of an era of technological change; amid today's rapid development, what trends will the portrait of our developers show next?

Since 2004, CSDN has carried out in-depth surveys of developers, development technologies, tools, and platforms and their trends, providing important reference material on China's software developer community and the software development services market for the industries concerned. To date, tens of thousands of developers have taken part, together painting an authentic portrait of Chinese developers.

And now the 2018 CSDN software developer survey has officially launched! As a member of the developer community, you are sincerely invited to join our survey.

Scan the QR code below to participate:

[QR code]

We have also prepared fine gifts for you: a Huawei nova 3 smartphone, Xiao AI smart speakers, CSDN backpacks, custom CSDN T-shirts, and hundreds of technical books are waiting! Every participant has a chance to win, so what are you waiting for? Come give it a try!

You can also take part right away by clicking "Read the original" below or copying the official link (https://www.csdn.net/2018dev/) into your browser.

 

Author: csdnnews. Published 2018/10/08 22:17:44. Original link: https://blog.csdn.net/csdnnews/article/details/82975989

          Cloud Engineer      Cache   Translate Page      
VA-Vienna, CLOUD ENGINEER - CONTRACT- VIENNA, VA The end client is unable to sponsor or transfer visas for this position; all parties authorized to work in the US without sponsorship are encouraged to apply. Cloud Engineer Skills & Requirements: * Must have: 3+ years' working experience with Azure IaaS and PaaS * 5+ years of software engineering and development, including requirement definition, software dev
          Software Developer - Jobline Resources Pte Ltd - Paya Lebar      Cache   Translate Page      
Familiar with cloud environment – AWS, Azure, Google Cloud, principals of, and execution in, DevOps. The development environment consists of AWS Cloud and...
From Jobline Resources Pte Ltd - Wed, 05 Sep 2018 09:35:06 GMT - View all Paya Lebar jobs
          DevOps Engineer - Ethos BC Global - Singapore      Cache   Translate Page      
Public clouds such as AWS, Google Cloud or Azure. Financial and emerging markets....
From Ethos BC Asia - Tue, 09 Oct 2018 16:25:17 GMT - View all Singapore jobs
          Evaluation and Management of First-Trimester Bleeding      Cache   Translate Page      
No abstract available.
          Postterm Pregnancy Part II: Prevention and Management      Cache   Translate Page      
No abstract available.
          Diagnosis and Management of Endocrine-Active and Nonepithelial Ovarian Tumors      Cache   Translate Page      
No abstract available.
          Laparoscopic Myomectomy      Cache   Translate Page      
No abstract available.
          Inherited Thrombophilia in Pregnancy      Cache   Translate Page      
Learning Objectives: After reading this issue, the participant should be able to: 1. Identify the most common thrombophilias and describe their relationship to adverse pregnancy outcomes, including venous thromboembolism, intrauterine growth retardation, placental abruption, preeclampsia, stillbirth, and miscarriage. 2. Describe the treatment of thrombophilia in pregnancy. 3. Explain the indications for testing for thrombophilia.
          Ultrasound Diagnosis: Normal and Abnormal Early Pregnancy      Cache   Translate Page      
No abstract available.
          Thyroid Disease in Pregnancy      Cache   Translate Page      
No abstract available.
          Magnesium Sulfate Neuroprotection: Time to Start?      Cache   Translate Page      
No abstract available.
          Cesarean Section: Is There a Right Technique?      Cache   Translate Page      
No abstract available.
          Diagnosis and Management of Septic Abortion      Cache   Translate Page      
No abstract available.
          Life Expectancy Following Rehabilitation: A NIDRR Traumatic Brain Injury Model Systems Study      Cache   Translate Page      
Objective: To characterize overall and cause-specific mortality and life expectancy among persons who have completed inpatient traumatic brain injury rehabilitation and to assess risk factors for mortality. Design: Prospective cohort study. Setting: The Traumatic Brain Injury Model Systems. Participants: A total of 8573 individuals injured between 1988 and 2009, with survival status determined as of December 31, 2009. Interventions: Not applicable. Main Outcome Measures: Standardized mortality ratio (SMR), life expectancy, cause of death. Results: SMR was 2.25 overall and was significantly elevated for all age groups, both sexes, all race/ethnic groups (except Native Americans), and all injury severity groups. SMR decreased as survival time increased but remained elevated even after 10 years postinjury. SMR was elevated for all cause-of-death categories but especially so for seizures, aspiration pneumonia, sepsis, accidental poisonings, and falls. Life expectancy was shortened an average of 6.7 years. Multivariate Cox regression showed age at injury, sex, race/ethnic group, marital status and employment status at the time of injury, year of injury, preinjury drug use, days unconscious, functional independence and disability on rehabilitation discharge, and comorbid spinal cord injury to be independent risk factors for death. Conclusion: There is an increased risk of death after moderate or severe traumatic brain injury. Risk factors and causes of death have been identified that may be amenable to intervention.
          Internet of Things: The pain points in IoT projects      Cache   Translate Page      
IoT projects often take longer than expected. Providers such as AWS and Microsoft Azure do not always understand where their customers' sore spots lie, according to the consultancy Bain. The consultants offer three tips.
          Parga - Greece      Cache   Translate Page      
Parga Greece, the beautiful "Bride of Epirus"
Parga is a coastal town built amphitheatrically. It was originally built on top of the mountain "Pezovolo" and runs down to the coast of the Ionian Sea.
Parga is a picturesque resort situated between the coastal regions of Preveza and Igoumenitsa and uniquely combines mountain and sea.
The pretty coastal town is wrapped in pine-clad mountains. Behind the harbour lies a picture-postcard jumble of terracotta-roofed houses.
The town blends Eastern and Western influences, which can be seen in the wonderful architecture, such as the Venetian-built castle of Parga, which overlooks the town.
It is a resort town known for its scenic beauty. Lygia & Gregoris Maliotis enjoy incredible views of the city from above!
Beautiful Parga charms with its long history, its diverse natural beauty, and the hospitality of its inhabitants.

The idyllic town of Parga, surrounded by mountains and picturesque olive groves, is a perfect holiday spot for Popi & Phivos Nicolaides.
The colorful houses of Parga are built amphitheatrically along the slopes of a mountain, offering a fine view of the sea.
Lygia & Gregoris Maliotis are happy to be in Parga, a touristy little town of a little more than 2,500 permanent residents.
A variety of colourful boats cross the sea, creating amazing scenery.

One of the most picturesque and cosmopolitan places in northwestern Greece, the "Bride of Epirus," the beautiful Parga.
Parga has been one of the most famous tourist destinations in Greece since the end of the '50s.
On top of the hill above the port are the ruins of an old Venetian castle. Great views for Popi Nicolaides.
Parga is a lovely town with a vivid island style. Built along the slopes of a hill, it is surrounded by lush greenery and blue sea.
Lygia & Gregoris Maliotis stroll around the port and enjoy its picturesque atmosphere.

Although it was a short visit, it was relaxing and refreshing, and we really enjoyed our stay in Parga.
The picturesque island of Panagia off the coast of Parga.
Almost one hundred small churches and chapels are scattered around the town and the hills surrounding Parga.
Lygia & Gregoris Maliotis enjoy the stunning scenery with its pretty sea, hills, and great coastline.
Parga is on the northwest mainland of Greece and is beloved for its gorgeous beaches and laid-back atmosphere.
In Parga Town, the harbour ripples with life during the day, as little boats ferry in and out.

Discovering the ancient and modern Greek world. Living your myth in Greece...
Parga is the most popular summer destination in Epirus. This small town sits in a secluded bay of the Ionian Sea and has an intensely island feeling.

Popi & Phivos Nicolaides enjoy magnificent scenery and stunning views and try to discover Mother Nature's secret in Parga!
Across the bay is the small island of the Virgin Mary, a green islet with a whitewashed chapel standing at the entrance of the port.

Pleasant walks full of surprises in the beautiful narrow streets of the old town!
What makes Parga special is undoubtedly its gorgeous nature. Great pictures and fun for Lygia & Gregoris Maliotis!

If you dive into the labyrinth of streets behind the harbour, you'll stumble on jewellery makers who are known throughout Greece.
Lygia & Gregoris Maliotis pose smiling, delighted by the beauty of the old town.
Take time to walk up to the castle; you can get some brilliant views of the harbour and the little island from there.

There are many cosy shops of interest along the narrow alley leading from the town center up the hill towards the fortress.
Strong history, rich culture, beautiful landscapes, amazing natural beauty, a unique atmosphere, all in one place.
Lygia & Gregoris Maliotis in Parga, the small fishing village in the north of Greece which gradually became a famous summer resort.

A big part of Parga Town's appeal is its beaches. There's Kryoneri, the busy town beach that faces Panagia Island.
It is impossible not to be fascinated by the charm of the city and its surroundings!
Krioneri Beach is the main beach of Parga and lies within the boundaries of the community, a short walk from the center and the waterfront.
Parga attracts thousands of tourists every summer thanks to natural attractions such as its beaches. The most popular beaches are: Valtos, Kryoneri, Piso Kryoneri, Lichnos, Sarakiniko and Ai Giannaki.

Many beaches, bays and small islands lie along the long and beautiful coast of Parga.
Valtos Beach is one of the longest beaches of Parga, with a coastline that approaches 3 km. It is located just under the castle of Parga.

The town is blessed with endless beauty: think sparkling clear azure waters with soft golden sand, all captivating in the Greek sunshine!
The combination of the blue of the Ionian Sea with the green is unique. Parga's beaches are among the best in Greece, with cool crystal waters.
At night the waterfront cranks things up a notch, as music bars, tavernas and coffee shops fill up with a cosmopolitan mix of holidaymakers.
Since ancient times, garlic has been used, among other things, against evil demons as well as against the evil eye, "To Mati". Hence the garlic on the flagpole!
Popi and Phivos Nicolaides and Evgenia and Petros Alexandridis, looking excited by their visit to amazing Parga!



          Microsoft unveils Project xCloud streaming platform      Cache   Translate Page      
Installs Xbox One hardware into Azure DCs.

          The Impact of Advanced Practice Nurses' Shift Length and Fatigue on Patient Safety: Position Statement #3057      Cache   Translate Page      
No abstract available.
          Comment on URL Authorization Rules by Oded Dvoskin      Cache   Translate Page      
Please repost your question on the MSDN App Service forum so our engineers can take a look and help: https://social.msdn.microsoft.com/Forums/en-US/home?forum=windowsazurewebsitespreview
          Data Developer - HARMAN CONNECTED SERVICES ENGINEERING CORP. - Redmond, WA      Cache   Translate Page      
Build/manage ETL using SSIS, Azure Data Factory and Azure Logic Apps: Including automated monitoring & alert Manage an SQL Azure Database: Admin, authenticate ...
From Dice - Tue, 25 Sep 2018 08:04:03 GMT - View all Redmond, WA jobs
          Microsoft Security Update Minor Revisions      Cache   Translate Page      

Posted by Microsoft on Oct 09

********************************************************************
Title: Microsoft Security Update Minor Revisions
Issued: October 9, 2018
********************************************************************

Summary
=======

The following CVE has undergone a minor revision increment:

* CVE-2018-8531

Revision Information:
=====================

- CVE-2018-8531 | Azure IoT Device Client SDK Memory Corruption Vulnerability
-...

          Azure Cloud Architect (Azure / GCP)      Cache   Translate Page      
AR-Bentonville, Azure Cloud Architect (Azure / GCP) Bentonville, AR Mandatory Skills IT architecture, infrastructure, and cloud development (Azure) Engineering and software architecture design - Stronger on Java tech stacks but well exposed to other stacks Business analysis DevOps Project and product management Excellent communication skills Deep analytical skills Project and resource management skills Quick Lear
          Subject Matter Expert (Palo Alto, Meraki, SolarWinds)      Cache   Translate Page      
India - ) Technical Competency DC Experience- Cisco Nexus 7k, Cisco Nexus 5k, ASR. Firewall - Juniper, Palo Alto Infoblox VPN Azure SolarWinds...
          Shorty Smalls      Cache   Translate Page      
$9.95

          Accessing Public Folder data from a Microsoft Teams Tab application using EWS      Cache   Translate Page      
For a long time now, Public Folders have been a good way to share information and hold threaded conversations across an organization. However, time and technology have surpassed them somewhat of late: if you're using Office365, there was first the introduction of Unified Groups, and now Microsoft Teams has come to maturity, both of which offer a better user and technical solution to the needs Public Folders fulfil. The benefit of these newer technologies is that they let you do more in the same context without giving up the existing features Public Folders may have been giving you (outside of access in Outlook). But where the rubber meets the road it's not always easy to migrate from one solution to another, so in this post I'm going to look at how you can access data (email etc.) in an Exchange Public Folder from within a Teams Tab application. This type of approach may give you some flexibility and options while you're dealing with a migration between the two, or just during that tricky cutover period where you don't have everyone on board at once.


What it looks like


The text box is a simple search bar that takes a KQL query, so by default it will just return everything, but if you enter a query and hit the refresh button you will get filtered results, e.g.





Technical Challenges  

Over the past couple of months I've posted a few Teams tab applications that expose Exchange data using the Graph API as the interface into the Exchange Store. Public Folders aren't yet accessible via the Graph API, so to access them you need to use Exchange Web Services. That isn't too difficult in itself, but because EWS doesn't support CORS, you can't write JavaScript code that runs in the client browser and accesses EWS directly.

Cross-Origin Resource Sharing (CORS) is a mechanism that uses additional HTTP headers to tell a browser to let a web application running at one origin (domain) access selected resources from a server at a different origin. In the context of a Teams tab application that wants to access Public Folders via EWS, the source domain is where the Teams tab application pages are hosted and the target is EWS at outlook.office365.com. Because the server where EWS is hosted doesn't support CORS, the browser blocks the request: the correct headers aren't returned when the request is pre-flighted (the HTTP OPTIONS request).

To mitigate CORS you can use a proxy like the node.js server https://www.npmjs.com/package/cors-anywhere; if you're writing your client-side code in something like Angular, you can proxy your requests via this. Another way, which is the one I've chosen, is to write all the EWS logic and EWS calls in node.js and then create a simple REST API on the Node server that can be called from the client-side code running in Teams. This has a few benefits, as there are multiple requests needed initially to find the Public Folder and the correct routing header. With this method you can run all of these from a cloud-hosted server that should have lower latency than the client, which should mean better performance, and the service itself can be reused from other applications.

Node.js app

The node.js server is a relatively straightforward app that uses Express and has one route, which becomes the REST endpoint that the Teams Tab application will call. I've used a JavaScript port of the EWS Managed API, https://github.com/gautamsi/ews-javascript-api, which I found reasonably easy to use; it made dealing with the EWS side of the code a lot faster because I could port over existing C# code. One of the least fun things when dealing with Public Folders and EWS is working out the correct routing headers, https://docs.microsoft.com/en-us/exchange/client-developer/exchange-web-services/public-folder-access-with-ews-in-exchange, but I have some code that handles this and also finds the Public Folder from a path using multiple shallow traversals. To allow paging past 1,000 objects (I made the default page size about 100 to maintain reasonable performance), the REST endpoint supports being called again with an EWS FolderId, so as to avoid the discovery process a second time. Other than that, the EWS code is pretty run-of-the-mill FindItems, FindFolders, and Bind calls.
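For a sense of that discovery-then-query shape, here is a minimal C# sketch using the EWS Managed API, the library the node.js port mirrors. The folder path, anchor mailbox, and token are placeholder assumptions, and real public folder routing involves discovering the public folder mailbox rather than hard-coding a header.

    using System;
    using Microsoft.Exchange.WebServices.Data; // EWS Managed API NuGet package

    class PublicFolderSearch
    {
        static void Main()
        {
            var service = new ExchangeService(ExchangeVersion.Exchange2013_SP1)
            {
                Url = new Uri("https://outlook.office365.com/EWS/Exchange.asmx"),
                Credentials = new OAuthCredentials("<access token>") // placeholder token
            };

            // Public folder requests need routing headers so they land on the
            // right public folder mailbox; a fixed anchor is a simplification here
            service.HttpHeaders.Add("X-AnchorMailbox", "user@contoso.com");

            // Walk the folder path one segment at a time with shallow traversals
            FolderId parent = WellKnownFolderName.PublicFoldersRoot;
            foreach (var segment in "Sales/Customer Mail".Split('/')) // hypothetical path
            {
                var results = service.FindFolders(parent,
                    new SearchFilter.IsEqualTo(FolderSchema.DisplayName, segment),
                    new FolderView(2) { Traversal = FolderTraversal.Shallow });
                if (results.TotalCount == 0)
                    throw new Exception($"Folder '{segment}' not found");
                parent = results.Folders[0].Id;
            }

            // Query items in the resolved folder, newest first
            var view = new ItemView(100);
            view.OrderBy.Add(ItemSchema.DateTimeReceived, SortDirection.Descending);
            foreach (Item item in service.FindItems(parent, "from:fred", view))
                Console.WriteLine($"{item.DateTimeReceived} {item.Subject}");
        }
    }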

Teams Tab application

Like the other Teams tab applications I've written, this uses the Teams Tab silent authentication method, but instead of getting a token for the Graph endpoint it gets one for the EWS endpoint outlook.office365.com. For displaying the email data I've used the Tabulator JavaScript library, which has a few nice features I like. Each of the emails is clickable and will open the item you click in a new tab in OWA, e.g.