
The Most Popular Programming Languages: Python Holds Its Lead over C++ and C
Python has defended its top position in the IEEE ranking of the most popular programming languages, followed by C++ and C, while R continues to lose ground. In the TIOBE index, Python is on the verge of breaking into the top 3 for the first time. The competing ranking of the most popular programming languages from the Institute of Electrical and Electronics Engineers (IEEE) has seen Python since last […]
Google Releases a TensorFlow-Based Vision Recognition Kit for the Raspberry Pi Zero W

Google has released the $45 "AIY Vision Kit", a TensorFlow-based vision recognition development kit that runs on the Raspberry Pi Zero W and uses a "VisionBonnet" board with a Movidius chip.

Designed to accelerate neural networks on the device itself, Google's AIY Vision Kit follows the AIY Projects voice/AI kit for the Raspberry Pi that shipped with the May issue of The MagPi magazine. Like the voice kit and the older Google Cardboard VR viewer, the new AIY Vision Kit comes in a cardboard enclosure. Unlike the Cloud Vision API, which was demonstrated in 2015 on a Raspberry Pi based GoPiGo robot, it runs entirely on local processing power and does not require a cloud connection. The AIY Vision Kit is available for pre-order now for $45 and is expected to ship in December.

 

The AIY Vision Kit, fully assembled (left), and the Raspberry Pi Zero W

Aside from the required Raspberry Pi Zero W board itself -- a 1GHz, ARM11-based Broadcom BCM2835 SoC -- the kit's main processing component is Google's new VisionBonnet RPi accessory board. The VisionBonnet pHAT board uses a Movidius MA2450, a version of the Movidius Myriad 2 VPU. On the VisionBonnet, the processor runs Google's open source TensorFlow machine learning library for neural networks, and thanks to this chip, visual processing runs at up to 30 frames per second.

The AIY Vision Kit requires the user to supply a Raspberry Pi Zero W, a Raspberry Pi Camera v2, and a 16GB microSD card for downloading the Linux-based OS image. The kit includes the VisionBonnet, an RGB arcade-style button, a piezo speaker, a wide-angle lens kit, and the cardboard enclosure that holds it all together, plus cables, standoffs, mounting nuts and connecting components.

 

AIY Vision Kit components (left) and the VisionBonnet accessory board

Three neural network models are available: a general-purpose model that recognizes 1,000 common objects, a face-detection model that scores "joy" on a scale from "sad" to "laughing", and a model that can tell whether an image shows a dog, a cat or a human. The 1,000-object model derives from Google's open source MobileNets, a family of TensorFlow-based computer vision models designed for resource-constrained mobile and embedded devices.
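As a point of reference for what that 1,000-class MobileNet model does, here is a minimal desktop sketch of classifying an image with a pretrained MobileNet in TensorFlow/Keras. This is my own illustration, not the AIY kit's on-device pipeline, and the file name cat.jpg is a placeholder:

# Classify one image with a pretrained MobileNet (1,000 ImageNet classes).
# Illustration only -- the AIY kit runs its models on the VisionBonnet instead.
import numpy as np
from tensorflow.keras.applications.mobilenet import (
    MobileNet, decode_predictions, preprocess_input)
from tensorflow.keras.preprocessing import image

model = MobileNet(weights="imagenet")

img = image.load_img("cat.jpg", target_size=(224, 224))   # placeholder file
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))

for _, label, score in decode_predictions(model.predict(x), top=3)[0]:
    print(label, round(float(score), 3))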

The MobileNet models are low-latency, low-power and parameterized to meet the resource constraints of different use cases. According to Google, the models can be used to build classification, detection, embedding and segmentation applications. Earlier this month, Google released a developer preview of TensorFlow Lite, a library for Android and iOS mobile devices that is compatible with MobileNets and the Android Neural Networks API.

AIY Vision Kit packaging

In addition to providing the three models, the AIY Vision Kit offers basic TensorFlow code and a compiler so users can develop their own models. Python developers can also write new software to customize the RGB button colors, the piezo element sounds, and the four GPIO pins on the VisionBonnet, which can drive additional lights, buttons or servos. Potential projects include identifying food, opening a dog door based on visual input, sending a text message when your car leaves the driveway, or playing particular music based on a person's facial expression.

 

Myriad 2 VPU block diagram (left) and reference board

The Movidius Myriad 2 processor delivers roughly a teraflop of performance within a nominal 1 Watt power envelope. Before Movidius was acquired by Intel, the chip first appeared on the Project Tango reference platform and was built into the Ubuntu-driven, USB-based Fathom neural compute stick that Movidius debuted in May 2016. According to Movidius, the Myriad 2 is already found in "millions of devices on the market".

More information

The AIY Vision Kit can be pre-ordered from Micro Center for $44.99 and is expected to ship in early December (2017). More information can be found in the AIY Vision Kit announcement on the Google Blog and on the Micro Center shopping page.


via: http://linuxgizmos.com/google-launches-tensorflow-based-vision-recognition-kit-for-rpi-zero-w/

Author: Eric Brown  Translator: qhwdw  Proofreader: wxy

This article was originally translated by the LCTT project and is proudly presented by Linux China (Linux中国).


A Set of Useful Utilities for Debian and Ubuntu Users

Are you using a Debian-based system? If so, great! I'm here today with good news for you. Say hello to "Debian-goodies", a collection of useful utilities for Debian-based systems such as Ubuntu and Linux Mint. These utilities provide some additional handy commands that are not available by default on Debian-based systems. With these tools you can find which programs consume the most disk space, which services need to be restarted after a system update, search a package for files matching a pattern, list installed packages by a search string, and more. In this brief guide, we will discuss some of these useful Debian goodies.

Debian-goodies – Utilities for Debian and Ubuntu Users

The debian-goodies package is available in the official repositories of Debian as well as Ubuntu and other Ubuntu variants such as Linux Mint. To install debian-goodies, simply run:

$ sudo apt-get install debian-goodies

Once debian-goodies is installed, let's take a look at some of its useful utilities.

1. checkrestart

Let me start with my favourite, the checkrestart utility. When you install certain security updates, some running applications may keep using the old libraries. To apply the security updates thoroughly, you need to find those processes and restart the corresponding services. This is where checkrestart comes in handy: it finds the processes that are still using old versions of upgraded files so that you can restart their services.

To check which daemons should be restarted after a library upgrade, run:

$ sudo checkrestart
[sudo] password for sk:
Found 0 processes using old versions of upgraded files

Since I haven't applied any security updates recently, nothing is reported.
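The core idea behind checkrestart (and needrestart, described next) can be illustrated in a few lines of Python: a process that still maps a shared library whose file on disk has been replaced shows up in /proc/<pid>/maps with a "(deleted)" marker. This is only a rough sketch of the heuristic, not the actual implementation of either tool, and it needs root to see all processes:

#!/usr/bin/env python3
# Rough sketch: list processes that still map deleted shared libraries,
# which is the heuristic tools like checkrestart/needrestart build on.
import glob

def processes_using_deleted_libs():
    hits = {}
    for maps_path in glob.glob("/proc/[0-9]*/maps"):
        pid = maps_path.split("/")[2]
        try:
            with open(maps_path) as maps:
                libs = {line.split(None, 5)[5].strip()
                        for line in maps
                        if ".so" in line and line.rstrip().endswith("(deleted)")}
        except OSError:
            continue  # process exited, or not enough permissions
        if libs:
            hits[pid] = libs
    return hits

if __name__ == "__main__":
    for pid, libs in sorted(processes_using_deleted_libs().items()):
        print(pid, "->", ", ".join(sorted(libs)))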

Note that the checkrestart utility works well. However, there is a newer, similar tool called needrestart available for recent Debian systems. needrestart is inspired by checkrestart and does the same job; it is actively maintained and supports newer technologies such as containers (LXC, Docker).

Here are the features of needrestart:

  • Supports (but does not require) systemd
  • Binary blacklisting (e.g., display managers for graphical displays)
  • Tries to detect pending kernel upgrades
  • Tries to detect required restarts of interpreter-based daemons (supports Perl, Python, Ruby)
  • Fully integrated into apt/dpkg using hooks

It is also available in the default repositories, so you can install it with:

$ sudo apt-get install needrestart

Now you can check the list of daemons that need to be restarted after a system update with the following command:

$ sudo needrestart
Scanning processes...
Scanning linux images...

Running kernel seems to be up-to-date.

Failed to check for processor microcode upgrades.

No services need to be restarted.

No containers need to be restarted.

No user sessions are running outdated binaries.

The good news is that needrestart also works on other Linux distributions. For example, on Arch Linux and its derivatives you can install it from the AUR with any AUR helper, like this:

$ yaourt -S needrestart

On Fedora:

$ sudo dnf install needrestart

2. check-enhancements

The check-enhancements utility finds packages that enhance installed packages. It lists packages that enhance other packages but are not strictly required to run them. You can look up enhancements for a single package, or for all installed packages using the -ip or --installed-packages option.

For example, I am going to list the packages that enhance the gimp package:

$ check-enhancements gimp
gimp => gimp-data: Installed: (none) Candidate: 2.8.22-1
gimp => gimp-gmic: Installed: (none) Candidate: 1.7.9+zart-4build3
gimp => gimp-gutenprint: Installed: (none) Candidate: 5.2.13-2
gimp => gimp-help-ca: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-help-de: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-help-el: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-help-en: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-help-es: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-help-fr: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-help-it: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-help-ja: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-help-ko: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-help-nl: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-help-nn: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-help-pt: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-help-ru: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-help-sl: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-help-sv: Installed: (none) Candidate: 2.8.2-0.1
gimp => gimp-plugin-registry: Installed: (none) Candidate: 7.20140602ubuntu3
gimp => xcftools: Installed: (none) Candidate: 1.0.7-6

To list the packages that enhance all installed packages, run:

$ check-enhancements -ip
autoconf => autoconf-archive: Installed: (none) Candidate: 20170928-2
btrfs-progs => snapper: Installed: (none) Candidate: 0.5.4-3
ca-certificates => ca-cacert: Installed: (none) Candidate: 2011.0523-2
cryptsetup => mandos-client: Installed: (none) Candidate: 1.7.19-1
dpkg => debsig-verify: Installed: (none) Candidate: 0.18
[...]

3. dgrep

As the name implies, dgrep searches all files in a specified package for the given regular expression. For example, I will search the vim package for files containing the regular expression "text":

$ sudo dgrep "text" vim
Binary file /usr/bin/vim.tiny matches
/usr/share/doc/vim-tiny/copyright: that they must include this license text. You can also distribute
/usr/share/doc/vim-tiny/copyright: include this license text. You are also allowed to include executables
/usr/share/doc/vim-tiny/copyright: 1) This license text must be included unmodified.
/usr/share/doc/vim-tiny/copyright: text under a) applies to those changes.
/usr/share/doc/vim-tiny/copyright: context diff. You can choose what license to use for new code you
/usr/share/doc/vim-tiny/copyright: context diff will do. The e-mail address to be used is
/usr/share/doc/vim-tiny/copyright: On Debian systems, the complete text of the GPL version 2 license can be
[...]

dgrep supports most of grep's options. Refer to the following guide to learn more about the grep command.
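Under the hood, dgrep essentially greps through the list of files that dpkg records for a package. A rough Python equivalent for illustration, assuming dpkg is available (this is not dgrep's actual code):

#!/usr/bin/env python3
# Rough equivalent of "dgrep PATTERN PACKAGE": grep a regular expression
# across the files installed by a package, according to dpkg's file list.
import os
import re
import subprocess
import sys

def dgrep(pattern, package):
    regex = re.compile(pattern)
    files = subprocess.run(["dpkg", "-L", package], capture_output=True,
                           text=True, check=True).stdout.splitlines()
    for path in files:
        if not os.path.isfile(path):
            continue
        try:
            with open(path, errors="replace") as fh:
                for line in fh:
                    if regex.search(line):
                        print("%s: %s" % (path, line.rstrip()))
        except OSError:
            continue

if __name__ == "__main__":
    dgrep(sys.argv[1], sys.argv[2])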

4. dglob

The dglob utility generates a list of package names that match the given pattern. For example, let's find the packages that match the string "vim":

$ sudo dglob vim
vim-tiny:amd64
vim:amd64
vim-common:all
vim-runtime:all

By default, dglob displays only installed packages. If you want to list all packages, installed or not, use the -a flag:

$ sudo dglob vim -a

5. debget

The debget utility downloads a package's .deb file from APT's database. Note that it only downloads the given package, not its dependencies:

$ debget nano
Get:1 http://in.archive.ubuntu.com/ubuntu bionic/main amd64 nano amd64 2.9.3-2 [231 kB]
Fetched 231 kB in 2s (113 kB/s)

6. dpigs

This is another handy utility in this collection. dpigs finds and displays the installed packages that occupy the most disk space:

$ dpigs
260644 linux-firmware
167195 linux-modules-extra-4.15.0-20-generic
75186 linux-headers-4.15.0-20
64217 linux-modules-4.15.0-20-generic
55620 snapd
31376 git
31070 libicu60
28420 vim-runtime
25971 gcc-7
24349 g++-7

As you can see, the linux-firmware package occupies the most disk space. By default, the top 10 space-consuming packages are shown. To display more packages, for example 20, run the following command:

$ dpigs -n 20
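dpigs essentially ranks packages by the Installed-Size field that dpkg tracks (reported in KiB). A rough Python approximation of the default listing, assuming dpkg-query is present; this is an illustration, not dpigs' actual implementation:

#!/usr/bin/env python3
# Rough approximation of "dpigs": rank installed packages by dpkg's
# Installed-Size field (KiB) and print the largest ones.
import subprocess

def largest_packages(count=10):
    out = subprocess.run(
        ["dpkg-query", "-Wf", "${Installed-Size}\t${Package}\n"],
        capture_output=True, text=True, check=True).stdout
    sizes = []
    for line in out.splitlines():
        size, name = line.split("\t")
        if size.strip().isdigit():
            sizes.append((int(size), name))
    return sorted(sizes, reverse=True)[:count]

if __name__ == "__main__":
    for size_kib, name in largest_packages():
        print(size_kib, name)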

7. debman

The debman utility lets you easily view man pages from a binary .deb file without extracting it, and you don't even need to install the .deb package. The following command displays the man page from the nano package:

$ debman -f nano_2.9.3-2_amd64.deb nano

If you don't have a local copy of the .deb package, use the -p flag to download it and view its man pages:

$ debman -p nano nano


8. debmany

An installed Debian package contains not only man pages but also other files such as changelogs, copyright notices and README files. The debmany utility lets you view and read those files:

$ debmany vim

Use the arrow keys to select the file you want to view and press Enter to open it. Press q to return to the main menu.

If the specified package is not installed, debmany downloads it from the APT database and then displays the man pages. The dialog package must be installed to read the man pages.

9. popbugs

If you are a developer, the popbugs utility will be very useful. It displays a customized list of release-critical bugs based on the packages you use (using popularity-contest data). For those unfamiliar with it, the popularity-contest package sets up a cron job that periodically and anonymously submits statistics about the most used Debian packages on the system to the Debian developers. This information helps Debian decide, for example, which packages should go on the first CD, and it helps improve future releases so that the most popular packages are installed automatically for new users.

To generate a list of critical bugs and display the result in your default web browser, run:

$ popbugs

You can also save the result to a file, as shown below:

$ popbugs --output=bugs.txt

10. which-pkg-broke

This command shows all the dependencies of the given package and when each dependency was installed. Using this information, you can easily find out which package may have broken another, and when, after a system or package upgrade:

$ which-pkg-broke vim
Package <debconf-2.0> has no install time info
debconf Wed Apr 25 08:08:40 2018
gcc-8-base:amd64 Wed Apr 25 08:08:41 2018
libacl1:amd64 Wed Apr 25 08:08:41 2018
libattr1:amd64 Wed Apr 25 08:08:41 2018
dpkg Wed Apr 25 08:08:41 2018
libbz2-1.0:amd64 Wed Apr 25 08:08:41 2018
libc6:amd64 Wed Apr 25 08:08:42 2018
libgcc1:amd64 Wed Apr 25 08:08:42 2018
liblzma5:amd64 Wed Apr 25 08:08:42 2018
libdb5.3:amd64 Wed Apr 25 08:08:42 2018
[...]

11. dhomepage

The dhomepage utility opens the official website of the given package in your default web browser. For example, the following command opens the home page of the Vim editor:

$ dhomepage vim

And that's all for now. Debian-goodies is a must-have collection in your arsenal. Even if we don't use all of these utilities often, they are worth learning, and I am sure they will be really useful at times.

I hope this was helpful. More good stuff to come. Stay tuned!

Cheers!


via: https://www.ostechnix.com/debian-goodies-a-set-of-useful-utilities-for-debian-and-ubuntu-users/

Author: SK  Topic selection: lujun9972  Translator: MjSeven  Proofreader: wxy

This article was originally translated by the LCTT project and is proudly presented by Linux China (Linux中国).


A Guide to Python's Magic Methods (2012)
Comments
Clinical Informatics Data Analyst -SKH - OCI Inc. - Regina, SK
Experience/knowledge of SQL/TSQL, XML a scripting language (Bash / JavaScript / Python / Perl, etc.). Only professionals with *IM, IT, IS *and/or *ICT ...
From Indeed - Mon, 16 Jul 2018 19:47:15 GMT - View all Regina, SK jobs
Hands-On Data Analysis with NumPy and Pandas

eBook Details: Paperback: 168 pages Publisher: WOW! eBook (June 29, 2018) Language: English ISBN-10: 1789530792 ISBN-13: 978-1789530797 eBook Description: Hands-On Data Analysis with NumPy and Pandas: Get to grips with the most popular Python packages that make Data Analysis possible and implement Python packages from data manipulation to processing

The post Hands-On Data Analysis with NumPy and Pandas appeared first on WOW! eBook: Free eBooks Download.


Julia 1.0 Released, 2018 State of Rust Survey, Samsung Galaxy Note 9 Launches Today, Margaret Dawson of Red Hat Named Business Role Model of the Year in Women in IT Awards and Creative Commons Awarded $800,000 from Arcadia

News briefs for August 9, 2018.

Julia 1.0 made its debut yesterday—the "culmination of nearly a decade of work to build a language for greedy programmers". The language's goal: "We want a language that's open source, with a liberal license. We want the speed of C with the dynamism of Ruby. We want a language that's homoiconic, with true macros like Lisp, but with obvious, familiar mathematical notation like Matlab. We want something as usable for general programming as Python, as easy for statistics as R, as natural for string processing as Perl, as powerful for linear algebra as Matlab, as good at gluing programs together as the shell. Something that is dirt simple to learn, yet keeps the most serious hackers happy. We want it interactive and we want it compiled." You can download it here.

The Rust Community announced the 2018 State of Rust Survey, and they want your opinions to help them establish future development priorities. The survey should take 10–15 minutes to complete, and is available here. And, you can see last year's results here.

Samsung Galaxy Note 9 launches today at 11am ET. You can watch the spectacle via Android Central, which will be streaming the live event.

Margaret Dawson, Vice President, Portfolio Product Marketing at Red Hat, was named Business Role Model of the Year at the inaugural Women in IT Awards USA. The awards were organized by Information Age to "redress the gender imbalance by showcasing the achievements of women in the sector and identifying new role models".

Creative Commons was awarded $800,000 from Arcadia (a charitable fund of Lisbet Rausing and Peter Baldwin) to support CC Search, which is "a Creative Commons technology project designed to maximize discovery and use of openly licensed content in the Commons". CC Search, along with Commons Metadata Library and the Commons API, plans to form the Commons Collaborative Archive and Library, a suite of tools that will "make the global commons of openly licensed content more searchable, usable, and resilient, and to provide essential infrastructure for collaborative online communities".


How to fix "IOError: decoder jpeg not available"
I get stuck when I run this code in a Sage notebook:

from PIL import Image
img = Image.open("/home/pmath/Music/im1.jpg")
img2 = img.convert("L")
img2.save("/home/pmath/Music/secretimage.pgm")
img = Image.open("/home/pmath/Music/secretimage.pgm")
pix = img.load()
print pix

The following error occurs:

Traceback (click to the left of this block for traceback)
...
IOError: decoder jpeg not available

But when I run the same code with Python on the same OS (Ubuntu 32-bit 16.04) it works fine. The problem occurs only in Sage. How can I fix this issue?
Business Analyst
FL-Tampa, Tampa, Florida Skills : C+ and Python, Finance, Credit, Market and Operational Risk Management concepts Description : • Strong quantitative background with a degree in Engineering, Computer Science or Mathematics required. • Strong programming skills (acquired academically or through hands-on experience); preference for C+ and Python. • Strong analytical and problem solving skills • Capable of per
JAVA/Python Developer - CCIT Consulting - Reston, VA
* The candidate is expected to develop governance controls defined in EDL Data Governance Framework in AWS Cloud * 5+ years of programming experience,... $60 - $65 an hour
From Indeed - Sun, 29 Jul 2018 16:53:50 GMT - View all Reston, VA jobs
Natural Language programming with report writing
I need a python expert to do a natural language programming job for me to be submitted by Saturday night. I am going to attach the file. The rest of the details will be shared with the awarded freelancer via chat... (Budget: $30 - $250 USD, Jobs: Natural Language, Python)
Julia 1.0 Released

After 9 years of active development, Julia 1.0, a language for engineering and technical computing, has seen the light of day!

For many years the language's authors worked with languages such as R, Matlab, Fortran and Python. In their view, each of these languages had its own flaws and shortcomings, so they decided to develop a modern, universal yet specialized language that would replace all of the above in engineering and technical computing.

(read more...)



IC Resources Ltd: Software Developer - C++/ Python - Contract
Up to £400 per day: IC Resources Ltd: Software developer, Bristol, up to £400 pd, 6 months. IC Resources is looking for a software developer (C++/Python, contract) to work in the Bristol area on a new video streaming security network. The requirements for this contract are: Strong C++ s Bristol
Jr-Mid Level Software Engineer - IDEMIA - Morgantown, WV
Knowledge or interest in multiple technology domains and languages e.g. Java, JavaScript, Go, Python, etc. As a software engineer for IDEMIA NSS, the successful...
From IDEMIA - Sun, 05 Aug 2018 08:52:20 GMT - View all Morgantown, WV jobs
Watson Explorer Solution Architect - Perficient - National, WV
Java, Ruby, Python, CSS, Javascript, HTML, AJAX, REST API/SOAP UI/Web Services, XSL, XSLT, Node.js. Experience with Java, Ruby, Python, CSS, Javascript, HTML,...
From Perficient - Sun, 01 Jul 2018 14:49:06 GMT - View all National, WV jobs
Watson Explorer Advanced Solution Architect - Perficient - National, WV
Java, Ruby, Python, CSS, Javascript, HTML, AJAX, REST API/SOAP UI/Web Services, XSL, XSLT, Node.js. At Perficient you’ll deliver mission-critical technology and...
From Perficient - Sun, 01 Jul 2018 08:48:56 GMT - View all National, WV jobs
Python Resurrects Dot Matrix Printing

These days a printer — especially one at home — is likely to spray ink out of nozzles. It is getting harder to find home laser printers, and earlier printer technologies such as dot matrix are almost gone from people’s homes even if you’ll still see a few printing multipart forms in some offices.

[Thomas Winningham] bought an old Commodore dot matrix printer in a fast food parking lot for $20. How hard could it be to get it working? How hard, indeed. Check out the video below to see the whole adventure. The principle behind the printer is simple …read more


Senior Python Engineer
MN-Eden Prairie, job summary: We need a good solid Python Resource who has multiple years of experience in developing applications using Python. Having knowledge and ability to work with Big Data technologies will be a plus. Someone who can articulate design/solution using best practices in Python and question current state when needed and bring in best practices and educate others to increase their skills as well
Sr. Python/Java Developer with AI
NC-Morrisville, Senior Python Software Engineer for AI development 6+ Months RTP NC The Python Software Engineer will be part of a new stealth group building the next generation of smart machine cloud-based solutions. This provides an opportunity to get in on the ground floor level of something new and exciting for an industry working with the latest and greatest technologies. RESPONSIBILITIES Participate in cutt
Automate downloading web data and importing into Oracle or MS SQL server
I need to automatically download data from the following site: https://data.medicaid.gov/Drug-Pricing-and-Payment/Drug-Products-in-the-Medicaid-Drug-Rebate-Program/v48d-4e3e/data I need two things to happen: 1... (Budget: $30 - $250 USD, Jobs: .NET, Microsoft SQL Server, Oracle, Python, SQL)
Data Analysis From Scratch With Python Step By Step Guide

Data Analysis From Scratch With Python Step By Step Guide

Data Analysis From Scratch With Python: Step By Step Guide by Peters Morgan
English | 24 Jun. 2018 | ASIN: B07F193447 | 150 Pages | EPUB | 1.46 MB


Storage Technologies-Development Software Engineer - Mphasis - Bengaluru, Karnataka
3 to 5 years of experience in coding in Python test automation • Strong automation coding experience in Python - Must. • Experience in REST Knowledge - Must. ...
From Mphasis - Mon, 06 Aug 2018 12:28:25 GMT - View all Bengaluru, Karnataka jobs
Big Data Developer, R&D - Fleet Complete - Toronto, ON
Experience developing software in at least 2 different languages, one of which must be R, Python, Scala, Java, JavaScript, or C#....
From Fleet Complete - Wed, 30 May 2018 04:30:23 GMT - View all Toronto, ON jobs
Quality Assurance Analyst - Adastra Corporation - Toronto, ON
Experience with Java and C/C++ is an asset. Strong development experience with UNIX, scripting languages (Kornshell, Python, Perl), Java, C/C++....
From Adastra Corporation - Fri, 27 Apr 2018 08:03:24 GMT - View all Toronto, ON jobs
Data Integration / ETL Developer - Adastra Corporation - Toronto, ON
Experience with UNIX, scripting languages (Korn Shell, Python, Perl), C or Java. Analyze, design, develop, test and document ETL processes....
From Adastra Corporation - Sat, 28 Apr 2018 08:04:44 GMT - View all Toronto, ON jobs
Postgres App 2.1.5 - Full-featured PostgreSQL in a single package. (Free)

The Postgres app contains a full-featured PostgreSQL installation in a single package: PostgreSQL; PostGIS; several procedural languages (PL/pgSQL, PL/Perl, PL/Python, and PLV8/JavaScript); popular extensions, including hstore and uuid-ossp; and more, including a number of command-line utilities for managing PostgreSQL and working with GIS data.



Version 2.1.4:

This is the August 2018 quarterly update for PostgreSQL.

  • PostgreSQL 10.5 with PostGIS 2.4.4
  • PostgreSQL 9.6.10 with PostGIS 2.3.7
  • PostgreSQL 9.5.14 with PostGIS 2.2.7


  • OS X 10.10 or later



More information

Download Now
Australian man films 9-foot python on his house
A man in Australia spotted a snake crawling around his garage, and says it refuses to leave. The persistent python named Dirk is 9 feet long and has an interesting hobby
Installing a second Linux to learn Python
Programming
Statistics: 30 replies || 426 views. Last post by arseniiv
Instructor – Software Development (Continuous) - Bow Valley College - Calgary, AB
Expertise in all or a combination of C/C++, C#, ASP.NET, Javascript, PHP, SQL, Xamarin, node, React, Java, Kotlin, HTML, CSS, Python....
From Bow Valley College - Mon, 30 Jul 2018 22:00:42 GMT - View all Calgary, AB jobs
Instructor – Software Development (Temporary) - Bow Valley College - Calgary, AB
Expertise in all or a combination of C/C++, C#, ASP.NET, Javascript, PHP, SQL, Xamarin, node, React, Java, Kotlin, HTML, CSS, Python....
From Bow Valley College - Mon, 30 Jul 2018 22:00:42 GMT - View all Calgary, AB jobs
JetBrains PyCharm 2018.1.4 Crack [License Key + Full + Final]

JetBrains PyCharm 2018.1.4 Crack [License Key + Full + Final] JetBrains PyCharm 2018.1.4 Crack is a highly advanced and powerful piece of software, developed for creating websites and applications in Python; Google and Twitter are among the websites built with Python. Now you may be wondering: what is Python? Python is […]

The post JetBrains PyCharm 2018.1.4 Crack [License Key + Full + Final] appeared first on pcsoft pro full version.


Network Engineer 3 - Python Platform Engineer
CO-Englewood, RESPONSIBILITIES: Kforce has a client in search of a Network Engineer 3 - Python Platform Engineer in Englewood, Colorado (CO). Summary: The purpose of this position is to work with the Advanced Engineering team and assist in the design of software to provide programmability and discovery of network services. One is responsible for development of new software and data models for network automation
python2-cairocffi 0.9.0-1 any
cairocffi is a CFFI-based drop-in replacement for Pycairo, a set of Python bindings and object-oriented API for cairo.
python-cairocffi 0.9.0-1 any
cairocffi is a CFFI-based drop-in replacement for Pycairo, a set of Python bindings and object-oriented API for cairo.
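Since cairocffi aims to be a drop-in replacement for Pycairo, existing Pycairo-style code should work with only the import changed. A small usage sketch of my own (not taken from the package announcement):

# Draw a filled rectangle and save it as a PNG with cairocffi.
# The API mirrors Pycairo, so only the import line differs.
import cairocffi as cairo

surface = cairo.ImageSurface(cairo.FORMAT_ARGB32, 200, 100)
ctx = cairo.Context(surface)
ctx.set_source_rgb(0.2, 0.4, 0.8)   # blue-ish fill colour
ctx.rectangle(20, 20, 160, 60)      # x, y, width, height
ctx.fill()
surface.write_to_png("example.png")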
Full Stack Software Engineer - Parsons - Columbia, MD
Parsons Cyber Operations is seeking Software Engineers with experience in Python, JavaScript, and Linux systems to join our team of exceptional individuals....
From Parsons Corporation - Wed, 01 Aug 2018 17:09:03 GMT - View all Columbia, MD jobs
WhatsApp Vulnerability Analysis

This article is translated from:
https://research.checkpoint.com/fakesapp-a-vulnerability-in-whatsapp/

Authors: Dikla Barda, Roman Zaikin, Oded Vanunu


WhatsApp has 1.5 billion users, more than a billion groups, and over 65 billion messages sent every day (as of early 2018). At that scale of users and messages, the potential for spam, rumours and fake news is substantial.

Check Point researchers recently discovered a vulnerability in WhatsApp that lets an attacker intercept and manipulate messages in private and group conversations, giving attackers the means to spread spam, rumours and false information.

The researchers found three ways to exploit the vulnerability, all of which rely on social engineering to deceive end users. An attacker can:
use the "quote" feature in a group conversation to change the identity of the sender, even if that person is not a member of the group;
alter the text of someone else's reply, essentially putting words in their mouth;
send a private message to another group participant disguised as a public message, so that when the targeted individual responds, it is visible to everyone in the conversation.

https://www.youtube.com/embed/rtSFaHPA0C4

Technical Analysis

WhatsApp encrypts every message, picture, voice call, video call and any other form of content you send, so that only the recipient can see it. However, WhatsApp is not the only party able to look at these messages.

Figure 1: An encrypted WhatsApp chat

The researchers decided to analyse the encryption process and reverse the algorithm in order to decrypt the data. After decrypting WhatsApp's communication, they found that WhatsApp uses the protobuf2 protocol.

By converting the protobuf2 data to JSON, the actual parameters being sent become visible, and the researchers could then manipulate them to test WhatsApp's security.

The research was carried out with a Burp Suite extension and the three manipulation methods described below. Before any manipulation, the researchers first obtained the session's public and private keys and fed them into the Burp Suite extension.

Accessing the Keys

The keys can be obtained from the key-generation phase of WhatsApp Web, before the QR code is generated:

Figure 2: The public and private key of the communication

To get the keys we also need the secret parameter that the phone sends to WhatsApp Web once the user scans the QR code:

Figure 3: The secret key in the WebSocket

The result given by the extension:

Figure 4: The WhatsApp Decoder Burp extension

After clicking "Connect", the extension connects to its local server, which performs all the tasks the extension needs.

Manipulating WhatsApp

Having decrypted the WhatsApp communication, we can see all of the parameters sent between the mobile version of WhatsApp and the Web version, and we can then forge messages to test WhatsApp's security.

The three attack scenarios are described below.

Attack 1: Change the identity of a sender in a group chat, even if they are not a member of the group

In this attack it is possible to forge a reply message to impersonate another group member, even one who does not exist in the group at all, for example Mickey Mouse.

To impersonate someone in the group, the attacker first needs to capture encrypted traffic like this:

Figure 5: Encrypted WhatsApp communication

Once the traffic has been captured, it can be sent to the extension, which decrypts it:

Figure 6: A decrypted WhatsApp message

Using the extension

When using the extension, pay attention to the following parameters:

  • conversation – the actual content being sent;
  • participant – the actual sender of the message;
  • fromMe – indicates whether the data was sent by me;
  • remoteJid – indicates which group the data is being sent to;
  • id – the id of the data; the phone's database stores the same id.

With these parameters understood, the conversation can be manipulated. For example, a group member's message reading "great" can be changed to "I'm going to die, in a hospital right now", and the participant parameter can be changed to someone else:
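To make the manipulation concrete, here is a minimal sketch of what editing the decrypted message might look like once the traffic has been turned into JSON. The function name forge_reply and the flat field layout are my own simplification of the structure described above, not code from the published extension; the forged values are placeholders:

import json

def forge_reply(decrypted_json, fake_sender, fake_text):
    """Tamper with the fields described above on a decrypted message."""
    msg = json.loads(decrypted_json)
    msg["participant"] = fake_sender    # who the quoted message appears to come from
    msg["conversation"] = fake_text     # the quoted content itself
    # "fromMe", "remoteJid" and "id" are left as they are, so the forged quote
    # still targets the same group; in practice the id must also be adjusted,
    # as noted below.
    return json.dumps(msg)

# Example (placeholder values):
# forged = forge_reply(decrypted, "Mickey Mouse",
#                      "I'm going to die, in a hospital right now")
# The forged JSON is then re-encoded to protobuf and re-encrypted before being
# handed back to WhatsApp Web, which is what the extension automates.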

Figure 7: A forged reply message

The id is also changed, because the original id already exists in the database.

To make everyone see the forged message, the attacker needs to reply to the message he forged, quoting and altering the original message (changing "great" to something else) and sending it to everyone in the group.

As shown below, the researchers created a new group with no previous messages and used the method above to create a fake reply:

Figure 8: The original conversation

The participant parameter can be a text string or the phone number of someone who is not in the group, which makes the group members believe the message really was sent by that person. For example:

Figure 9: Changing the message content

Using the debugging tool, the result is:

Figure 10: A reply to a message from someone outside the group

Attack 2: Changing a correspondent's reply to put words in their mouth

In this attack, the attacker can send messages in someone else's name and manipulate the chat, which makes it possible to impersonate other people or close fraudulent deals.

To forge a message, the fromMe parameter of the message must be changed; it indicates who sent the message in the personal conversation.

The message sent from the Web client is analysed before it reaches Burp Suite: a breakpoint is set on the aesCbcEncrypt function and the data is taken from the a parameter.

Figure 11: Manipulating an outgoing message

The data is then copied into the Burp extension, the outgoing direction is selected, and the data is decrypted:

Figure 12: Decrypting the outgoing message

After changing fromMe to false and re-encrypting, the result is:

Figure 13: Re-encrypting the outgoing message

The a parameter in the browser then has to be replaced with this data; the result is a push notification containing the forged content. In this way an entire conversation can be spoofed.

Figure 14: Sending a message to oneself

If it were another person, the whole conversation would look like this:

Figure 15: Sending a message to oneself, as seen by the other party

Attack 3: Send a private message in a group chat, but when the recipient replies, the whole group sees it

In this attack it is possible to send a message in a group that only a specific group member can see; when that person replies, the entire group sees the reply.

The researchers found this attack vector by reverse engineering the Android app. They discovered that when an attacker manipulates a message in a group, the message can be found in the database /data/data/com.whatsapp/databases/msgstore.db.

Figure 16: A private message sent in a group chat, stored in the /data/data/com.whatsapp/databases/msgstore.db database

The conversation can be opened with the sqlite3 client using the following command:

SELECT * FROM messages;

The following data can be seen:

Figure 17: The modified database

To send a message in the group that only a specific group member will see, the remote_resource parameter has to be set accordingly.

The trick here is to change the key_from_me parameter from 0 to 1. Once that is done, run the following command to update key_from_me and the data:

update messages set key_from_me=1, data='We, all know what have you done!' where _id=2493;

The attacker then needs to restart the WhatsApp client to force the application to send the new message. The result: only the targeted victim receives the message.

If the victim writes something as a response, everyone in the group can see it; but if the victim replies directly to the message, only he sees the quoted content, while the others see the original message.

Source code: https://github.com/romanzaikin/BurpExtension-WhatsApp-Decryption-CheckPoint

Before generating the QR code, WhatsApp Web generates a pair of public and private keys used for encryption and decryption.

Figure 23: The public and private key of the session

Below, the private key is referred to as priv_key_list and the public key as pub_key_list.

The keys are generated with curve25519_donna from 32 random bytes.

Figure 24: The Curve25519 encryption process

To decrypt the data, decryption code has to be written, which requires extracting the private key from WhatsApp Web, since the private key is needed to decrypt the data:

self.conn_data["private_key"] = curve25519.Private("".join([chr(x) for x in priv_key_list]))
self.conn_data["public_key"] = self.conn_data["private_key"].get_public()
assert (self.conn_data["public_key"].serialize() == "".join([chr(x) for x in pub_key_list]))

The QR code is then created, and after the user scans it with the phone, the phone sends information to WhatsApp Web over the websocket:

Figure 25: The secret key from the WebSocket

The most important parameter is the secret, which is later passed to setSharedSecret. That routine splits the secret into three parts and configures all of the cryptographic functions needed to decrypt WhatsApp traffic.

First the string e is converted to an array, and the secret is sliced so that the first 32 bytes become n and the bytes from position 64 to the end become a.

Figure 26: Getting the shared secret

A closer look at the function E.SharedSecret shows that it takes two parameters: the first 32 bytes of the secret and the private key used to generate the QR code:

Figure 27: Getting the shared secret

The following line can therefore be added to the Python script:

self.conn_data["shared_secret"] = self.conn_data["private_key"].get_shared_key(curve25519.Public(self.conn_data["secret"][:32]), lambda key: key)

Next comes the expansion to 80 bytes:

Figure 28: Expanding the shared secret

Analysis shows that this function uses HKDF, so the pyhkdf routine is used to expand the key in the same way:

shared_expended = self.conn_data["shared_secret_ex"] = HKDF(self.conn_data["shared_secret"], 80)

The HMAC validation function then treats the expanded data as the parameter e and splits it into three parts:

  • i – the first 32 bytes of shared_expended
  • r – the 32 bytes starting at offset 32
  • o – the 16 bytes starting at offset 64

There is also a parameter s, which is the concatenation of the parameters n and a.

Figure 29: HmacSha256

The HmacSha256 function is then called with the parameter r as the key to sign the data in s, and the resulting HMAC validation value is compared against r.

Here r corresponds to bytes 32 to 64 of t, where t is the encrypted data in array form.

Figure 30: Checking the validity of the message

The Python code looks like this:

check_hmac = HmacSha256(shared_expended[32:64], self.conn_data["secret"][:32] + self.conn_data["secret"][64:])
if check_hmac != self.conn_data["secret"][32:64]:
    raise ValueError("Error hmac mismatch")

The last encryption-related function is aesCbcDecrypt. Its payload s is the expanded data after byte 64 concatenated with the part of the secret after byte 64, and its key is the first 32 bytes of the expanded data (parameter i).

Figure 31: Getting the AES key and the MAC key

The decryption key is then applied; translating the code gives:

keysDecrypted = AESDecrypt(shared_expended[:32], shared_expended[64:] + self.conn_data["secret"][64:])

After decryption, the first 32 bytes of the result are the AES encryption key, and the following 32 bytes are the MAC key:

self.conn_data["key"]["aes_key"] = keysDecrypted[:32]
self.conn_data["key"]["mac_key"] = keysDecrypted[32:64]

The complete code looks like this:

self.conn_data["private_key"] = curve25519.Private("".join([chr(x) for x in priv_key_list]))
self.conn_data["public_key"] = self.conn_data["private_key"].get_public()

assert (self.conn_data["public_key"].serialize() == "".join([chr(x) for x in pub_key_list]))

self.conn_data["secret"] = base64.b64decode(ref_dict["secret"])
self.conn_data["shared_secret"] = self.conn_data["private_key"].get_shared_key(curve25519.Public(self.conn_data["secret"][:32]), lambda key: key)

shared_expended = self.conn_data["shared_secret_ex"] = HKDF(self.conn_data["shared_secret"], 80)

check_hmac = HmacSha256(shared_expended[32:64], self.conn_data["secret"][:32] + self.conn_data["secret"][64:])

if check_hmac != self.conn_data["secret"][32:64]:
    raise ValueError("Error hmac mismatch")

keysDecrypted = AESDecrypt(shared_expended[:32], shared_expended[64:] + self.conn_data["secret"][64:])

self.conn_data["key"]["aes_key"] = keysDecrypted[:32]
self.conn_data["key"]["mac_key"] = keysDecrypted[32:64]
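The snippet above relies on three helpers (HmacSha256, HKDF and AESDecrypt) that the article references but never lists. Below is a minimal reconstruction of what they could look like, written for Python 3 with PyCryptodome; this is my own sketch for readability, not CheckPoint's published code:

# Possible implementations of the helpers used above (assumption, not the
# extension's actual code): HMAC-SHA256, HKDF-SHA256 with a zero salt, and
# AES-256-CBC decryption where the first block of the input is the IV.
import hashlib
import hmac

from Crypto.Cipher import AES            # pip install pycryptodome
from Crypto.Util.Padding import unpad

def HmacSha256(key, data):
    return hmac.new(key, data, hashlib.sha256).digest()

def HKDF(key, length, app_info=b""):
    prk = hmac.new(b"\0" * 32, key, hashlib.sha256).digest()   # extract, zero salt
    output, block, counter = b"", b"", 1
    while len(output) < length:                                # expand
        block = hmac.new(prk, block + app_info + bytes([counter]),
                         hashlib.sha256).digest()
        output += block
        counter += 1
    return output[:length]

def AESDecrypt(key, data):
    iv, ciphertext = data[:AES.block_size], data[AES.block_size:]
    plaintext = AES.new(key, AES.MODE_CBC, iv).decrypt(ciphertext)
    return unpad(plaintext, AES.block_size)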

With all of the cryptographic parameters used during QR-code generation in hand, we can move on to the decryption process.

First, a message is intercepted (captured):

Figure 32: An encrypted incoming message

As can be seen, the message is split into two parts: the tag and the data. It can be decrypted with the following function:

def decrypt_incoming_message(self, message):
    message = base64.b64decode(message)
    message_parts = message.split(",", 1)
    self.message_tag = message_parts[0]
    content = message_parts[1]

    check_hmac = hmac_sha256(self.conn_data["mac_key"], content[32:])
    if check_hmac != content[:32]:
        raise ValueError("Error hmac mismatch")

    self.decrypted_content = AESDecrypt(self.conn_data["aes_key"], content[32:])
    self.decrypted_seralized_content = whastsapp_read(self.decrypted_content, True)

    return self.decrypted_seralized_content

As can be seen, the received data is base64-encoded so that Unicode data can be copied around conveniently. In Burp, the data can be base64-encoded with Ctrl+B and then passed to the decrypt_incoming_message function. The function splits the tag from the content and then checks whether the key can decrypt the data by comparing hmac_sha256(self.conn_data["mac_key"], content[32:]) with content[:32].

If they match, the AES decryption step follows, which needs the AES key and the content from byte 32 onward. The content starts with the IV, which is the size of one AES block, followed by the actual data:

self.decrypted_content = AESDecrypt(self.conn_data["aes_key"], content[32:])

The output of the function is protobuf (a data interchange format from Google that is language- and platform-independent):

Figure 33: The decrypted data in protobuf format

The whatsapp_read function is then used to translate it into JSON format.

Decrypting Incoming Messages

To decrypt incoming messages we first need to understand how the WhatsApp protocol works, so the function e.decrypt is debugged:

Figure 34: The ReadNode function

The ReadNode function triggers readNode:

Figure 35: The readNode function

All of this code was translated into Python to reproduce the same functionality.

The code first reads one byte from the data stream and moves it into char_data, then reads the list size from the stream with the read_list_size function.

It then calls token_byte to get another byte, which is passed to read_string:

Figure 36: The readString function

The code uses getToken, passing the byte as an index into the token array:

Figure 37: The getToken function

This is the first item WhatsApp sends in the communication. All the functions inside readString were then translated, and debugging continued:

Next comes the readAttributes function inside readNode:

Figure 38: The readAttributes function

readAttributes keeps reading bytes from the data stream and parses them against the same token list.

The second parameter WhatsApp sends is the actual action of the message; WhatsApp sends {add:"replay"} to indicate that a new message has arrived.

Continuing through the readNode code, we can see that the message is sent in three parts:

  • the token
  • the token attributes
  • the protobuf-encoded message

Figure 39: The decrypted array

The next thing to handle is the third parameter, the protobuf, and its decryption.

To understand the protobuf scheme WhatsApp uses, it was copied into an empty .proto file:

Figure 40: The protobuf scheme

The indexes can also be copied from WhatsApp's protobuf scheme and compiled into a Python protobuf module.

A Python function is then used to translate the protobuf into JSON.
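A sketch of that protobuf-to-JSON step, assuming the .proto file has been compiled with protoc --python_out; the module and message names (whatsapp_pb2, WebMessageInfo) are placeholders for whatever the compiled schema actually defines:

# Turn the decrypted protobuf payload into JSON using the classes generated
# by protoc. Module and message names here are placeholders (assumptions).
from google.protobuf.json_format import MessageToJson

import whatsapp_pb2  # produced by: protoc --python_out=. whatsapp.proto

def protobuf_to_json(raw_bytes):
    msg = whatsapp_pb2.WebMessageInfo()   # hypothetical message type
    msg.ParseFromString(raw_bytes)
    return MessageToJson(msg)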

Figure 41: The decrypted data

After implementing this in the extension, the communication can be decrypted:

Figure 42: Using the extension to decrypt data

WhatsApp Encryption (Encrypting Incoming Messages)

The encryption process is similar to decryption, only in the opposite order; the function to reverse here is writeNode:

Figure 43: The writeNode function

Figure 44: The writeNode function

Once we have the token and the token attributes, we need to do the same as in readNode.

First the code checks whether the node length is 3; then it multiplies the number of token attributes by 2 and passes the result to writeListStart, which writes the list-start type character and the list size, just as in readNode.

Then comes writeString, where we can see the action translated to X and the position of the action in the token index:

Figure 45: The writeToken function

The code and all of its functions were translated as well.

writeAttributes translates the attributes, after which writeChildren translates the actual data.

Figure 46: The writeChildren function

This function was translated as well.

The encrypted and decrypted messages then look as expected.

To simplify the encryption process, the researchers modified the real writeChildren function and added another case to make encryption easier.

The result is the ability to encrypt and decrypt the incoming messages.

To decrypt outgoing data as well, see the code on GitHub:
https://github.com/romanzaikin/BurpExtension-WhatsApp-Decryption-CheckPoint


azure data backup
azure data backup and restore in python (Budget: $250 - $750 USD, Jobs: Amazon Web Services, Django, Elasticsearch, Python, Software Architecture)
Principal Data Scientist - Rio Tinto - Montréal, QC
Python (Sci-kit Learn, numpy, pandas, Tensorflow, Keras), R, Matlab, SQL. Principal Data Scientist — Montréal....
From Rio Tinto - Wed, 01 Aug 2018 11:02:26 GMT - View all Montréal, QC jobs
Lead Software Engineer, AI/data science - IVADO Labs - Montréal, QC
Understanding of one or more of the modern AI/data science and data manipulation programming languages/libraries (e.g., Python, Scikit-Learn, Pandas, etc.)....
From IVADO Labs - Sat, 07 Jul 2018 03:11:55 GMT - View all Montréal, QC jobs
Senior Consultant – Data Analytics - EY - Montréal, QC
Proficiency in Python for numerical/statistical programming (including Numpy, Pandas, and Scikit-learn). EY's people in more than 150 countries are committed to...
From EY - Fri, 29 Jun 2018 00:11:13 GMT - View all Montréal, QC jobs
Data Scientists / AI & Machine Learning Engineer - IVADO Labs - Montréal, QC
Experience implementing AI/data science algorithms using one or more of the modern programming languages/frameworks (e.g., Python, Pandas, Scikit-learn,...
From IVADO Labs - Sat, 05 May 2018 03:10:45 GMT - View all Montréal, QC jobs
Software Engineer/Full Stack Developer - IVADO Labs - Montréal, QC
Understanding of one or more of the modern AI/data science and data manipulation programming languages/libraries (e.g., Python, Scikit-Learn, Pandas, etc.)....
From IVADO Labs - Sat, 05 May 2018 03:10:42 GMT - View all Montréal, QC jobs
Open Shift Architect
TX-Austin, Required Skill 10+ years application architecture experience at enterprise level 5 years of software development experience in Java, C+ or .NET Experience in troubleshooting and tuning of enterprise applications and/or data platforms (in-memory databases, messaging systems, NoSQL databases, etc.) preferred Experience in developing software automation solutions with Python, Golang or Perl preferred
IPOB SHUNS OHANEZE AND VOWS ON "NO REFERENDUM NO ELECTION"
IPOB SHUNS OHANEZE AND VOWS ON "NO REFERENDUM NO ELECTION"

IPOB press release
9/07/2018

Following the inability of Ohaneze Ndigbo and South East governors to de-proscribe the activities of IPOB in line with the stated objectives of the Prof. Ben Nwabueze convened peace initiative, the leadership of IPOB hereby suspend, with immediate effect, any future contact or participation in meetings involving Ohaneze Ndigbo leadership or South East governors. The ceasefire announced as a gesture of goodwill after the first Enugu meeting is now officially revoked with no hope of future reinstatement.

Mindful of posterity and our deep respect for Prof. Nwabueze, who personally pleaded that we suspend all negative comments and verbal attacks against Ohaneze Ndigbo. We duly obliged by announcing a binding cessation of all hostilities against corrupt Igbo leaders despite the fact they knowingly and actively connived with Fulani overlords to kill IPOB activists, invade the home of our leader with the intention to kill him out of petty jealousy over his popularity and acceptance by the people. That our leader and his parents are still missing is a direct consequence of the connivance of Igbo leaders with their Fulani masters to destroy the agitation for Biafra.

IPOB towed this path of peaceful dialogue because of our deep respect for our more honourable and truthful elders whose wise counsel compelled us to give peace a chance. What has transpired over the past two months, culminating in the failed meeting of the 8th of August 2018 in Enugu, have clearly demonstrated the unwillingness of Ohaneze Ndigbo and South East governors to breakaway from their slavish master-servant relationship with the Fulani caliphate. These unrepentant collaborators and instigators of Operation Python Dance abused the privilege IPOB ceasefire afforded them by seeking to turn the meeting into an endorsement of an Igbo candidate for PDP vice presidential slot in the coming 2019 elections. Under such circumstance, we felt the meeting had lost its purpose hence the need for us to pull out of any talks which consequently brought the entire process to an end.

As a result of these meetings, any lingering doubt that Ohaneze Ndigbo, South East governors and some Abuja based political jobbers instigated Operation Python Dance, has been dispelled by their inability to lift the proscription they themselves imposed on IPOB. Fulani governors  of the north never proscribed murderous Boko Haram or Fulani terror herdsmen but Igbo governors and politicians hastily proscribed the only movement, IPOB, fighting for the masses because they want to continue enjoying Abuja patronage. As the masses continue to wallow in abject poverty and oppression, these Igbo sell-out leaders, their families and cronies will continue to live large courtesy of their all powerful Fulani masters who can make or break them.

Having dutifully attended three meetings in the past 7 weeks with nothing to show for it, these collaborators and traitors have become more emboldened in their subterfuge to the extent of boasting to observers sent by Aso Rock to the Enugu meeting of the 8th of August that IPOB has been thoroughly weakened and reduced to begging to be de-proscribed.

Sadly, some of the attendees of the shameful emergency Igbo elders meeting convened to endorse Operation Python Dance at Nike Lake Resort a day after the deadly invasion of Afaraukwu by Nigerian soldiers, were also present at the Enugu meeting. In due course we shall hold them accountable for their actions. They lacked the courage then to issue a statement condemning Operation Python Dance and they still do so till today because they were all in the know or actively involved in the planning of the massacre of their own people because of money. 

It has dawned on the worldwide family of IPOB that this well intended peaceful dialogue, from inception, was a fraud, a charade and political subterfuge designed to get IPOB to participate in 2019 elections and nothing more. Ohaneze Ndigbo and their allies in Abuja saw it as an opportunity to prove to their northern Fulani controllers that they are still in charge of the South East. Nothing could be further from the truth. IPOB is the people and the people is IPOB.

This game plan of seeking to subdue IPOB through a series of endless and meaningless meetings in the name of forging a common front or lasting peace, is dead on arrival. IPOB will no longer participate in any meeting or dialogue involving Ohaneze Ndigbo. They mistook our magnanimity for weakness. That unfortunately, is the greatest mistake they would ever make in their lives.

This bunch will go down as the worst set of leaders in the history of the Igbo race. For 48 years they have been actively involved in trading the future of their children and unborn generations for peanuts. Under their stewardship, nothing worthwhile has been accomplished for the masses since after the war. They are ravenous, predatory and devoid of conscience. We gave them an opportunity to redeem themselves but they failed woefully. This generation of IPOB and history shall judge them very harshly.

Henceforth, anybody seeking to deal with IPOB must contact our world headquarters in Germany, where their enquiry will be directed to the appropriate quarters. We will no longer recognise any third party involvement. This will ensure we avoid a repeat of what just transpired with this failed dialogue with Ohaneze Ndigbo.

We wish to thank Prof. Nwabueze immensely for all his efforts and to convey our deep appreciation, regard and respect for him. We also urge him to hold tenaciously to his publicly stated stance that de-proscription of IPOB and the whereabouts of Mazi Nnamdi Kanu  is a prerequisite for any peaceful  settlement. IPOB hereby reinstate it's avowed commitment to conduct a referendum as soon as possible.

COMRADE EMMA POWERFUL MEDIA AND PUBLICITY SECRETARY FOR IPOB.
Comments on "SVT's fact check is a joke worthy of Monty Python!" by G
"Politicians can no longer turn a blind eye to the male surplus" https://www.expressen.se/ledare/patrik-kronqvist/politikerna-kan-inte-blunda-for-mansoverskottet-langre/
          Kommentarer till SVT:s faktakoll är ett skämt värdigt Monty Python! av G      Cache   Translate Page   Web Page Cache   
"Sedermera har de Swedavia-anställda tagit bort sina kommentarer från Facebook. De har även rensat bort alla kopplingar till Swedavia i sina profiler på Facebook samt blockerat undertecknad." Vuxet beteende. :)
          Kommentarer till SVT:s faktakoll är ett skämt värdigt Monty Python! av G      Cache   Translate Page   Web Page Cache   
"Regeringen övergav likabehandlingsprincipen inför lagen Under mandatperioden har ett antal grundlagsförändringar genomförts. De flesta har passerat utan rapportering och utan debatt i det traditionella nyhetsflödet. Eftersom grundlagsberedningarna har en tradition av att, om inte vara eniga, så vara i praktiken eniga om inriktningen av en grundlagsförändring. Det finns bland politiska journalister inte någon tradition att bevaka våra grundlagars utveckling trots att det är grundlagarna som utgör grundnormen för övrig lagstiftning." https://ledarsidorna.se/2018/08/regeringen-overgav-likabehandlingsprincipen-infor-lagen/
          Kommentarer till SVT:s faktakoll är ett skämt värdigt Monty Python! av G      Cache   Translate Page   Web Page Cache   
https://twitter.com/barnrattsbyran/status/1027530329337290752
          Kommentarer till SVT:s faktakoll är ett skämt värdigt Monty Python! av G      Cache   Translate Page   Web Page Cache   
"Nyheter Idag avslöjar: Elin Ersson stoppade utvisning av hustrumisshandlare som VILLE skickas hem" https://nyheteridag.se/nyheter-idag-avslojar-elin-ersson-stoppade-utvisning-av-kvinnomisshandlare-som-bett-om-att-bli-skickad-tillbaka/
          Kommentarer till SVT:s faktakoll är ett skämt värdigt Monty Python! av http://www.gertrud1.se/blogg/      Cache   Translate Page   Web Page Cache   
Det finns många orsaker till att kvinnor är sämre brandmän, poliser, kriminalvårdare. Vad det gäller dörren där så handlar det om både vikt, teknik, men också om smärttålighet. Det funkar inte att vara rädd att göra illa sig. Eller att inte vara assertive. Det som är ännu värre är att män har sina beskyddaregenskaper, så det kommer vara två brandmän, poliser, kriminalvårdare som inte är fullt stridsdugliga eftersom mannen gör kvinnans arbete också. Detta har vi sett inte minst i krig, där män tappar fokus och skyddar de kvinnliga soldaterna. Jag har själv haft ett manligt jobb, men jag var alltid tvungen att klara mig själv och samma regler och krav på prestation gällde också mig. Det var "leverera eller gå hem". Det är så kvinnor får självkänsla, för det är det man klarar som gör att man börjar lita till sig själv. Inte en högre lön. Inte kvotering. När man vet att det man gör enbart är beroende av en själv, det är då man får självkänsla. Kvinnor idag går helt fel väg. De är små prinsessor som sitter på rumpan och kräver.
          Kommentarer till SVT:s faktakoll är ett skämt värdigt Monty Python! av Frida Gustavlin      Cache   Translate Page   Web Page Cache   
Det känns nästan som de mest trovärdiga teorierna kring 9/11, de som påvisar att det i själva verket inte var några flygplan som kraschade, utan girigia judar på Wall street, påhejade av Iluminati och fan och hans moster som var framme förstörde. Och jag som trodde mig tro att det var en hel hög med Polacker i Sverige och släckte bränder, plus en rad flygfordon från andra länder, allt eftersom vi inte rådde på brasorna själva, eftersom vi inte själva köpt några brandsläckningsflygplan. Men jag hade så klart helt fel, och det är ju något som jag blivit inbillad av det hemliga och allt behärskande ordernssällskapet vid namn patriarkatet. Tack ändå för att det finns pålitlig faktakontroll i det här landet!
          Kommentarer till SVT:s faktakoll är ett skämt värdigt Monty Python! av weasel      Cache   Translate Page   Web Page Cache   
Mycket bra att det pågår krig och förföljelse i stora delar av världen - för om det gjorde det skulle ju inte de svenska samhällena vitaliseras. Hoppas vid gud att det aldrig blir världsfred för då får vi ju inte mer mångfald och arbetskraft :(
          Kommentarer till SVT:s faktakoll är ett skämt värdigt Monty Python! av weasel      Cache   Translate Page   Web Page Cache   
Instämmer inte, Nightrunner. Jag tycker att kvinnor som vill bli brandmän och har de rätta egenskaperna ska få bli det - precis som de män som vill bli dagisfröknar och har de rätta egenskaperna ska få bli det. Naturligtvis ska man under inga omständigheter sänka fys-kraven för att få bli brandman, men de kvinnor som klarar fys- kraven får mer än gärna bli brandkvinnor.
          Kommentarer till SVT:s faktakoll är ett skämt värdigt Monty Python! av Dreadlock      Cache   Translate Page   Web Page Cache   
Till Pedersen : Slå upp Entropi i samband med termodynamikens 2:a lag. Så slapp du bli politiker i TRAFIKUTSKOTTET i riksdagen, nu sitter vi här och betalar din lön för en fullständig analfabet i de frågor du ska besluta om.
          Data Analyst      Cache   Translate Page   Web Page Cache   
IL-Chicago, Chicago, Illinois Skills : • Hands-on data analysis experience finding and fixing data discrepancies in Excel • Must have in-depth experience with both data analysis and designing solutions to resolve data discrepancies • Strong analytical skills • SQL and AWS experience is required • Experience with Python would be a huge plus Description : Multiple opportunities for a Data Analyst with a large financial client in
          Python Resurrects Dot Matrix Printing      Cache   Translate Page   Web Page Cache   

These days a printer — especially one at home — is likely to spray ink out of nozzles. It is getting harder to find home laser printers, and earlier printer technologies such as dot matrix are almost gone from people’s homes even if you’ll still see a few printing multipart forms in some offices.

[Thomas Winningham] bought an old Commodore dot matrix printer in a fast food parking lot for $20. How hard could it be to get it working? How hard, indeed. Check out the video below to see the whole adventure. The principle behind the printer is simple…


          Programmer at Sahara Group      Cache   Translate Page   Web Page Cache   
Sahara Group nurtures businesses in the energy sector. These companies operate essentially within the energy industry and its associated sub-sectors. The Group consists of individuals who are determined to make a positive impact on the business environment. Qualification/Experience: B.Sc. degree in Computer Science or related field; 2-4 years of application development experience; experience building and integrating applications; experience in Test-Driven Development. Knowledge/Skills: Proficient in Python, PHP and JavaScript; working knowledge of Java, C++ and C#, HTML, CSS; familiar with concepts of MVC, JDBC, and RESTful API development and web security; proficient in the use of Microsoft Office Suite (Word, Excel, PowerPoint).
          EOD Support Officer at GVA Partners      Cache   Translate Page   Web Page Cache   
Growth in Value Alliance (GV Alliance) Partners - Our client, a Lagos-based bank, is currently recruiting suitably qualified candidates to fill the position below: Job Responsibilities: Running EOD, EOM and EOY processes. Providing support to users on banking application related issues. Generating customized reports from the banking application for users. Ensuring the banking application services are optimized. Training of operations and bank application related personnel. Interfacing with application development staff on integrating new applications with the banking application. Professional Requirements Core: Application support and maintenance. Knowledge of banking operations. Solution architecture. Information security. Release management. Programming skills (.NET, Java, Python) / software development. Non-core: Project planning and control. Service level management. Educational Qualification: First degree in Computer Science, Engineering or other related fields. Experience: At least 1 year post-NYSC experience.
          The Edge Search: Bluehost Hosting 2018 Review       Cache   Translate Page   Web Page Cache   

Bluehost is one of the most affordable and reputable web hosting companies in the world. Established in 2003, they continue to grow and attract more than 20,000 new customers each month.

Known for rock-solid reliability, Bluehost's shared hosting comes complete with generous disk space and bandwidth, free domain name plus an array of additional hosting features such as automated backups and one-click WordPress installation.

Because they always go out of their way to help their customers, you can be sure of receiving all the technical support you need.

Bluehost Features

Since the beginning of 2003, Bluehost has always provided high-quality service while keeping pace with the technical developments and improvements in the industry.

As the needs of webmasters have evolved, so too has the Bluehost offering. Their reputation for reliability and quality service has been well earned and is why they remain a leader in the ultra-competitive web hosting industry.
  1. Free Domain Name Included
  2. Unlimited Bandwidth
  3. Unlimited Disk Space
  4. Unlimited Add-On Websites
  5. 30 Day Money Back Guarantee

Visit Bluehost.com - 30 Day Money Back Guarantee. It's Risk Free!

Bluehost's outstanding service is combined with a feature-packed offering and industry leading uptimes. This is all possible due to their state-of-the-art network infrastructure. Their hi-tech data centre is very impressive and boasts Internet connectivity over their OC-48 connection at an incredible 2GB/sec bandwidth.

Quad processor servers, 24/7 monitoring, a diesel-powered backup generator and mirrored storage backups round out the data centre's notable list of features.
Total Domains:
2,163,617
  • .com
    1,810,607
  • .org
    167,674
  • .net
    130,598
  • .us
    21,703
  • .info
    20,392
  • .biz
    12,643


Even though Bluehost's basic hosting package is on a shared server, the specifications are far from typical when compared to industry standards for shared hosting plans.


Bluehost's servers run on 64-bit Linux distributions. What's more, excessive CPU load and server slowdowns are non-existent on this setup while super-fast site performance is maintained even at peak usage times. These specs are hard to match and leave Bluehost's competitors trailing in their wake.

Visit Bluehost.com - 30 Day Money Back Guarantee. It's Risk-Free!

Control Panel

For back-end administration, Bluehost provide the industry favourite cPanel interface. With its comprehensive features and intuitive design, webmasters have everything they need to easily launch and maintain their sites.

Furthermore, Bluehost's own Page Wizard application enables professional looking Web pages to be created with just a few clicks. Web-based file management and script support for Fantastico are just a few of the many other features that cPanel boasts.

Scripts

SimpleScripts, Mojo and Fantastico support enables users to quickly install a wide variety of popular software packages such as WordPress, Drupal, and Joomla. Never before has it been so easy to install blogs, forums, image galleries, polls and content management systems.

Fantastico de Luxe popularity amongst webmasters is a testament to its simple operation, and its inclusion in Bluehost's plan adds even more value to their already featured packed offering. For more advanced users who prefer installing scripts manually, Bluehost supports all popular scripting languages including:
  1. CGI
  2. Python
  3. PERL 5
  4. PHP4 & PHP5
  5. Ruby on Rails
  6. CRON jobs, Apache .htaccess and custom php.ini are also supported
WordPress is now one of the most widely used blogging and content management system platforms in the world and it's worth noting that Bluehost offers 1-Click installation of WordPress with all their hosting packages.


Visit Bluehost.com - 30 Day Money Back Guarantee. It's Risk-Free!

Uptime & Performance


Feature-laden, value for money hosting packages are important, but nothing is of greater importance than your web host providing basic network reliability. Claims of 99.9% uptime are all well and good but only the select few can back up their promises with actual data.


Bluehost is proud of their network integrity and list it as one of their key features. No longer do webmasters need to worry about losing business because their site is down. In fact, downtime is one of the most prevalent reasons why site owners shift from mediocre providers to a company like Bluehost who take their commitment to 99.9% up-time very seriously.

Independent testing on a site hosted by Bluehost revealed only thirty minutes of total downtime over a 90-day period. What's really impressive about this result is that, out of the total downtime, all thirty minutes were identified as planned maintenance. Bluehost's planned downtime is always scheduled during periods when Web traffic is off-peak to keep impact to a minimum.

Visit Bluehost.com - 30 Day Money Back Guarantee. It's Risk Free!


A webmaster will always find any amount of downtime unpalatable, but 30 minutes in 90 days equates to roughly 99.98% uptime, which is a highly impressive result. Given that performance is a key factor in the choice of a web hosting provider, we decided to undertake some performance testing of our own. We tested the page load time of the Bluehost homepage.

Bluehost Results: 

Homepage is loaded in 3.4 seconds
Homepage is fully loaded in 5.1 seconds

(Test server region: Dallas, USA. Connection: Cable (5/1 Mbps, 30ms). Date: 11 August 2013)

Help & Support

Bluehost offers customers several ways to access their technical support, one of which is the Bluehost Help Center. The Help Center contains a complete database of troubleshooting issues and fixes, together with instructions for hundreds of site-management tasks. It's the quickest and easiest way to get minor issues resolved.
Alternatively, clients with more complex problems can submit a help ticket through the Help Center and will receive email or live support as needed. Tickets are always answered in less than 12 hours, with most being addressed within just 1-2 hours.

Lastly, Live Phone support is also offered 24/7. This allows customers to speak directly with a technical expert and have all their questions comprehensively answered. Clients outside of the United States have not been forgotten either with additional phone numbers being provided specifically for them.

Bluehost Plans & Pricing

Bluehost have a straightforward approach to shared hosting. They only offer Linux shared server hosting on two simple plans; a standard hosting plan and a professional hosting plan (Bluehost also offers VPS, Dedicated Servers and Managed WordPress hosting).

STANDARD HOSTING PLAN

  • $3.95 per month
  • Unlimited Disk storage space
  • Unlimited Monthly Bandwidth
  • Unlimited Addon Domains (One free domain registration with account)
  • Unlimited Sub-domains
  • Unlimited Parked Domains
  • International Domains Supported
  • 1000 FTP Accounts (anonymous FTP support included)
  • Unlimited IMAP or POP3 E-mail Accounts
  • Secure IMAP Email Support
  • Unlimited Forwarding Email Addresses
  • Spam-Assassin Free-mail Filtering
  • cPanel Control Panel
  • 50 Postgre SQL or MySQL Databases
  • Frontpage 2000/2002/2003 Extensions
  • Ruby on Rails, CGI, Python, Perl 5, PHP 4&5 Scripts
  • Fully supported Server Side Includes (SSI)
  • SSH Shell Access
  • Fantastico Support
  • CRON Access and .htaccess
  • Free 1-Click Script Install
  • $100 Google Advertising Offer
  • 24/7 Phone, Chat & Email Support

Sign Up Now - Risk free - 30 day money back guarantee


PRO HOSTING PLAN

  • $19.95 per month
  • Unlimited Disk storage space
  • Unlimited Monthly Bandwidth
  • Unlimited Addon Domains (One free domain registration with account)
  • Unlimited Sub-domains
  • Unlimited Parked Domains
  • International Domains Supported
  • 1000 FTP Accounts (anonymous FTP support included)
  • Unlimited IMAP or POP3 E-mail Accounts
  • Secure IMAP Email Support
  • Unlimited Forwarding Email Addresses
  • Spam-Assassin Free-mail Filtering
  • cPanel Control Panel
  • 50 Postgre SQL or MySQL Databases
  • Frontpage 2000/2002/2003 Extensions
  • Ruby on Rails, CGI, Python, Perl 5, PHP 4&5 Scripts
  • Fully supported Server Side Includes (SSI)
  • SSH Shell Access
  • Fantastico Support
  • CRON Access and .htaccess
  • Free 1-Click Script Install
  • $100 Google Advertising Offer
  • 24/7 Phone, Chat & Email Support
  •  More CPU, Memory and Resources Added
  •  SiteBackup Pro Included
  •  Free Dedicated IP Address
  •  Free SSL Certificate
  •  Free Domain Name Privacy
  •  10 Free Postini
          Python Developer - Trigyn - Montréal, QC      Cache   Translate Page   Web Page Cache   
Experience working with multiple SCM systems, code review systems, build tools, test frameworks, code quality tools, CI systems, and IDEs....
From Trigyn - Thu, 09 Aug 2018 22:02:32 GMT - View all Montréal, QC jobs
          Java / Python Developer - Trigyn - Montréal, QC      Cache   Translate Page   Web Page Cache   
Familiar with the tools of the trade experience working with multiple SCM systems, code review systems, build tools, test frameworks, code quality tools, CI...
From Trigyn - Fri, 29 Jun 2018 04:02:00 GMT - View all Montréal, QC jobs
          Senior Full Stack Developer - Python - Perficient - National, WV      Cache   Translate Page   Web Page Cache   
MongoDB and PostgreSQL experience are highly desirable. At Perficient you’ll deliver mission-critical technology and business solutions to Fortune 500 companies...
From Perficient - Thu, 28 Jun 2018 20:47:34 GMT - View all National, WV jobs
          Project Night      Cache   Translate Page   Web Page Cache   

DFW Pythoneers

We're joining with Co-Op Complete to bring you Project Night!

Below is from the CoOp Complete Meetup Group.  The group's page is here:  http://www.meetup.com...

Come join us to work on your coding project!

What to bring: 
-> Laptop 
-> Headphones (if you want some quiet)

How do the meetings work? 
Typically we start with a 5-10 minute stand-up meeting where everyone can give a brief (30-seconds or less) update on their project. This is also an opportunity for a new person to briefly introduce themselves and indicate what they’re working on. During the stand-up limited feedback is welcome but we try to keep it short so folks can spend a majority of their time working on their project.

After the stand-up we split up so folks can work on their project in whatever way is most productive. This is also an opportunity to give or solicit additional feedback or assistance about a project. For a lot of folks this is the only dedicated time of the week they have to work on personal coding projects so we want to try to make that time as valuable as possible.

To get into the building you will need to call us; the number will be posted on the door at the entrance.

Dallas, TX - USA

Tuesday, September 18 at 6:30 PM

1

https://www.meetup.com/dfwpython/events/253653778/


          Developer - West, Inc. - Cheyenne, WY      Cache   Translate Page   Web Page Cache   
Leveraging .net framework, Java, Python, etc. C# is. Cheyenne or Laramie, WY....
From West, Inc. - Tue, 19 Jun 2018 10:23:47 GMT - View all Cheyenne, WY jobs
          IT Manager - Infrastructure - DISH Network - Cheyenne, WY      Cache   Translate Page   Web Page Cache   
Scripting experience in one or more languages (Python, Perl, Java, Shell). DISH is a Fortune 200 company with more than $15 billion in annual revenue that...
From DISH - Sun, 15 Jul 2018 05:30:30 GMT - View all Cheyenne, WY jobs
          Jr-Mid Level Software Engineer - IDEMIA - Morgantown, WV      Cache   Translate Page   Web Page Cache   
Knowledge or interest in multiple technology domains and languages e.g. Java, JavaScript, Go, Python, etc. As a software engineer for IDEMIA NSS, the successful...
From IDEMIA - Sun, 05 Aug 2018 08:52:20 GMT - View all Morgantown, WV jobs
          Electrical Engineer - 4D Tech Solutions, Inc. - Morgantown, WV      Cache   Translate Page   Web Page Cache   
Proficient in C, C++, Python, Java, and/or shell script. 4D Tech Solutions is seeking a highly motivated entry-level software/electrical/test engineer to join... $85,000 - $105,000 a year
From Indeed - Tue, 26 Jun 2018 16:33:55 GMT - View all Morgantown, WV jobs
          Best Python Training Institute In BTM Bangalore-Ascent      Cache   Translate Page   Web Page Cache   
At Ascent, providing high-quality Python training at affordable fees is our core value. The best Python training institute in BTM Bangalore, offered with 100% placement assistance and certified experts. Call for a free demo today - 9035752162 - or visit our website.
          每周分享第 17 期      Cache   Translate Page   Web Page Cache   

这里记录过去一周,我看到的值得分享的东西,每周五发布。

(图片说明:2018年6月的气温与1951-1980年平均气温的对比,来自推特@SimonLeeWx

今天夏天,全球高温创纪录。日本韩国都是史上最高温,西班牙葡萄牙接近48度的欧洲高温纪录,北纬71度的一个挪威城市32度。要知道,北极圈是北纬66.5度,这就是说北极圈里面也是高温。上图是今年6月的气温与40年前的平均气温比较,可以看到都是偏高的,尤其是南北极远远偏高。

全球变暖已经是活生生的现实。科学家估计,目前的年平均气温比工业革命前已经升高了1度。《巴黎气候协议》的目标是,气温升高控制在2度,但是成功的希望据说只有5%。如果学过统计学,你就知道,5%的机会在统计学上可以视作不会发生。

千万不要觉得,2度不算什么,《纽约时报》描述了后果。

  • 升高2度:热带珊瑚礁灭绝,海平面上升几米,波斯湾不适合人类居住。
  • 升高3度:北极的森林和大多数沿海城市被淹没。
  • 升高4度:欧洲永久干旱, 中国、印度和孟加拉国大部分地区变成沙漠,美国科罗拉多河接近干涸,美国西南部将不适合居住。
  • 升高5度:人类文明终结。

联合国气候官员称,如果不采取任何行动,目前乐观的估计是全球会升高3度。

气温升高的主要原因是,人类大量消耗化石能源,温室气体(主要是二氧化碳)排放急剧增加。所谓温室气体,就是这一类气体有温室效应,可以让阳光进入温室,但是阻止热量散发出去。火星为什么不适合人类居住?一个原因就是它的表面没有温室气体,留不住热量,导致温度过低。地球现在的问题是,温室气体过多。

现在的计算是,如果人类排放10000亿吨二氧化碳,地球就会升高2度,目前人类已经排放了6237亿吨。有一个网站实时显示目前排放了多少亿吨。

根据计算,2036年3月7日,人类将达到1万亿吨排放量。更热的日子还在后面。

新闻

1、中国的二氧化碳排放

《纽约时报》报道,一个美国教授认为,按照中国现在的二氧化碳排放水平,人类无法达到《巴黎协定》规定的减排目标,即全球平均温度比工业化之前上升不超过2摄氏度,除非中国的排放量大幅下降。

中国是世界最大的二氧化碳排放国。2017年,中国排放了117亿吨的温室气体,占世界总量的四分之一,其中包括92亿吨的二氧化碳,超过美国和欧盟的总和。按照现在的减排速度,中国的二氧化碳排放量最晚将在2030年达到峰值,到那一年,中国五分之一的能源将来自非化石燃料来源。

2、美国的贫富分化

美国经济研究所发现,美国的贫富分化一直在扩大,已经达到了1928年以来的最高点。

现在最富有的1%家庭获得全部收入的22%,只比1928年的23.9%低一点。美国人的年收入至少需要42.2万美元,才能跻身前1%的高收入者。这只是全国平均数,一些州的门槛要高得多,比如康涅狄格州的1%门槛为70万美元。

这就是技术革命的一个社会后果,贫富分化不断扩大,中产阶级消失,财富日益集中在少数富豪手里。

3、开放式办公室

Ruby on Rails 的创始人 DHH 公开抨击,开放式办公室是一个极其糟糕的坏主意。

有人说,开放式办公室可以增加合作。DHH 说实际情况是,开放式办公导致面对面的互动直线下降,因为人们这时需要通过耳机来集中注意力,交流变成依靠即时消息或电子邮件。最糟糕的情况是,同一房间有不同部门的数十个人,销售、营销、客服、管理者、程序员、设计师都混在一起,他们一定会互相打扰。

开放式办公实际带来的不是合作,而是压力和冲突,但它仍然是技术公司的默认配置。我们强迫绝大多数不喜欢开放式办公的人接受这种配置,这为了什么?因为管理层喜欢这样的配置?因为它在照片中看起来不错?还是因为它会给访问办公室的陌生人留下深刻的印象?

4、皮质醇贴片

斯坦福大学的科学家发明了一种皮肤贴片,可以实时测量汗液里面的皮质醇含量。一般来说,我们觉得压力很大的时候,皮质醇分泌就会增加。

以前的测量方法都无法实时得到结果。现在我们可以实时知道人体的感受了,甚至可以找出人群里面最紧张的人,这种传感器为以后更有趣的发明奠定了基础。

5、微型机器人竞赛

美国国防部高级研究局(DARPA)发起了一次微型机器人竞赛。现在大多数机器人都是模仿人类的外形,但是昆虫大小的机器人也有巨大的用处。DARPA 要求,这次参赛的机器人重量小于1克,体积小于1立方厘米。DARPA 为所有参赛者提供3200万美元资助,比赛可能在明年3月举行。

6、最古老的面包

考苦学家在约旦的沙漠里面,发现了一个石头砌的炉子,里面居然还有古人烤剩下来的面包屑。这是现存最古老的面包。

上图是显微镜下的面包屑,可以看到面包经过了充分发酵。

年代鉴定以后,所有人都大吃一惊,这个面包炉子距今14000年。那时人类甚至没有开始农业耕作,因此这个面包不是用小麦粉做的,而是来自野生的谷物。

7、代码搜索

微软的 Bing 搜索引擎新增了代码搜索功能,你问一个编程问题,现在可以直接给出示例代码。

8、机器狗 Spotmini

波士顿动力公司在成立16年后,终于要发售第一款产品了:一个类似小狗的四足机器人,高91厘米,重25公斤。这家公司以前的产品,都是供给美国国防部,这是第一款商业产品,预计明年上市。

这个机器人的功能非常惊人,可以自动识别道路,避开障碍,上楼下楼,跌倒还能爬起,机械臂可以拿起放下各种东西,请看视频

9、博士就业危机

加拿大媒体报道,该国的博士研究生只有五分之一能够找到教职。也就是说,80%的博士只能去企业界或转行,事实上确实有很多博士找不到工作,这篇报道里面就有博士改行当插花师或建筑工。

我想,这样的危机在中国一定更严重。因为中国的博士生规模世界第一,但是国内高校的扩张期已经过去了,现在想进高校当老师其实挺难的。如果转行的话,对个人和社会都是一种浪费。如果你有志攻读博士,一定要慎重。

10、AMD 游戏主机

上海的 ChinaJoy 展会上,AMD 宣布与中山小霸王公司合作推出新的游戏主机,CPU 是锐龙,GPU 是Vega,搭配 8GB GDDR5显存,性能将超过索尼 PS4 及微软 Xbox One 游戏机。

这台游戏机搭配 128GB SSD + 1TB HDD 硬盘,支持正版Windows 10系统,售价4998元,并有中文独占游戏 Onrush。由于没有自己的操作系统,这个所谓的游戏主机其实是一台 Windows 10 电脑。

11、AI 取得 Dota2 的胜利

OpenAI Five 与人类高级玩家的 Dota2 第一场比赛结束,AI 以 2:1 获胜。值得一提的是,AI 输掉的第三局是人类故意为它分配了能力较低的角色,而不是让它自己选择角色,AI 自己预估第三局的胜率只有2.9%。8月下旬将进行 AI 与职业团队的比赛。

12、一句话新闻

  • 随着手机支付的崛起,国内的 ATM 和 POS 设备制造行业大幅衰退,而二维码扫描设备制造业大幅成长。
  • Dart 语言发布2.0版。该语言的前途完全取决于谷歌的 Flutter 框架(Dart 是 Flutter 唯一支持的语言),如果谷歌大力推广这个框架,Dart 才有可能成功。
  • Android 9.0 发布,这个版本大量增加了 AI 支持,可以对每个用户提供更好的个性化服务。
  • Mozilla 浏览器计划取消 RSS 支持,原因是缺乏用户。

教程

1、如何使用 Node 优化图片?(英文)

本文教你使用 JS 的 imagemin 模块,压缩图片的大小。

2、DNS over HTTPS(英文)

DNS 查询都是基于 HTTP 协议的,即使是加密通信,网络服务商依然可以知道你想访问的网站。现在有多种解决方案,本文介绍如何在 Firefox 浏览器打开设置,使得 DNS 协议走 HTTPS 协议。

3、WebAssembly 的发展方向(英文)

WebAssembly 是浏览器可以执行的字节码,使得非 JS 编写的程序可以在浏览器运行。它现在的功能非常少,还处在 MVP(最小可用产品)状态。这篇文章介绍了一些很可能采纳的 WebAssembly 提案。

4、少子化和人口老龄化综述(英文)

全世界发达国家都有少子化和人口老龄化的趋势,这篇文章是我看过的最好的这方面的综述,有大量的数据和事实。除了非洲,其他大洲的人口都接近峰值了,将要开始下降,这意味着许多国家将不得不依靠移民,解决本国人力不足问题。

5、SVG 动画入门:以加载转子为例(英文)

本文手把手教你如何写一个最简单的 SVG 动画。

6、Hash 算法简介(英文)

Hash 算法的概念性介绍。

7、为什么飞机驾驶舱不使用触摸屏?(英文)

现在的飞机驾驶舱还是使用物理仪表盘,而不是触摸屏,这是为什么?主要的原因还是物理按钮会形成肌肉记忆,比操作触摸屏更快。

8、斐波那契数列的计算公式(英文)

如果不用递归,直接算出斐波那契数列的任意项,应该怎么计算?

9、如何通过 npm 窃取信用卡密码?(英文)

本文讲述了作者通过 npm 发布恶意代码的种种手段,非常值得一读。其中有一个技巧,就是package.js 与 package.min.js 的代码不同,恶意代码只放在后者。

工具

1、nouns

一个侦测用户眨眼的装置,让用户通过眨眼控制电脑,可以供霍金那样的用户使用。

2、模拟电路生成器

生成模拟电路的网站。

3、Zotero

管理各种论文和报告的免费桌面软件。

4、fnMatch

一个让对象进行选择性解构赋值的 JS 库。

5、jsinspect

软件项目中,同一段逻辑不应该有多个实现。jsinspect 就是用来查出重复代码的工具。

6、Malvid

一个构建 Web Components 的工具,并且能够自动生成文档。_ _

7、Glances

Glances 是一个用Python编写的跨平台系统监视工具。它自带 Web UI,可以远程监控。

8、Code with Mu

一个 Python 语言专用的代码编辑器。

9、diffconflicts

Git 的合并算法是三方合并(three-way),有人认为这种算法并不是最好的。这里是一个两方合并算法,有兴趣的人可以用它替换到 Git 的合并算法。

资源

1、机械键盘

这个网站收集各种各样的机械键盘。

2、Emulator 101

开源电子书,从零开始讲解如何写一个8080处理器的模拟器。

3、stackshare.io

收集各大网站使用的技术栈。

4、Meteor Wrongs

华盛顿大学维护的一个网站,收集各种假陨石的照片,并附上说明,讲解为什么某块石头不是陨石。

5、Byte 杂志

Byte 杂志是上个世纪很有影响的 IT 杂志,archive.org 提供免费下载。

文摘

1、幸存者偏见

二战时,英国决定要在轰炸机上加装防弹材料,减少被德国炮弹击中时的伤害。

他们按照平安返回的轰炸机被击中的位置,为其他轰炸机加装防弹材料。这种方法看上去正确,其实是错的,真正应该加载防弹材料的,恰恰是那些没有被击中的位置。因为这些轰炸机能返回,就说明它们被击中的位置不是很重要,那些被击中要害位置的轰炸机,都没有能够返回。

这就叫做幸存者偏见,人们过度重视那些幸存的个体,以为他们的特质就代表整个总体的特质。

举例来说,很多创业者总是向某些最成功的企业家学习,比如,比尔盖茨,理查德布兰森,史蒂夫乔布斯,马克扎克伯格,伊隆马斯克等等。殊不知他们可能只是特例,他们的经验不一定普遍适用。

上面这些企业家有一些共同特点。

  • 反传统,不走寻常的道路。
  • 承担巨大风险,有冒险家特质。
  • 极端主义者,他们不喜欢中间立场,有明确的爱和恨。

他们能够成功,并不表示拥有这些特点,你就能成功。很多同样拥有这些特点的人,可能都已经失败了。事实上,稳健经营更能帮助一家小公司生存下来。

2、第一条大西洋海底电缆

以下摘自奥地利作家茨威格的《人类群星闪耀时》一书。

1851年,欧洲大陆都已经连通了电报。但是,美洲一直被排除在世界电报网之外。

大西洋漫无边际,人们根本就不可能在海面上设立众多中间站,更加不可能用一根电线跨越两个大洋。人们不仅不知道海洋的深度,对它的地质结构也知之甚少,大洋底部的电缆能否承受住巨大的海水压力仍然无从得知。

即使从理论和技术上来说,铺设一条如此漫长的海底电缆行得通,但在当时还没有能负载铁铜电缆全部重量的巨轮,也没有一台发电机的功率能输送电流经过如此漫长的距离。就算轮船轻装上阵,也至少要耗费三个星期。并且在此期间,所有的电缆都必须妥善存放,不能露天放置。

英国政府提供了曾经的海战旗舰"阿伽门农"号,美国政府则"贡献"了当时吨位最大的二桅战舰"尼亚加拉"号。经过特殊的改造后,这两艘船都能容纳一半的海底电缆。

最后,同时也是最关键的问题,就是制造电缆。当时,制造一条连接两大洲的电缆需要非常精湛的技术:它既要有钢筋的坚硬而不易断裂的特性,又有非常大的柔韧性,也必须像丝线一样耐压耐磨而能随意弯曲,还必须实心而有一定的空间。总而言之,电缆必须结实、精密。对整个工程而言,电缆上任何一个微乎其微的磨损和坑洼都会破坏电流的传递。

要制造一条这样的电缆,整整需要整座橡胶林的橡胶汁。做一个形象的说明:到工程完工,电缆里使用的铜丝和铁丝长达367000海里,足足能将地球环绕13圈,甚至能将地球和月球连接起来。为此,工厂里的机器整整工作了一年。

下图是160年前第一代的大西洋海底电缆。

3、中美电影周的真相

以下摘自冯小刚的《我把青春献给你》。

中国电影周说穿了,就是由一两个美国穷人,打着热爱中国电影的旗号,从中国的制片厂免费拿到一些电影拷贝,在美国华人集中的城市转着圈的卖票放映,从中有利可图的个人行为。国内的电影制片厂也全无版权的概念,拷贝一撒手就是几个月,条件只不过是几张往返美国的机票。

"中国电影周"放映的电影院条件环境都很差,观众大部分是华人,也有少数闲着没事跟着起哄的白人,因为这些人有强烈的中国情结,看什么破片子都报以热烈的掌声,这些掌声与影片的质量无关,只能反映新老华侨的爱国热情。不明真像的导演误以为自己拍的影片多么伟大,回国后马上约记者采访,据此抱怨国内的发行公司和观众对他影片的冷落是不识货,吹嘘他的影片在美国放映引起轰动。其实也就是寄居美国的少数人自娱自乐的一个派对,主流媒体对此只字未提。

所谓的中美文化交流,说白了还是华人与华人的一次收费联谊会,跟美国大众八竿子打不着。

本周图片

1、2060年的世界人口大国

上图是2060年的世界人口预测。印度是人口最多的国家,17亿左右;其次是中国,13亿左右;后面是尼日利亚,美国,巴基斯坦,印尼,刚果,巴西,埃塞俄比亚,坦桑尼亚和墨西哥。

历史上,亚洲约占世界人口的70%,欧洲大约10%到20%,非洲5%到10%。2060年,非洲人口比重将激增,增加到世界人口的35%左右。

2、Java 的类

Java 语言以众多的类著称,但是你知道它一共有多少个 public Class 吗?

3、LED 街灯

随着 LED 产业的发展,街灯已经大量改成了节能的 LED 灯。传统的昏黄温暖的橙色灯光,变成了明亮的蓝色灯光。有人提出,蓝色的明亮灯光容易对人类造成影响,使得效率降低,失眠和焦虑增加。

本周金句

1、

每颗恒星都在不停燃烧,释放能量。如果宇宙是静止的,必然变得越来越热。幸亏宇宙在不停膨胀,把这个问题解决了。(比尔·布莱森《万物简史》)

2、

人一生最大的幸运,就是在年富力强时发现了自己的人生使命。(茨威格《人类群星闪耀时》)

3、

过去盖起的宫殿中,没有他的位置,他只好在宫殿的旁边,另起炉灶,搭起了一间偏房。问题是偏房越盖越多,越盖越大,越盖越高,渐渐成了一个院落,它就成了另一座宫殿。(刘震云《<我把青春献给你>序言》)

欢迎订阅

这个专栏每周五发布,同步更新在我的个人网站微信公众号语雀

微信搜索"阮一峰的网络日志"或者扫描二维码,即可订阅。


(完)



          SHOPIF MONITOR DISCORD BOT / HQ TRIVIA BOT / Twitter Monitor Discord BOT      Cache   Translate Page   Web Page Cache   
Hi, I run a Discord group that is involved with hype and popular sneaker releases, as well as has a channel for the trivia show on phones, HQ Trivia. I'm looking to hire someone that can successfully get... (Budget: $8 - $15 USD, Jobs: Instagram, Javascript, PHP, Python, Software Architecture)
          JAVA/Python Developer - CCIT Consulting - Reston, VA      Cache   Translate Page   Web Page Cache   
* The candidate is expected to develop governance controls defined in EDL Data Governance Framework in AWS Cloud * 5+ years of programming experience,... $60 - $65 an hour
From Indeed - Sun, 29 Jul 2018 16:53:50 GMT - View all Reston, VA jobs
          Comment on Python 3.7: Beginner to Advanced with Web Scraping Projects by mahmud ahsan      Cache   Translate Page   Web Page Cache   
There are not many project-related books, but if you want to be a web developer you can follow some web-development books, where you will find different kinds of web projects. You can also search for Python-related projects on upwork.com, build some of them if you can, and add them to your portfolio.
          Comment on Python 3.7: Beginner to Advanced with Web Scraping Projects by Sagar      Cache   Translate Page   Web Page Cache   
Hi Mahmud, I found your tutorial very useful. I am a beginner-level programmer and very interested in freelancing but don't know how to start. Where can I get projects to build a good portfolio? Which books would you recommend to enhance my practical knowledge and get me ready for freelancing?
          Crawling website with Python      Cache   Translate Page   Web Page Cache   
I need a script that scrapes three categories of a website and extracts each product's code, name, description and price. (Budget: $30 - $250 USD, Jobs: Python, Web Scraping)
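A minimal sketch of the kind of script being asked for, assuming requests and BeautifulSoup are acceptable; the URLs and CSS selectors below are invented placeholders, since the post does not name the site:

# Hypothetical sketch only: the site's URL and CSS classes are not given in the
# post, so every selector below is an assumption to be replaced with the real markup.
import requests
from bs4 import BeautifulSoup

CATEGORY_URLS = [
    "https://example.com/category/one",   # placeholder category pages
    "https://example.com/category/two",
    "https://example.com/category/three",
]

def scrape_category(url):
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    for card in soup.select(".product"):          # assumed product container class
        yield {
            "code": card.select_one(".sku").get_text(strip=True),
            "name": card.select_one(".name").get_text(strip=True),
            "description": card.select_one(".desc").get_text(strip=True),
            "price": card.select_one(".price").get_text(strip=True),
        }

if __name__ == "__main__":
    for url in CATEGORY_URLS:
        for product in scrape_category(url):
            print(product)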
          DialogueFlow Chatbot      Cache   Translate Page   Web Page Cache   
Build a Patients chatbot using dialogueflow. The documentation is ready to be used. (Budget: $30 - $250 USD, Jobs: C Programming, Java, PHP, Python, Software Architecture)
          Update fleetCommander to support both classic and application LoadBalancer Boto3 SDK      Cache   Translate Page   Web Page Cache   
I have an existing fleet commander script which uses the boto SDK. I need it updated to use the boto3 SDK and to support both classic and v2 (application) load balancers. (Budget: $30 - $250 CAD, Jobs: Amazon Web Services, Aws Lambda, Python, Scripting)
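For reference, classic and v2 load balancers sit behind different boto3 clients ("elb" vs "elbv2"), which is usually the heart of such an update. A rough sketch under that assumption (the region is a placeholder and pagination is omitted):

# Classic load balancers live in the 'elb' client, application/network LBs in 'elbv2'.
import boto3

REGION = "us-east-1"  # placeholder

def list_classic_load_balancers():
    client = boto3.client("elb", region_name=REGION)
    for lb in client.describe_load_balancers()["LoadBalancerDescriptions"]:
        print("classic:", lb["LoadBalancerName"])

def list_v2_load_balancers():
    client = boto3.client("elbv2", region_name=REGION)
    for lb in client.describe_load_balancers()["LoadBalancers"]:
        print(lb["Type"] + ":", lb["LoadBalancerName"])  # 'application' or 'network'

if __name__ == "__main__":
    list_classic_load_balancers()
    list_v2_load_balancers()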
          ふりがなプログミング、スラスラ読める JavaScript ふりがなプログラミング、スラスラ読める Pythonふりがなプログラミング      Cache   Translate Page   Web Page Cache   
From jouwa/salon on ASAHI Net (http://asahi-net.jp).
---
 At the Kinokuniya bookstore in Shinjuku there were these "furigana programming" books, and I wondered what on earth they were. With programming education coming to elementary schools, I assumed they were aimed at schoolchildren, and they could certainly be used that way, but they are really beginner's-beginner introductions for programming novices, with explanations attached to the code the way furigana reading aids are attached to kanji.
 People really do come up with all sorts of ideas.
 Two volumes were released from the start, branded as the "Furigana Programming" series, so it is presumably an idea product that Impress is confident about.
 From the JavaScript edition:
--- quote begins ---
An introductory book that solves the problem of not being able to "read" programs.
A completely new JavaScript primer that aims for ultimate gentleness by showing how to read every program (adding furigana). Every program that appears in the book comes with its reading and, borrowing the technique of kanbun kundoku (reading classical Chinese in Japanese word order), a "read-down" translation is also provided. You can understand what each line of a program means and how it works. This new approach addresses the reason beginners give up partway: "I can't read the program, so I don't know what I'm doing right now."
--- quote ends ---

 From the Python edition:
--- quote begins ---
An introductory book that solves the problem of not being able to "read" programs.
A completely new Python primer that aims for ultimate gentleness by showing how to read every program (adding furigana). Every program that appears in the book comes with its reading and, borrowing the technique of kanbun kundoku, a "read-down" translation is also provided. You can understand what each line of a program means and how it works. This new approach addresses the reason beginners give up partway: "I can't read the program, so I don't know what I'm doing right now."
--- quote ends ---

 They are in color, with illustrations, and designed to be easy to pick up. Looking at the tables of contents, the two volumes have almost the same structure. If they sell well, the format is clearly set up so the series can be mass-produced for other programming languages.

https://www.amazon.co.jp/exec/obidos/ASIN/4295003859/showshotcorne-22/
スラスラ読める JavaScript ふりがなプログラミング (ふりがなプログラミングシリーズ) 単行本(ソフトカバー) – 2018/6/22
リブロワークス (著), 及川卓也 (監修)

https://www.amazon.co.jp/exec/obidos/ASIN/B07DR76HSG/showshotcorne-22/
スラスラ読める JavaScriptふりがなプログラミング Kindle版
及川卓也 (著), リブロワークス (著)

 The introduction on the Impress site:
https://book.impress.co.jp/books/1117101139
スラスラ読める JavaScriptふりがなプログラミング

https://www.amazon.co.jp/exec/obidos/ASIN/4295003867/showshotcorne-22/
スラスラ読める Pythonふりがなプログラミング (ふりがなプログラミングシリーズ) 単行本(ソフトカバー) – 2018/6/22
リブロワークス (著), 株式会社ビープラウド (監修)

https://www.amazon.co.jp/exec/obidos/ASIN/B07DR9WNKJ/showshotcorne-22/
スラスラ読める Pythonふりがなプログラミング Kindle版
株式会社ビープラウド (著), リブロワークス (著)

 The introduction on the Impress site:
https://book.impress.co.jp/books/1117101140
スラスラ読める Pythonふりがなプログラミング

Related:
http://iiyu.asablo.jp/blog/2017/12/07/8744337
親子で学ぶプログラミング超入門、親子で学ぶスマホとネットを安心に使う本
http://iiyu.asablo.jp/blog/2017/10/04/8694283
阿部和広「小学生からはじめるわいわいタブレットプログラミング」。これで学んで、機械の次は生命のプログラミングだ。\(^O^)/
http://iiyu.asablo.jp/blog/2017/05/18/8564084
「はじめてのプログラミング」(学研まんが入門シリーズ)。プログラミング教育、「もしものときのサバイバル術」から、小学校の時、一緒によく遊んだ工藤兄弟のことも
http://iiyu.asablo.jp/blog/2017/03/05/8392369
「なるほどわかったコンピューターとプログラミング」、「コンピュータを使わない情報教育アンプラグドコンピュータサイエンス」
http://iiyu.asablo.jp/blog/2016/05/25/8095686
小学生からはじめるわくわくプログラミング2、NHK Eテレ、厚切りジェイソンの「Why!?プログラミング」
http://iiyu.asablo.jp/blog/2015/07/16/7708582
作ることで学ぶーMakerを育てる新しい教育のメソッド、Scratchではじめよう! プログラミング入門、小学生からはじめる伝える力が身につく本-プレゼンテーション-
http://iiyu.asablo.jp/blog/2014/10/24/7472293
5才からはじめるすくすくプログラミング。なぜか、アルジャーノンに花束を
http://iiyu.asablo.jp/blog/2014/08/05/7406738
小学生から楽しむRubyプログラミング
http://iiyu.asablo.jp/blog/2014/03/12/7242927
Raspberry PIではじめるどきどきプログラミング

          Re: Will you pay for your kids' college/graduate education?      Cache   Translate Page   Web Page Cache   
Bacchus01 wrote:
Thu Aug 09, 2018 1:26 pm

Why do kids of rich parents deserve something different?


I agree in principle that children of rich and poor should be treated alike. But need-based financial aid is generally not available for children, themselves broke, whose parents are affluent, regardless of the parents' willingness to pay. Schools generally require the parental contribution even if the parents say they won't come across, because if they didn't, parents would always plead unwillingness to pay. In this particular sense, children of the rich are worse off than children of the poor at least at schools with need-based financial aid.



BTW, I add the following not to make fun of anyone in this thread, which in my view has been consistently respectful, thoughtful, and illuminating. But discussion of old college days or their alternative somehow brings to mind Monty Python's The Four Yorkshiremen.
          Hands-On Data Analysis with NumPy and Pandas      Cache   Translate Page   Web Page Cache   

eBook Details: Paperback: 168 pages Publisher: WOW! eBook (June 29, 2018) Language: English ISBN-10: 1789530792 ISBN-13: 978-1789530797 eBook Description: Hands-On Data Analysis with NumPy and Pandas: Get to grips with the most popular Python packages that make Data Analysis possible and implement Python packages from data manipulation to processing

The post Hands-On Data Analysis with NumPy and Pandas appeared first on eBookee: Free eBooks Download.
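For flavour, here is a tiny, made-up example of the sort of NumPy/pandas manipulation the blurb refers to; it is not taken from the book itself:

# Illustrative only: synthetic data, simple cleaning and aggregation with pandas.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "city": ["Austin", "Austin", "Dallas", "Dallas"],
    "sales": [100, 150, 90, np.nan],
})

df["sales"] = df["sales"].fillna(df["sales"].mean())  # simple missing-value handling
print(df.groupby("city")["sales"].agg(["mean", "sum"]))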


          #10: Learning Python: Powerful Object-Oriented Programming      Cache   Translate Page   Web Page Cache   
Learning Python
Learning Python: Powerful Object-Oriented Programming
Mark Lutz
(27)

Buy new: CDN$ 85.80 CDN$ 50.26
39 used & new from CDN$ 50.26

(Visit the Bestsellers in Programming list for authoritative information on this product's current rank.)
          Newbie - How to create footprint for circular rotary?      Cache   Translate Page   Web Page Cache   

Here is a screenshot of what I have for new from wizard:
[Screenshot: Footprint Editor - no active library]
Note, this doesn’t show in the menus.

Here is the list of footprint wizards available to me on my install:
[Screenshot: list of installed footprint generators]

And for the OP here is what the circular wizard looks like:

Now for the magic question… What version am I running? I found my install package in a testing folder (so it isn’t even a nightly) for testing enabling the action menu compiler flag. Here is my version info (the platform is actually Win10 but that is a known issue with Windows):

Application: kicad
Version: (5.0.0), release build
Libraries:
wxWidgets 3.0.3
libcurl/7.54.1 OpenSSL/1.0.2l zlib/1.2.11 libssh2/1.8.0 nghttp2/1.23.1 librtmp/2.3
Platform: Windows 8 (build 9200), 64-bit edition, 64 bit, Little endian, wxMSW
Build Info:
wxWidgets: 3.0.3 (wchar_t,wx containers,compatible with 2.8)
Boost: 1.60.0
OpenCASCADE Community Edition: 6.8.0
Curl: 7.54.1
Compiler: GCC 7.1.0 with C++ ABI 1011

Build settings:
USE_WX_GRAPHICS_CONTEXT=OFF
USE_WX_OVERLAY=OFF
KICAD_SCRIPTING=ON
KICAD_SCRIPTING_MODULES=ON
KICAD_SCRIPTING_WXPYTHON=ON
KICAD_SCRIPTING_ACTION_MENU=ON
BUILD_GITHUB_PLUGIN=ON
KICAD_USE_OCE=ON
KICAD_USE_OCC=OFF
KICAD_SPICE=ON


          CLI invocation if KiCAD commands      Cache   Translate Page   Web Page Cache   

Like @Rene_Poschl said, I I have had good results with svg2mod.

The mod file format is described quite nicely here https://www.compuphase.com/electronics/LibraryFileFormats.pdf (I am sure there is a proper Kicad reference, but this is the one I have been referring to recently). The new format is s-expression based, built from arcs/lines/circles and polygons, so it is similar to svg. Have a look at both in a text editor.

There is no Python abstraction to bitmap2component nor any plans for one AFAIK. However, I would have thought that you could cobble together a workflow.

Try something like this - a little Bash script - takes all bmp in a directory, converts to same named svg and then to footprints?

for f in *.bmp; do
    n=${f%.bmp}
    potrace  $f -s -o $n.svg
    svg2mod --input-file $n.svg --output-file $n
done

Enjoy!


          Database Administrator Junior (W2, Mountain View) - cPrime, Inc. - Mountain View, CA      Cache   Translate Page   Web Page Cache   
Mountain View (W2) Top 2-3 skills you look for when reviewing resume- Oracle experience (3-5 YOE) AWS/RWS is a plus Demonstrated Python experience 3-5 YOE...
From Dice - Sat, 21 Jul 2018 02:03:31 GMT - View all Mountain View, CA jobs
          CLI invocation if KiCAD commands      Cache   Translate Page   Web Page Cache   

Oh, great news on the plotting capabilities and integration with Python!

As for bitmap2component, I have been using PNGs actually, but it’s always worked upon import. Is there a python abstraction layer to b2c by chance? I’ve tried several of the converters from SVG and always got garbled library files.

What exactly is b2c converting images to? Is that some standard kind of format that I could create in another way?


          Kicad StepUp - recomend Materials?      Cache   Translate Page   Web Page Cache   

Here is the doc about Materials:


Wrl materials doc

I think it would be possible to add these materials to FC materials… I don’t know if materials can be loaded/saved in FC
About StepUp the list is coded inside the main python file. Just search for materials inside.
@kammutierspule
Would it be possible to have the doc in libre office format, for future additions? Thks


          CLI invocation if KiCAD commands      Cache   Translate Page   Web Page Cache   

There is no direct CLI invocation in Kicad to do this sort of thing, but PCBNew is scriptable and has a Python abstraction layer. This is in a state of flux at present and scripting is not available on all platforms/versions. However, on macOS you are in luck: both 4.0.7 and 5.0.1 support Python scripting. There is still a bit of finagling required to get this to work.

There is some useful material about Python scripting here (but bear in mind the comments about the current ongoing changes).

The following script will work on macOS and will generate gerbers. Save the script somewhere in your path or run it in situ. Change the path where you want the output directory (line 11) Invoke it from the command line with the path to a *.kicad_pcb file e.g.

./plotter.py ~/EDA_Workspace/Projects/Thermocouple_datalogger/Ver_5V/ThermocoupleLogger.kicad_pcb

Script;

#!/Applications/Kicad/kicad.app/Contents/Applications/pcbnew.app/Contents/MacOS/Python

import sys
import os

sys.path.insert(0, "/Applications/Kicad/kicad.app/Contents/Frameworks/python/site-packages/")
import pcbnew
from pcbnew import *

file_name = sys.argv[1]
output_dir = os.path.expanduser('~/Desktop/gerbers')  # expand ~ so the path is usable below

try:
    os.makedirs(output_dir)
except OSError:
    pass  # directory already exists

board = pcbnew.LoadBoard(file_name)
pctl = pcbnew.PLOT_CONTROLLER(board)
popt = pctl.GetPlotOptions()
popt.SetOutputDirectory(output_dir)
popt.SetPlotFrameRef(False)
popt.SetLineWidth(pcbnew.FromMM(0.1))

popt.SetAutoScale(False)
popt.SetScale(1)
popt.SetMirror(False)

popt.SetUseGerberAttributes(True)
popt.SetUseGerberProtelExtensions(True)

popt.SetExcludeEdgeLayer(True)
popt.SetUseAuxOrigin(False)
pctl.SetColorMode(True)

popt.SetSubtractMaskFromSilk(False)
popt.SetPlotReference(True)
popt.SetPlotValue(False)

layers = [
("F.Cu", pcbnew.F_Cu, "Top layer"),
("B.Cu", pcbnew.B_Cu, "Bottom layer"),
("F.Paste", pcbnew.F_Paste, "Paste top"),
("B.Paste", pcbnew.B_Paste, "Paste bottom"),
("F.SilkS", pcbnew.F_SilkS, "Silk top"),
("B.SilkS", pcbnew.B_SilkS, "Silk top"),
("F.Mask", pcbnew.F_Mask, "Mask top"),
("B.Mask", pcbnew.B_Mask, "Mask bottom"),
("Edge.Cuts", pcbnew.Edge_Cuts, "Edges"),
]

for layer_info in layers:
    pctl.SetLayer(layer_info[1])
    pctl.OpenPlotfile(layer_info[0], pcbnew.PLOT_FORMAT_GERBER, layer_info[2])
    pctl.PlotLayer()

pctl.ClosePlot()

bitmap2component will only (AFAIK) take bitmap (bmp) images and convert them. You would have to convert from jpg to BMP first. The library format converts to polylines. I think you could probably do this with Inkscape but haven’t tried.

EDIT Further thought - you could try using a workflow based on bmp -> svg using the excellent potrace utility [http://potrace.sourceforge.net] and then converting your svg to a footprint using svg2mod script https://github.com/mtl/svg2mod.

It should be easy enough to tie those together with a bit of bash fu if you have a lot of bitmaps to do.
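If shell scripting is not your thing, the same batch conversion could also be driven from Python. This is only a sketch: it assumes potrace and svg2mod are installed and on the PATH, and it mirrors the flags used in the bash loop earlier in the thread.

# Batch-convert every .bmp in a folder to .svg (potrace) and then to a footprint (svg2mod).
import pathlib
import subprocess

def bitmaps_to_footprints(folder="."):
    for bmp in pathlib.Path(folder).glob("*.bmp"):
        svg = bmp.with_suffix(".svg")
        subprocess.run(["potrace", str(bmp), "-s", "-o", str(svg)], check=True)
        subprocess.run(["svg2mod", "--input-file", str(svg),
                        "--output-file", str(bmp.with_suffix(""))], check=True)

if __name__ == "__main__":
    bitmaps_to_footprints()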


          Programming Assignment Help Online UK -Myassignmenthelp.com      Cache   Translate Page   Web Page Cache   
Programming Assignment Help is the online programming help service provided to the students in UK by expert coders. We provide Programming Homework help service in Java, Python, C, C#, Php, C++, Databases, SQL, HTML, Android or iOS. Get Coding help from the best programming assignment help website in UK. Quick & Best computer science programming help service provider. https://myassignmenthelp.com/uk/programming-language-assignment-help.html tel:+44-121-285-4112
          python精品量化投资课程(1)      Cache   Translate Page   Web Page Cache   
The market has not been great these days; we have to find ways to get performance back up. There are no bad markets, only bad mindsets. Days of fire and ice are not easy to get through, but they are also exciting. I miss the days of steady, stable happiness. Sharing a set of Python quantitative investing materials.
          python script fix      Cache   Translate Page   Web Page Cache   
fix python script bug. script does not go to next page to scrape. (Budget: $10 - $30 CAD, Jobs: Python, Web Scraping)
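Without seeing the script, the usual cause of "does not go to next page" is that the crawler never follows the site's "next" link. A hedged sketch of that loop; the rel="next" selector is an assumption about the site's markup:

# Keep following the page's "next" link until there is none.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def crawl_all_pages(start_url):
    url = start_url
    while url:
        soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
        yield soup                                   # scrape items from this page here
        nxt = soup.select_one('a[rel="next"]')       # or the site's "next" button class
        url = urljoin(url, nxt["href"]) if nxt else None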
          Re: UK vs USA for life      Cache   Translate Page   Web Page Cache   

well there's his next topic rite there. should he date drunk American girls or move to England and learn to program in monty python?


          Développeur Python - Alteo Recrutement Informatique - Montréal, QC      Cache   Translate Page   Web Page Cache   
Alteo is looking for a Python Developer for a permanent position based in downtown Montréal. DEC / bachelor's degree in computer science, software engineering or ...
From Alteo Recrutement Informatique - Tue, 07 Aug 2018 00:48:42 GMT - View all Montréal, QC jobs
          Proofpoint launches TAP Isolation, threat detection, and Threat Response python scripting      Cache   Translate Page   Web Page Cache   

Proofpoint announced three people-centric security innovations, including Targeted Attack Protection (TAP) Isolation for personal webmail and browsing defense, threat detection, and new Threat Response python scripting. “Exceptional effectiveness in threat protection requires constant innovation—and today Proofpoint has increased its lead,” said Ryan Kalember, senior vice president of Cybersecurity Strategy for Proofpoint. “Technologically, these people-centric innovations are an important step forward for Proofpoint and our customers. We help security teams proactively defend their organizations from today’s … More

The post Proofpoint launches TAP Isolation, threat detection, and Threat Response python scripting appeared first on Help Net Security.


          Python Development Internship - Fairfest Media Limited - Kolkata, West Bengal      Cache   Translate Page   Web Page Cache   
Established in 1989, Fairfest organizes the reputed TTF and OTM international travel fairs, as well as Municipalika, a leading event on municipal management and... ₹10,000 - ₹15,000 a month
From Internshala - Fri, 03 Aug 2018 02:25:21 GMT - View all Kolkata, West Bengal jobs
          Cloud Solution Architect - Microsoft - Philadelphia, PA      Cache   Translate Page   Web Page Cache   
Machine Learning (SAS, R, Python). Problem-solving mentality leveraging internal and/or external resources....
From Microsoft - Tue, 17 Apr 2018 18:34:17 GMT - View all Philadelphia, PA jobs
          Senior Data Analyst - William E. Wecker Associates, Inc. - Jackson, WY      Cache   Translate Page   Web Page Cache   
Experience in data analysis and strong computer skills (we use SAS, Stata, R and S-Plus, Python, Perl, Mathematica, and other scientific packages, and standard...
From William E. Wecker Associates, Inc. - Sat, 23 Jun 2018 06:13:20 GMT - View all Jackson, WY jobs
          Distilled News      Cache   Translate Page   Web Page Cache   
Ultimate guide to handle Big Datasets for Machine Learning using Dask (in Python) Have you ever tried working with a …
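As a taste of what the linked guide covers, here is a minimal Dask sketch; the file pattern and column names are placeholders, not taken from the article:

# Pandas-like, out-of-core work with Dask; nothing executes until .compute().
import dask.dataframe as dd

df = dd.read_csv("data-*.csv")              # lazily treats many files as one dataframe
result = df.groupby("label")["value"].mean()
print(result.compute())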



          C++ Python Software Developers (PySide2 Qt Mission Systems)      Cache   Translate Page   Web Page Cache   
VA-Dulles, C++ Python Software Developers (PySide2 Qt Mission Systems) Aerospace Dulles, VA (6 mos open contract) *OOP C++ Python PySide2 GUI Graphic User Interface Qt JavaScript JQuery Angular HTML5 CSS3 Node.js REST RESTful APIs Make CMake Visual Studio OpenSceneGraph SQL DoD Aerospace Spacecraft Ground Systems Mission Systems UAVs* Please send your updated resume to: Bob.Russ@InSourceTechnical.com ASAP befo
          Take a SWIG out of the Gesture Recognition Toolkit (GRT)      Cache   Translate Page   Web Page Cache   
A Python binding for the Gesture Recognition Toolkit (GRT) using SWIG.
          Python API Developer      Cache   Translate Page   Web Page Cache   
NY-NEW YORK CITY, A leading global investment firm, managing a wide range of investment funds worldwide across multiple asset classes is seeking a Python API Developer to join their team in New York. This firm encourages an entrepreneurial spirit and for people to strive for success. You will be empowered by top leaders in the industry to do your best work and propel your career forward. This candidate will join a
          CUTTING SNAKE EGGS AND FOUND A DEAD SNAKE :( | BRIAN BARCZYK      Cache   Translate Page   Web Page Cache   

I'm cutting some new Ball python clutches and found a cool dual-fathered clutch as well as a beautiful baby snake that unfortunately didn't make it.
SUBSCRIBE TO BRIAN BARCZYK ▶
Watch More Surprise! Snake Eggs, Gecko Eggs, Reptile Eggs, Hatching Eggs and Stealing Eggs | BRIAN BARCZYK:

BE APART OF BUILDING THE REPTARIUM - MY PATREON PAGE:
MASTER SOCIAL MEDIA - VIRAL DUNK E-COURSE:

SUBSCRIBE TO MY VLOG CHANNEL ▶

WATCH MY VLOGS!
2018 VLOGS ▶
2017 VLOGS ▶
2016 VLOGS ▶

BEST OF THE BEST PLAYLIST!! WATCH NOW ▶

***********I LOVE TO HEAR FROM YOU***********
FAN MAIL ▶ P.O. Box 182306 Shelby Township, MI 48318
BADCHOICENOAH ▶
↑↑↑↑↑↑FOLLOW NOAH↑↑↑↑↑↑

***********BUY MERCH***********
T-SHIRTS & THERMALS:
HATS & BEANIES:
BRIAN IN THE WILD DVD & BLU-RAY:
↑↑↑↑↑↑CHECK OUT MY DOCUMENTARY↑↑↑↑↑↑

★ FOLLOW ME ON SOCIAL MEDIA ★
Facebook ▶
Twitter ▶
Instagram ▶

★ FOLLOW BHB REPTILES ON SOCIAL MEDIA ★
Instagram ▶
Twitter ▶
Facebook ▶
Website ▶

★ FOLLOW REPTILE PRIME ON SOCIAL MEDIA ★
Facebook ▶
Instagram ▶
Website ▶

About Brian Barczyk:
Hey, I'm Brian Barczyk from SnakeBytesTV, AnimalBytesTV and Discovery channel's series Venom Hunters. Follow the Barczyk family and I as we share our lives as reptile breeders. We post daily vlogs each day at 9:00 AM (EST)! We are reptile breeders of snakes, geckos, blue-tongued skinks as well as all other animals. I also travel around the world doing amazing animal adventures. As far as snakes I breed corn snakes, milk snakes, king snakes, rat snakes, Ball pythons, Sand boas, Woma Pythons, Spotted Pythons, Children's pythons, Carpet pythons and more! Besides breeding, I have two dogs, Burmese pythons, Reticulated pythons, Tortoises, Lizards and an American alligator. Thank you for watching and don’t forget to check out my vlogs to go around the world with me!!

CUTTING SNAKE EGGS AND FOUND A DEAD SNAKE :( | BRIAN BARCZYK


Brian Barczyk


MUSIC: LOFI HIP HOP - DAYS LIKE THESE

Author: avatarpetnewsuk7721
Tags: brian barczyk snake ball python eggs bhb reptiles snake feeding cutting snake eggs found a dead snake snake eggs cutting eggs
Posted: 10 August 2018


          Around The World In One Hour      Cache   Translate Page   Web Page Cache   
Video: Around The World In One Hour
Watch This Video!
Studio: Global Video Pro
WORLD MONTAGE, One minute of various images from around the world. DEAD SEA, ISRAEL, Float on the Dead Sea. Eight times more salt than the ocean. Visitors from worldwide come to seek wellness from the water and healing black mud.
SNAKE CHARMER OF MALAYSIA, A dying breed, these snake charmers risk their lives to entertain audiences. frequently bitten by cobras and pit vipers, they still play a dangerous game! Have you had a 22 foot long python coiled around your body lately???
DIVE PHILIPPINES, the Philippines is known for its spectacular dive sites. Explore the beautiful undersea world around Cebu Island, teeming with a vast array of exotic sea creatures, caves and cliffs.... LAS VEGAS PREVIEW, tour of Las Vegas, aerials, casinos and Hoover Dam, etc.
HAWAII KAYAK ADVENTURE, paddle through the Big Islands ten tunnels high in the Kohala Mountains, by kayak. Some tunnels one mile long. The ultimate eco-tourism adventure!
ELEPHANT SHOW, THAILAND, see elephants perform amazing feats in Phuket, Thailand. Dancing, playing music, tricks, headstands, playing soccer and carrying boy with his head in the elephant's mouth. Daring stuff!

          [آموزش] دانلود Udemy Becoming a Software Tester - آموزش تبدیل شدن به یک تست کننده نرم افزار      Cache   Translate Page   Web Page Cache   

Download Udemy Becoming a Software Tester - training on becoming a software tester

Software testing is the process of evaluating software to make sure it behaves correctly under the various events it may encounter while in use; in other words, it is about finding a program's potential defects so that it works correctly and optimally throughout its use. The more gracefully the software can handle different events, both in terms of functionality and of user comfort, the better its overall quality can be expected to be. In recent years, astonishing statistics from the NIST (National Institute of Standards and ...


http://p30download.com/81102

Related posts:

Category: Download » Training » Programming and Web Design
Download link: http://p30download.com/fa/entry/81102


          [آموزش] دانلود Packt Application Development with Spring 5.0 and Angular 6 - آموزش توسعه اپلیکیشن با اسپرینگ 5 و آنگولار 6      Cache   Translate Page   Web Page Cache   

Download Packt Application Development with Spring 5.0 and Angular 6 - training on application development with Spring 5 and Angular 6

Spring is one of the best-known and most widely used Java frameworks. It is open source and was first released in 2003. Spring has changed a great deal since its first release and has become very powerful and popular. It is a powerful framework aimed primarily at enterprise applications; although it mostly targets Java EE, it can also be used outside EE environments. Contrary to a misconception common among many people, Spring is not just for building web applications; rather, Spring ...


http://p30download.com/81261

Related posts:

Category: Download » Training » Programming and Web Design
Download link: http://p30download.com/fa/entry/81261


          Python XML, JSON, and the Web      Cache   Translate Page   Web Page Cache   
Python XML, JSON, and the Web

Python: XML, JSON, and the Web
MP4 | Video: 720p | Duration: 1:38:54 | English | Subtitles: VTT | 278.4 MB

          Help me access Rest Response from Etrade API Java OR Functioning Access to Etrade API with Python      Cache   Translate Page   Web Page Cache   
I am building a Java Application that utilizes the Etrade API. I have set up the sandbox environment and have successfully retrieved a Rest response. I need someone to help me access the individual elements... (Budget: $30 - $250 USD, Jobs: Java, MySQL, RESTful, Software Architecture)
          Artificial Intelligence and Machine Learning      Cache   Translate Page   Web Page Cache   
Participants will learn to develop artificial intelligence (AI) applications to address real-world business problems using tools such as Python, ...
          Python Developer      Cache   Translate Page   Web Page Cache   
NY-New York, Python Developer Grab the opportunity to achieve your full potential! Eclaro is looking for a Python Developer for our client in New York, NY. Eclaro’s client is one of the world's largest financial institutions, committed to providing the tools and services that bridge the gap between customers and their goals. If you’re up to the challenge, then take a chance at this rewarding opportunity! Respo
          一个基于flask的博客程序-Simpleblog      Cache   Translate Page   Web Page Cache   
A simple blog app by flask. 

Introduction

This is a simple, multi-user social blogging site written with the Python Flask framework, modelled on Jianshu.

Link

Features

	• Register and log in
	• Publish articles
	• Like articles
	• Post, reply to and manage comments
	• Edit profile
	• Follow users
	• Notifications and private messages
	• Administrator functions

Local usage

Install the required libraries:
$ pip install -r requirements.txt
Update the database and create the roles and permissions:
$ manage.py deploy
Run the app:
$ manage.py runserver --host 0.0.0.0
Then open 127.0.0.1:5000 in a local browser.

Heroku deployment

1. Register a Heroku account.
2. Install the Heroku Toolbelt and log in to Heroku.
$ heroku login
Email: <youremail>
Password: <password>
3. Create the app; the app name must not already be taken.
$ heroku create <your appname>
4. Configure the database.
$ heroku addons:add heroku-postgresql:hobby-dev
5. Set your own environment variables, for example the administrator email address.
$ heroku config:set ADMINEMAIL=<adminemail>
6. To deploy on Heroku the code must be hosted in a Git repository. Once everything is committed, push the code to the heroku remote.
$ git push heroku master
7. Run the deploy command.
$ heroku run python manage.py deploy
$ heroku restart
When this succeeds, visit http://<youapp>.herokuapp.com/
8. If you find a bug while the app is running and need to change something, repeat the steps above and then run the upgrade commands:
$ heroku maintenance:on
$ git push heroku master
$ heroku run python manage.py deploy
$ heroku restart
$ heroku maintenance:off

Aliyun deployment

If the free Heroku tier feels too slow and you have your own server, you can deploy online yourself. Using an Aliyun deployment as the reference, with Flask + WSGI + Nginx, each page responded in under 0.3 s in my tests.
For the detailed deployment steps see this article: [article link]
from https://github.com/Blackyukun/Simpleblog

          博客程序:python-webapp      Cache   Translate Page   Web Page Cache   


python license
这是Python教程 - 廖雪峰的官方网站中的一个博客实战项目,供学习使用。

项目结构

python-webapp/           <-- 根目录
|
+- backup/ <-- 备份目录
|
+- conf/ <-- 配置文件
|
+- dist/ <-- 打包目录
|
+- www/ <-- Web目录,存放.py文件
| |
| +- static/ <-- 存放静态文件
| |
| +- templates/ <-- 存放模板文件
|
+- ios/ <-- 存放iOS App工程
|
+- LICENSE <-- LICENSE

运行

本地预览(请保证已安装MySQL --> Mac安装教程):
$ git clone https://github.com/WeiLeiming/python-webapp.git
$ cd python-webapp
$ pip3 install -r requirements.txt
$ cd www
$ mysql -u root -p < schema.sql
$ chmod +x pymonitor.py
$ ./pymonitor.py app.py
浏览器访问http://localhost:9000/

开发环境

  • Python 3.6.2
  • MySQL Community Server 5.7.19
  • 第三方库
    • aiohttp - Async http client/server framework (asyncio)
    • jinja2 - a template engine written in pure Python
    • aiomysql - aiomysql is a library for accessing a MySQL database from the asyncio
    • uikit — A lightweight and modular front-end framework for developing fast and powerful web interfaces
    • Vue.js — A progressive, incrementally-adoptable JavaScript framework for building UI on the web.
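To give a feel for how aiohttp and Jinja2 fit together in an app like this one, here is a tiny self-contained sketch (illustration only, not code from this repository; aiomysql would be wired in similarly wherever handlers need the database):

# Minimal aiohttp + Jinja2 sketch, independent of this project's actual code
from aiohttp import web
import jinja2

# An in-memory template keeps the example self-contained; the real app loads files from templates/
env = jinja2.Environment(loader=jinja2.DictLoader({'index.html': '<h1>{{ title }}</h1>'}))

async def index(request):
    html = env.get_template('index.html').render(title='Awesome Python Webapp')
    return web.Response(text=html, content_type='text/html')

app = web.Application()
app.router.add_get('/', index)

if __name__ == '__main__':
    web.run_app(app, host='127.0.0.1', port=9000)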

Development tools

Summary

User-facing pages:

  • Home page: GET /
  • Registration page: GET /register
  • Sign-in page: GET /signin
  • Blog post detail page: GET /blog/{id}

Management pages:

  • Comment list: GET /manage/comments
  • Blog post list: GET /manage/blogs
  • User list: GET /manage/users
  • Create a blog post: GET /manage/blogs/create
  • Edit a blog post: GET /manage/blogs/edit

Backend APIs (a client-side usage sketch follows this list):

  • Register a user: POST /api/users
  • Authenticate a user: POST /api/authenticate
  • Get users: GET /api/users
  • Sign out: GET /signout
  • Create a blog post: POST /api/blogs
  • Get a single blog post: GET /api/blogs/{id}
  • Get blog posts: GET /api/blogs
  • Edit a blog post: POST /api/blogs/{id}
  • Delete a blog post: POST /api/blogs/{id}/delete
  • Create a comment: POST /api/blogs/{id}/comments
  • Get comments: GET /api/comments
  • Delete a comment: POST /api/comments/{id}/delete
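For illustration, the APIs above could be exercised from a small script like the following. It uses the third-party requests library, and the JSON field names are guesses based on typical implementations of this tutorial project, so treat them as assumptions rather than the verified contract:

# Hypothetical client-side sketch for the backend APIs listed above
import requests

BASE = 'http://localhost:9000'

# Register a user via POST /api/users (field names are assumed, not verified)
resp = requests.post(BASE + '/api/users', json={
    'name': 'Alice',
    'email': 'alice@example.com',
    'passwd': 'not-a-real-password',  # some implementations expect a client-side hash here
})
print(resp.status_code, resp.text)

# List blog posts via GET /api/blogs
print(requests.get(BASE + '/api/blogs').json())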

References


from  https://github.com/WeiLeiming/python-webapp

          Massive python slithers up Australian man's home in shocking video      Cache   Translate Page   Web Page Cache   
A homeowner in Australia captured an unwanted surprise on camera, as a nearly 10-foot-long python slithered its way up the side of his home.
          Python Developer      Cache   Translate Page   Web Page Cache   
IL-Chicago, Chicago, Illinois Skills : Python, C++, Java, Development Description : • Development & design of application. • Create functional & technical documentation in line with the bank’s standards • Work with global teams • 5+ years of Financial industry IT experience in software development • 5+ years in computer programming using Python, Java, C++ • Writing server-side web application and develop back-e
          need help on python project for my site      Cache   Translate Page   Web Page Cache   
need help on python project for my site (Budget: ₹600 - ₹1500 INR, Jobs: Django, Python)
          PyKat 1.1.277      Cache   Translate Page   Web Page Cache   
Python interface and tools for FINESSE
          pulumi 0.15.1.dev1533867000      Cache   Translate Page   Web Page Cache   
Pulumi's Python SDK
          deprecation-factory 0.1.1      Cache   Translate Page   Web Page Cache   
Python deprecation factory ensuring useful warnings and docstrings for different deprecations.
          spotify 0.1.4      Cache   Translate Page   Web Page Cache   
spotify.py is an asynchronous API wrapper for Spotify written in Python.
          pymatgen 2018.8.10      Cache   Translate Page   Web Page Cache   
Python Materials Genomics is a robust materials analysis code that defines core object representations for structures and molecules with support for many electronic structure codes. It is currently the core analysis code powering the Materials Project (https://www.materialsproject.org).
          Principles of Regression Trees and a Python Implementation      Cache   Translate Page   Web Page Cache   

We explain what a regression tree is in plain language rather than with long stretches of mathematical formulas (a tiny runnable sketch follows below).

"Principles of Regression Trees and a Python Implementation" was first published in the Articles section of 伯乐在线 (Jobbole).
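The core idea is simple: a regression tree repeatedly splits the data on the feature and threshold that most reduce the squared error, and each leaf predicts the mean of the samples that land in it. Here is a tiny runnable illustration using scikit-learn; it is not the article's own from-scratch implementation:

# Illustrative only - the article builds its own tree; this uses scikit-learn for brevity
import numpy as np
from sklearn.tree import DecisionTreeRegressor

X = np.linspace(0, 6, 60).reshape(-1, 1)   # one feature
y = np.sin(X).ravel()                      # continuous target

# Splits greedily minimize squared error; each leaf predicts the mean of its samples
tree = DecisionTreeRegressor(max_depth=3).fit(X, y)
print(tree.predict([[1.5], [4.0]]))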


          haro 2018.8.9      Cache   Translate Page   Web Page Cache   
Haro.ai Python Library
          python-espncricinfo 0.4.1      Cache   Translate Page   Web Page Cache   
ESPNCricInfo API client
          mrworkserver 0.3      Cache   Translate Page   Web Page Cache   
A python work server written in C
          mrworkserver 0.2      Cache   Translate Page   Web Page Cache   
A python work server written in C
          Senior Full Stack Developer - UpTop - Seattle, WA      Cache   Translate Page   Web Page Cache   
C/C++, C#, Coldfusion, GO, Java, JavaScript, PHP, Python, Objective-C, Perl, Ruby, Scala, or Swift. The breadth of work spans from legacy hosting of projects...
From Indeed - Mon, 16 Apr 2018 18:56:24 GMT - View all Seattle, WA jobs
          Cyber Engineer - Foreground Security - Dulles, VA      Cache   Translate Page   Web Page Cache   
Java, Swing, Hibernate, Struts, JUnit, Perl, Ruby, python, HTML, C, C++, .NET, ColdFusion, Adobe, Assembly language, etc. Raytheon is supporting a U.S....
From Foreground Security - Wed, 01 Aug 2018 20:00:45 GMT - View all Dulles, VA jobs
          Cyber Engineer - TS/SCI Required - Talent Savant - Dulles, VA      Cache   Translate Page   Web Page Cache   
Java, Swing, Hibernate, Struts, JUnit, Perl, Ruby, Python, HTML, C, C++, .NET, ColdFusion, Adobe, Assembly language, etc....
From Talent Savant - Fri, 27 Jul 2018 06:03:39 GMT - View all Dulles, VA jobs
          Senior Cyber Engineer - TS/SCI Required - Talent Savant - Dulles, VA      Cache   Translate Page   Web Page Cache   
Java, Swing, Hibernate, Struts, JUnit, Perl, Ruby, Python, HTML, C, C++, .NET, ColdFusion, Adobe, Assembly language, etc. Senior Cyber Engineer....
From Talent Savant - Fri, 27 Jul 2018 05:57:09 GMT - View all Dulles, VA jobs
          Cyber Engineer - IntellecTechs, Inc. - Dulles, VA      Cache   Translate Page   Web Page Cache   
Java, Swing, Hibernate, Struts, JUnit, Perl, Ruby, Python, HTML, C, C++, .NET, ColdFusion, Adobe, Assembly language, etc....
From Indeed - Tue, 15 May 2018 17:16:54 GMT - View all Dulles, VA jobs
          Sr. Cyber Engineer - Raytheon - Dulles, VA      Cache   Translate Page   Web Page Cache   
Java, Swing, Hibernate, Struts, JUnit, Perl, Ruby, python, HTML, C, C++, .NET, ColdFusion, Adobe, Assembly language, etc. Raytheon is supporting a U.S....
From Raytheon - Thu, 19 Apr 2018 12:33:21 GMT - View all Dulles, VA jobs
          Cyber Engineer - Raytheon - Dulles, VA      Cache   Translate Page   Web Page Cache   
Java, Swing, Hibernate, Struts, JUnit, Perl, Ruby, python, HTML, C, C++, .NET, ColdFusion, Adobe, Assembly language, etc. Raytheon is supporting a U.S....
From Raytheon - Thu, 19 Apr 2018 12:33:20 GMT - View all Dulles, VA jobs
          Cyber Security Engineer - ProSOL Associates - Sterling, VA      Cache   Translate Page   Web Page Cache   
Java, Swing, Hibernate, Struts, JUnit, Perl, Ruby, python, HTML, C, C++, .NET, ColdFusion, Adobe, Assembly language, etc. ProSol is supporting a U.S....
From ProSOL Associates - Thu, 09 Aug 2018 03:12:29 GMT - View all Sterling, VA jobs
          Database Administrator Junior (W2, Mountain View) - cPrime, Inc. - Mountain View, CA      Cache   Translate Page   Web Page Cache   
Mountain View (W2) Top 2-3 skills you look for when reviewing resume- Oracle experience (3-5 YOE) AWS/RWS is a plus Demonstrated Python experience 3-5 YOE...
From Dice - Sat, 21 Jul 2018 02:03:31 GMT - View all Mountain View, CA jobs
          Celebration Presents THE INCREDIBLE LIFE AND WONDROUS ADVENTURES OF THE AMAZING FABULOUS FRED      Cache   Translate Page   Web Page Cache   

Celebration Theatre presents, in association with The Alliance of Los Angeles Playwrights, as part of its New Works Reading Series, The Incredible Life and Wondrous Adventures of the Amazing Fabulous Fred by Dan Berkowitz, directed by Alan Wethern and performing one night only on Tuesday, August 14 at 7:30pm at Celebration Theatre @ The Lex, 6760 Lexington Ave. in Hollywood.

THE INCREDIBLE LIFE AND WONDROUS ADVENTURES OF THE AMAZING FABULOUS FRED follows Fred, a gay insurance salesman and world-class hypochondriac. But under his nerdy exterior beats the heart of a fabulist, with fantasies of flying secretly to England to meet his real mother, the Queen... joining a Mexican cult which celebrates vegetables... and cross-dressing so no one notices, by wearing his evening gown under his suit.

When Fred is diagnosed with a disease so rare only other hypochondriacs have heard of it, he must confront a surreal health care system, skeptical friends, and lovers. After an excruciating death, Fred returns from The Other Side to let us know that love is what really drives the universe. It's comedy, it's drama, it's magical realism, it's vaudeville... and a little Monty Python too...

ABOUT THE CREATIVE TEAM AND CAST

Dan Berkowitz (Playwright) is a member of the Council of The Dramatists Guild of America, the professional association of playwrights, composers, and lyricists; and Co-Chair of The Alliance of Los Angeles Playwrights (ALAP). His writing for the stage has been produced off-Broadway, at major regionals, and in other venues across the US and Canada, and includes the popular revue A... My Name Is Still Alice, and There's No Place Like Hollywood!, nominated for LA's Ovation Award for Best Musical. Dan is a graduate of Princeton, did grad work in theatre at Berkeley, studied with famed acting teacher Stella Adler, and also produces, directs, and is a script and production consultant. His non-theatrical activities over the years have included serving as Co-Chair of the City of West Hollywood's Lesbian and Gay Advisory Board and President of The Tom of Finland Foundation. http://danberkowitz.com.

ALAN WETHERN (Director) Alan Wethern, a company member of Celebration Theatre, is proud to direct this reading for the stage, after spending the last few years producing both reality tv and indie films, alike. An avid theatre-goer, he enjoys staying active, volunteering, and finding time for a good book whenever possible. His recent production credits include: Cabaret, Priscilla, Queen of the Desert, Damaged Furniture, and W.P.G. The Oklahoma native is a graduate of OCU's MBA program, as well as The Los Angeles Film School immersion program.

NATHAN FRIZZELL (Producer) Since 2013, Nathan has been the Executive Producer of Celebrating New Works as Literary Director of Celebration Theatre. During that time, Celebration has aimed to give voice to new works by over 40 LGBTQ and Allied artists. As part of Celebration Theatre, Nathan has appeared onstage and produced numerous mainstage productions including Booty Candy, Die Mommie Die!, Priscilla, Queen of the Desert and the Ovation Award-winning West Coast premiere of The Boy From Oz. Last year, Nathan- along with Jay Marcus and Tom Detrinis- developed the world premieres of two smash hits So Long, Boulder Cityand Tilda Swinton Answers An Ad on Craigslist which sold out lengthy runs in Los Angeles before going Off-Broadway in New York. An actor, writer and producer, Nathan is proud to have been a Celebration Theatre company member since 2005. http://www.nathanfrizzell.com

The cast of THE INCREDIBLE LIFE AND WONDROUS ADVENTURES OF THE AMAZING FABULOUS FRED will feature: John Ireland, Peggy Etra, and Jamie Pierce. Isabella Way is Co-Producer.

PRICING Admission is free... with donations gratefully accepted. Celebrating New Works is produced by Celebration Theatre's Literary Director Nathan Frizzell. Celebration Theatre is headed by its Co-Artistic Directors Michael Matthews and Michael A. Shepperd, along with its Executive Director Michael C. Kricfalusi. This reading is produced in association with The Alliance of Los Angeles Playwrights. Please visit www.laplaywrights.orgfor more information.


          Kommentarer till SVT:s faktakoll är ett skämt värdigt Monty Python! av mansplaining      Cache   Translate Page   Web Page Cache   
Trodde kvinnor hade högre smärttröskel än män, i alla fall under vissa delar av menstruationscykeln. Köper vad du säger, i synnerhet om kvinnor i krig. Ah, och glöm inte testosteronskillnaden som när den kickar in som en turbo ger en extra växel och energi som kvinnor helt enkelt inte har. Även i fall när den fysiska styrkan är likvärdig så finns det hos mannen ännu en ytterligare växel. Exakt, med en kvinna i gruppen blir männen distraherade och risken att gruppen blir överfallen och dödade i krig ökar avsevärt. Kvinnan i gruppen riskerar bli tillfångatagen och gruppvåldtagen av fienden. Det kan försvarsmakten trycka in i sin nästa genuspamflett. Två saker ska man undvika om man inte vill ha gruppvåldtagna kvinnor. Det ena är att inte skicka ut dem i krig. Det andra är att inte importera våldtäktsmän. Feminister måste verkligen hata kvinnor som är beredd att offra kvinnor på alla fronter. Nej håller med, det går liksom inte ihop att kvinnor och barn skickas ut i krig så att männen kan sitta på sofflocket och knapra praliner, blogga om röda skor och backstabba varandra på Twitter i en oändlig svartsjuk intrig som utmynnar i ett verbalt ordkrig och mobbingdrev som tillstår en treåring. Å andra sidan finns det många feminiserade Twitter-män där ute som godhetssignalerar som tokiga kastrerade tuppar. Det du säger märks tydligt även på civila arbetsplatser som inte ens kräver någon direkt råstyrka. Inte bara hur män tappar fokus, vilket sker hela tiden eftersom kvinnor hela tiden anspelar på sin sexualitet för påkalla uppmärksamhet och erhålla bekräftelse för att dra fördel av sitt sexuella kapital, men inte bara det utan hur hela gruppdynamiken förändras så fort en kvinna kommer in i gruppen. Den effekten uppstår även om kvinnan är en fet hårig feminist iklädd endast en soppåse som matchar hens intellekt. Förändringen du nämner är knappast optimal i synnerhet inte i ett arbete som är förenat med livsfara, som militära eller polisiära operationer. De få polisiära insatser jag studerat där har poliskvinnorna i princip stått och tittat på medan männen går in och plockar ut den potentiellt farliga mannen. Först när männen redan brottat ner och fängslat personen kommer poliskvinnan och spelar kaxig och får en klapp på axeln av sin manliga kollega. Hon kände sig stolt i alla fall och det räknas väl kanske. Det är så det ser ut i verkligheten, å andra sidan kan även två storvuxna polismän ha vissa svårigheter att hålla fast en hysterisk och skogstokig tjej på ett säkert sätt. Kvinnor har, generellt sett, inte samma edge med fysisk och mental styrka men inte heller testosteron. Kvinnors generella mentala styrka är konfigurerad annorlunda än mäns och måste anses vara vida överlägsen mäns på andra plan. Kvinnor i krig är antagligen väldigt lämpade att sköta uppgifter som kräver simultanförmåga och som sista försvarslinje på hemmaplan. Det finns alltid några superkvinnor som överträffar de flesta män på alla plan men de är extremt få. Gaddafi hade kvinnliga krypskyttar och Mossad använder kvinnliga operatörer, fast de används mest i syfte att just distrahera och förleda fiendemän på motståndarsidan. Propagandafilmen som SVT köpte in med norska "specialstyrkor" bestående av kvinnor där orkade kvinnorna inte ens göra några enkla pullups ordentligt. Dessa kvinnor är kanske inget de borde skicka till Irak eller Afghanistan, menar även SAS fuckar upp det ibland och blir tillfångatagna och dödade, men konsekvenserna som kvinna att bli tillfångatagen bakom fiendens linjer kan bara sluta på ett sätt. 
Soyapojkarna från Söder är värre dock, så om man låg medvetslös i ett brinnande hus blir nog hellre bli hjälpt av inkvoterad kvinnlig nagelskulptris alla dagar i veckan framför en hen-figur som flyr lågorna med ett gällt skrik, iförd en lädermask och dildo i arslet. Jämställdhetssatsningen inom räddningsstjänsten är annars helt fantastisk. 2014 kunde man bland annat läsa att "En 27-årig man som 2003 dömdes för ett brutalt rån mot ett danskt tältande par, där kvinnan även gruppvåldtogs medan hennes pojkvän tvingades titta på, får nu anställning som brandman vid Räddningstjänsten Syd." Det här arma landet.
          C++ Python Software Developers (PySide2 Qt Mission Systems)      Cache   Translate Page   Web Page Cache   
VA-Dulles, C++ Python Software Developers (PySide2 Qt Mission Systems) Aerospace Dulles, VA (6 mos open contract) *OOP C++ Python PySide2 GUI Graphic User Interface Qt JavaScript JQuery Angular HTML5 CSS3 Node.js REST RESTful APIs Make CMake Visual Studio OpenSceneGraph SQL DoD Aerospace Spacecraft Ground Systems Mission Systems UAVs* Please send your updated resume to: Bob.Russ@InSourceTechnical.com ASAP befo
          (USA-TX-Irving) Tableau Developer      Cache   Translate Page   Web Page Cache   
**Tableau Developer in Irving, TX at Volt** # Date Posted: _8/9/2018_ # Job Snapshot + **Employee Type:** Contingent + **Location:** 2101 West John Carpenter Freeway Irving, TX + **Job Type:** Software Engineering + **Duration:** 12 weeks + **Date Posted:** 8/9/2018 + **Job ID:** 130965 + **Pay Rate** $0.0 - $32.45/Hour + **Contact Name** Volt Branch + **Phone** 919-782-7440 # Job Description Volt is working with a leading Insurance company to find motivated Tableau Developers in Irving, TX to create Tableau presentations based on discussing the needs and pain points for Business Leaders throughout this company’s enterprise. If you are interested in learning more about this position, please apply. **Are you a fit?** Do you have experience with technology development? Do you like learning about new businesses? Do you have experience/training in using data analysis and quantitative modeling techniques (e.g. statistical, optimization, demand forecasting, and simulation)? As a tableau developer, you will be creating and maintaining campaign data requirements and ad hoc databases, act as department data steward, and collaborate with IT and stakeholders to ensure continuity and consistency. # Assignment Generalities: + Work collaboratively with various business partners to develop common approaches to campaign evaluation and data collection/processing. + Develop interactive dashboards using Tableau, SQL and ETL tools to provide on-demand reporting, powerful visualizations and insights to senior leaders. + Automate data transfers and dashboard updates. + Perform proactive and ad hoc analyses ranging from identifying partner opportunities, evaluating success by assessing the contribution of other related functions which impact partnership performance. # Requirements: + Bachelor’s degree preferred (specialization in data science or quantitative field preferred) + Minimum of 3-5 years of experience in handling duties as detailed above + SQL skills are required, with experience working on at least one of Oracle, SQL server or Big Data. + Experience/training in using data analysis and quantitative modeling techniques (e.g. statistical, optimization, demand forecasting, and simulation) to answer business questions and to assess the added value of recommendations. + Experience developing dashboards and reporting using common data visualization tools within Tableau + Experience with data ETL (SQL, Alteryx, SAS) and coding using one or more statistical computer languages (R, Python, SAS.) to manipulate data and draw insights from large data sets. + Adept at presenting insights and analyses to any level of an organization. + Demonstrated ability to take the initiative, be self-driven, work across functional groups, build collaborative relationships and drive projects to closure. + Tableau Desktop Associate certification or equivalent QlikView Business Analyst certification is preferred # Volt is an equal opportunity employer. 
**Pay is based on experience.** In order to promote this harmony in the workplace and to obey the laws related to employment, Volt maintains a strong commitment to equal employment opportunity without unlawful regard to race, color, national origin, citizenship status, ancestry, religion (including religious dress and grooming practices), creed, sex (including pregnancy, childbirth, breastfeeding and related medical conditions), sexual orientation, gender identity, gender expression, marital or parental status, age, mental or physical disability, medical condition, genetic information, military or veteran status or any other category protected by applicable law.
          (USA-IL-Mossville) Hardware In Loop Test Engineer      Cache   Translate Page   Web Page Cache   
Hardware In Loop Test Engineer in Mossville, IL at Volt # Date Posted: _8/9/2018_ # Job Snapshot + **Employee Type:** Contingent + **Location:** 14009 North Old Galena Road Mossville, IL + **Job Type:** Software Testing + **Duration:** 100 weeks + **Date Posted:** 8/9/2018 + **Job ID:** 131199 + **Contact Name** Volt Branch + **Phone** 309/495-2062 # Job Description **Test Engineer.** This is a software testing and test infrastructure development position. It involves the development of test hardware, automation infrastructure, test scripts, and the testing for control and/or information product software for a variety Caterpillar products. Job Duties / Responsibilities include but not limited to: + - Working with process partners in defining new or enhanced test functionality for new electronic architecture + - Communicate results to team and process partners and assist with the resolution of any issues + - Develop and build new and innovative test system solutions + - Create scripts for automated software testing + - Execute automated testing and automate data analysis of test results **Minimum Required Qualifications:** + - Experience programming in ANSI C + - Strong communication, interpersonal, and collaboration skills + - Strong analytical skills + Work experience in at least 2 of the listed desired qualifications. The more the better. **Desired Qualifications:** + - Experience/knowledge with test automation + - Experience/knowledge of full-size or mid-size dSpace HIL test system or NI PXI HIL system in testing embedded code on ECM’s (ECU’s) + - Experience/knowledge of construction equipment + - Experience/knowledge of wiring schematics + - Experience working with datalinks (CAN, LAN, USB) + - Experience creating physics based models using Matlab/Simulink + - Experience developing machine motion relation control systems + - Experience using version control and product life cycle management tools + - Experience programming in Python. Volt is a publicly owned Corporation with shares trading on the over-the-counter market: http://www.otcmarkets.com under the trading symbol "VISI". To learn more about Volt, please visit: http://www.volt.com and to see more of our job postings, please visit: http://jobs.volt.com **_*Send your resume today for consideration!! We look forward to talking to you.*_** **Volt is an equal opportunity employer.** To learn more about us, please go to www.volt.com. In order to promote this harmony in the workplace and to obey the laws related to employment, Volt maintains a strong commitment to equal employment opportunity without unlawful regard to race, color, national origin, citizenship status, ancestry, religion (including religious dress and grooming practices), creed, sex (including pregnancy, childbirth, breastfeeding and related medical conditions), sexual orientation, gender identity, gender expression, marital or parental status, age, mental or physical disability, medical condition, genetic information, military or veteran status or any other category protected by applicable law.
          (USA-CA-Lake Forest) Sr. Systems Engineer      Cache   Translate Page   Web Page Cache   
**Sr.** Systems Engineer in Lake Forest, CA at Volt # Date Posted: _8/9/2018_ # Job Snapshot + **Employee Type:** Contingent + **Location:** 25892 Towne Centre Drive North Lake Forest, CA + **Job Type:** Computer Industry + **Duration:** 0 weeks + **Date Posted:** 8/9/2018 + **Job ID:** 131202 + **Pay Rate** $80000.0 - $100000.0/Year + **Contact Name** Volt Branch + **Phone** 760/710-3674 # Job Description This position requires senior skills in server technologies with extensive practical experience planning, implementing, verifying and troubleshooting local on-prem and cloud platform including Azure, VMware, Cisco and more. The candidate for this role should have at a minimum the following qualifications: + 3-5 years of experience managing Azure cloud technologies and current Microsoft certification + 7-10 years of experience as a Microsoft Systems Engineer preferably with an MCSE certification + Experience with VMWare + Experience with the implementation and/or operations of medium to large-scale enterprise cloud infrastructure supporting a software development team + Knowledge of cloud networking architecture, cloud operations, security, automation and orchestration + Experience with High Availability/Site Recovery design principles + Advanced Knowledge of MS Server, Active Directory, Exchange, Microsoft SQL, Office 365 support and migration + Well-rounded knowledge of storage solutions such as Dell Equallogic, Nimble, Compellent, ISCSI, Fiber-Channel, etc… + Supporting servers such as Dell PowerEdge or HP ProLiant + Well versed in Data Backup and Recovery tools + Comfortable managing Antivirus or similar Enterprise Software tools + Deep understanding of TCP/IP networks and related technologies + Experience implementing and managing network/system monitoring tools + Experience supporting and managing Cisco (WiFi) networks + Ability to independently troubleshoot Cisco routers, Cisco switches, ASA Firewalls, Cisco Wireless equipment at a CCNA level + Available for afterhours support and on-call rotation as needed Scoping, architecting and implementing new LAN/WAN and Cloud (MS/Azure) environments + Diagnosing, troubleshooting and correcting issues with Cisco, LAN/Wireless hardware (Meraki), and Azure PaaS/IaaS workloads + Capturing and analyzing packet traces; assessing OS issues; implements Azure service changes + Participates in developing and generating conceptual, logical and physical network architecture/designs including drawings and device configurations + Participates in implementing enterprise and Cloud based security mechanisms and policies + Some scripting and automation (Python, bash, Linux, open source tools). + Serves as a tier 2 escalation resource for resolution of enterprise issues + Installing and configuring Cisco switching: Workgroup, Campus and Data Center products + Installing and configuring Wi-Fi and working with vendors on Wi-Fi related issues + Installing, configuring Cisco switches/Routers (VSS, OSPF, BGP) + Working closely with Telco providers and Clients on WAN upgrades/implementations + Working closely with DevOPs an DBA on architectural changes to cloud PaaS/IaaS environments + Working closely with colleagues to meet team goals and improve process and practices + Serves as a tier 3 escalation resource for resolution of complex enterprise issues To learn more about Volt, please visit http://www.volt.com and to see more of our job postings, please visit: http://jobs.volt.com Please call 760-7 _10_ -3674 or email ndost@volt.com for any questions. 
**Volt is an Equal Opportunity Employer** In order to promote this harmony in the workplace and to obey the laws related to employment, Volt maintains a strong commitment to equal employment opportunity without unlawful regard to race, color, national origin, citizenship status, ancestry, religion (including religious dress and grooming practices), creed, sex (including pregnancy, childbirth, breastfeeding and related medical conditions), sexual orientation, gender identity, gender expression, marital or parental status, age, mental or physical disability, medical condition, genetic information, military or veteran status or any other category protected by applicable law.
          (USA-GA-Virtual Office GA 1.08) ETL Consultant-Network Solutions      Cache   Translate Page   Web Page Cache   
This ETL Developer (Engineer II) position in Network Business Intelligence - Engineering Applications (NBI-Apps) will be working in a fun, challenging, fast-paced environment to develop Extract-Transform-Load (ETL) processes which enable the engineering arm of Windstream to function more efficiently and effectively. We are looking for an ETL developer responsible for implementing the programmatic collection and consolidation of data from Windstream systems into an Engineering department RDBMS. Examples of the data categories included include network topology and performance, OSS, financial, parts/purchasing, and billing. All major vendors of RDBMS systems are in use at Windstream. Your primary focus will be development of Extract-Transform-Load logic using CloverETL, Python scripting, and Hadoop data integration and processing packages in addition to migration of legacy scripted solutions to these paradigms. Also included will be database development in DDL/DML (primarily Oracle), and operational support of the ETL software infrastructure and processes. On top of the Oracle database development skills and the ETL tool experience, experience with software development is essential. *_Job Responsibilities:_* * Development of Extract-Transform-Load logic using CloverETL and Python languages and systems to support business intelligence needs. * Migration of legacy scripted solutions to our newer ecosystem of tools (CloverETL, Python). * Database development in DDL and DML (primarily Oracle). * Building reusable code and libraries for future use. * Manage work through Agile tools/methodology, collaborative repositories, issue tracking platforms, and wikis. * Manage projects through to completion. * Effective communications in person and using JIRA, Confluence, email, and chat tools. * Effective collaboration in a dynamic team environment. * Independent project execution with minimal oversight. *_Essential Skills:_* * Extract-Transform-Load methodologies and patterns. * Oracle database development including SQL, DDL, and DML. * Javlin CloverETL development and deployment. Experience with comparable ETL tools (Informatica, Alteryx, MS DTS) will be considered. * Programming in the Bash and Python languages. Experience with comparable languages (Perl, TCL, NodeJS) will be considered. * Proficiency with code versioning tools, such as Git. * Data retrieval from files, web-based APIs, and RDBMS (Oracle, MySQL, MsSQL). * Experience working with large, disparate data sets. * Web Service technologies and APIs (REST, RPC, SOAP, etc.) * Data exchange formats: delimited, fixed-format, XML, JSON, and YAML. * Drive to succeed and improve personally, and in ability to add value to the role, team, and company. * Self-starter, relentlessly curious, resourceful, collaborative, and inventive. * Good team player and communicator. * Highly organized and meticulous. * Positive attitude and the desire to solve problems in elegant and creative ways. *_Desired Skills:_* * Apache Hadoop platform experience – Ambari, Pig, Hive, Hbase, Spark, etc. * Database warehousing and performance tuning experience helpful. * Java development experience. * Familiarity with command line operating systems and shells (Linux, Cisco IOS). * Network programming concepts: IPv4, sockets, SSL, port-forwarding. * Unix/Linux administration. * User experience with JIRA and Confluence. * Tableau visualization experience. 
Minimum Requirements: College degree in Engineering or a related field and 5-7 years professional level experience with 0-2 years supervisory experience for roles with supervision; or 9 years professional level related Engineering/Technical experience with 0-2 years supervisory experience for roles with supervision; or an equivalent combination of education and professional level related Engineering/Technical experience required. **Primary Location:** **US-Georgia-Virtual Office GA 1.08* **Job Category:** **Engineering* **EEO Statement:** **Employment at Windstream is subject to post offer, pre-employment drug testing. Equal Opportunity Employer including minority/female/disability/veteran; Without regard to** **Requisition ID:** *18002805*
          (USA-WI-Racine) Staff Associate - Software Engineering (Scripting) Role      Cache   Translate Page   Web Page Cache   
This position is for a junior developer engineer job role, under Service Delivery Design Engineering. The developer will be working directly with the Lead Engineers (LE) / Subject Matter Experts (SME) to create and develop solutions on various projects large and small. Developer to understand the work flow of all design services, deploy automation wherever identified. Also, responsible for developing and maintaining the Design Engineering portal. **Roles and Responsibilities:** • Strong working knowledge on scripting/Markup languages PERL, PHP, Python, Angular, ReactJS, JavaScript, jQuery, CSS, Bootstrap, MongoDB and MySQL. • Proficiency with software applications such as Word, Excel, PowerPoint, and Visio • Excellent verbal and written communications skills • Ability to think out of the box and come up with creative solutions • Ability to work in an Agile development environment model • Candidate must be proficient at documenting technical requirements including software requirements and process flows. • Ability to take ownership of the tasks and deliver • Freshers or preferably with Internship experience **Key Competencies and Skills:** Key Competencies and Skills: • Strong working knowledge on scripting/Markup languages PERL, PHP, Python, Angular, ReactJS, JavaScript, jQuery, CSS, Bootstrap, MongoDB and MySQL. • Proficiency with software applications such as Word, Excel, PowerPoint, and Visio • Excellent verbal and written communications skills • Ability to think out of the box and come up with creative solutions • Ability to work in an Agile development environment model • Candidate must be proficient at documenting technical requirements including software requirements and process flows. • Ability to take ownership of the tasks and deliver • Freshers or preferably with Internship experience Desirable Technical skills • Knowledge on basic networking Soft skills • Strong written & verbal communications, presentation & customer interaction skills is required • Should be well organized, detail oriented and is flexible to help meet business requirements • Innovative with capability to adapt to new environments and drive changes successfully • Candidate shall be result oriented and is self-driven for delivering projects independently • Customer focused, presentation skills; strong oral and written communication skills • A Strong team player, analytical thinker & problem solver and delivers with speed. • Ability to work under pressure, to deadlines in a fast pace changing environment • Organized, detail oriented and flexible • Person capable to keep focus on result and take ownership of assigned projects **Education and Qualifications:** Bachelor’s Degree in Computer Sciences or equivalent degree
          (USA-MD-Columbia) Software Engineer/Developer      Cache   Translate Page   Web Page Cache   
This position is for a senior software developer to design and implement SDN (Software Defined Networking) and NFV (Network Function Virtualization) components and interfaces for AT&T Global Public Sector customers in order to intelligently scale network services and enable rapid deployment of advanced cutting-edge features and services, and provide industry leading security, performance and reliability. The network services will be built on the Open Network Automation Platform (ONAP), an open source initiative based on the AT&T ECOMP (Enhanced Control, Orchestration, Management & Policy) project as well as the Open-O (Open Orchestrator) project to bring the capabilities for designing, creating, orchestrating and handling of the full lifecycle management of Virtual Network Functions, Software Defined Networks, and the services that all of these pieces enable. As a developer on an agile Scrum team, you will contribute to technical analysis of AT&T Global Public Sector customer network initiatives, specify detailed requirements and design for the impacted ONAP components, including (a) real time inventory of virtual network resources, (b) orchestration, control and activation of virtual network functions and services, (c) application control and policy management for closed loop control, and (d) network data collection and analytics, implement, and deploy these services onto the customer network. **Required Skills, Certification, Experience, and Education** + 5+ years of software development experience + Proficiency with one or more programming languages such as Python, Java, Javascript, C/C++ Experience developing and interfacing with RESTful APIs + Ability to work in a Linux environment including shell scripting. + Knowledge of basic networking principles such as the IP protocol suite (e.g. TCP, UDP, ICMP, etc.) + Exposure to OpenStack, Docker, Kubernetes and containers + Exposure to virtualization, SDN, Cloud technologies such as TOSCA, YANG, NETCONF, OpenStack, VMware, OpenDaylight + Bachelor’s degree or higher in Computer Science or related technical disciplines **Required Clearance:** Must be a US Citizen and possess Secret Clearance **Desired:** + Experience with NFV and experience in production/operationalization of VNF’s/onboarding on NFVi (Network Function Virtualization infrastructure) platforms. + Exposure to ONAP or AT&T ECOMP platforms with basic understanding of the major components including: Multi-VIM/Cloud, SDN Controller, Application Controller, Virtual Function Controller, A&AI, DCAE/Correlation engine, Service Orchestration, and the ability to apply “policy” across these components in the delivery of a service + Experience working with vCPE (virtual CPE) solutions (e.g. AT&T Flexware) for managed services and software-defined WAN + Some exposure to a commercial network automation platform such as Ciena Blue Planet or Cisco NSO (Network Services Orchestrator) + Experience with automation technologies such as Ansible, Puppet, Chef, Jenkins. + Experience working with open source communities and software AT&T is an Affirmative Action/Equal Opportunity Employer and we are committed to hiring a diverse and talented workforce. EOE/AA/M/F/D/V
          (USA-TX-Plano) Advertising & Analytics - Principal Data Scientist (AdCo)      Cache   Translate Page   Web Page Cache   
The Data Scientist will be responsible for designing and implementing processes and layouts for complex, large- scale data sets used for modeling, data mining, and research purposes. The purpose of this role is to conceptualize, prototype, design, develop and implement large scale big data science solutions in the cloud and on premises, in close collaboration with product development teams, data engineers and cloud enterprise teams. Competencies in implementing common and new machine learning, text mining and other data science driven solutions on cloud based technologies such as AWS are required. The data scientist will be knowledgeable and skilled in the emerging data science trends and must be able to provide technical guidance to the other data scientists in implementing emerging and advanced techniques. The data scientist must also be able to work closely with the product and business teams to conceptualize appropriate data science models and methods that meet the requirements. Key Roles and Responsibilities + Uses known and emerging techniques and methods in data science (including statistical, machine learning, deep learning, text and language analytics and visualization) in big data and cloud based technologies to conceptualize, prototype, design, code, test, validate and tune data science centric solutions to address business and product requirements + Conceptualizes data science enablers required for supporting future product features based on business and product roadmaps, and guides cross functional teams in prototyping and validating these enablers + Mentors and guides other data scientists + Uses a wide range of existing and new data science and machine learning tools and methods as required to solve the problem on hand. Skilled in frameworks and libraries using but not limited to R, python, spark, scala, pig, hive, mllib, mxnet, tensorflow, keras, theanos etc. + Is aware of industry trends an collaborates with the platform and engineering teams to update the data science development stack for competitive advantage + Collaborates with third party data science capability vendors and provides appropriate recommendations to the product development teams + Works in a highly agile environment **Experience** Typically requires 10 or more years experience or PhD in an approved field with a minimum of 6 years of relevant experience. **Education** Preferred Masters of Science in Computer Science, Math or Scientific Computing; Data Analytics, Machine Learning or Business Analyst nanodegree; or equivalent experience.
          (USA-DC-Washington) Software Engineer/Developer      Cache   Translate Page   Web Page Cache   
This position is for a senior software developer to design and implement SDN (Software Defined Networking) and NFV (Network Function Virtualization) components and interfaces for AT&T Global Public Sector customers in order to intelligently scale network services and enable rapid deployment of advanced cutting-edge features and services, and provide industry leading security, performance and reliability. The network services will be built on the Open Network Automation Platform (ONAP), an open source initiative based on the AT&T ECOMP (Enhanced Control, Orchestration, Management & Policy) project as well as the Open-O (Open Orchestrator) project to bring the capabilities for designing, creating, orchestrating and handling of the full lifecycle management of Virtual Network Functions, Software Defined Networks, and the services that all of these pieces enable. As a developer on an agile Scrum team, you will contribute to technical analysis of AT&T Global Public Sector customer network initiatives, specify detailed requirements and design for the impacted ONAP components, including (a) real time inventory of virtual network resources, (b) orchestration, control and activation of virtual network functions and services, (c) application control and policy management for closed loop control, and (d) network data collection and analytics, implement, and deploy these services onto the customer network. **Required Skills, Certification, Experience, and Education** + 5+ years of software development experience + Proficiency with one or more programming languages such as Python, Java, Javascript, C/C++ Experience developing and interfacing with RESTful APIs + Ability to work in a Linux environment including shell scripting. + Knowledge of basic networking principles such as the IP protocol suite (e.g. TCP, UDP, ICMP, etc.) + Exposure to OpenStack, Docker, Kubernetes and containers + Exposure to virtualization, SDN, Cloud technologies such as TOSCA, YANG, NETCONF, OpenStack, VMware, OpenDaylight + Bachelor’s degree or higher in Computer Science or related technical disciplines **Required Clearance:** Must be a US Citizen and possess Secret Clearance **Desired:** + Experience with NFV and experience in production/operationalization of VNF’s/onboarding on NFVi (Network Function Virtualization infrastructure) platforms. + Exposure to ONAP or AT&T ECOMP platforms with basic understanding of the major components including: Multi-VIM/Cloud, SDN Controller, Application Controller, Virtual Function Controller, A&AI, DCAE/Correlation engine, Service Orchestration, and the ability to apply “policy” across these components in the delivery of a service + Experience working with vCPE (virtual CPE) solutions (e.g. AT&T Flexware) for managed services and software-defined WAN + Some exposure to a commercial network automation platform such as Ciena Blue Planet or Cisco NSO (Network Services Orchestrator) + Experience with automation technologies such as Ansible, Puppet, Chef, Jenkins. + Experience working with open source communities and software AT&T is an Affirmative Action/Equal Opportunity Employer and we are committed to hiring a diverse and talented workforce. EOE/AA/M/F/D/V
          (USA-VA-Vienna) Software Engineer/Developer      Cache   Translate Page   Web Page Cache   
This position is for a senior software developer to design and implement SDN (Software Defined Networking) and NFV (Network Function Virtualization) components and interfaces for AT&T Global Public Sector customers in order to intelligently scale network services and enable rapid deployment of advanced cutting-edge features and services, and provide industry leading security, performance and reliability. The network services will be built on the Open Network Automation Platform (ONAP), an open source initiative based on the AT&T ECOMP (Enhanced Control, Orchestration, Management & Policy) project as well as the Open-O (Open Orchestrator) project to bring the capabilities for designing, creating, orchestrating and handling of the full lifecycle management of Virtual Network Functions, Software Defined Networks, and the services that all of these pieces enable. As a developer on an agile Scrum team, you will contribute to technical analysis of AT&T Global Public Sector customer network initiatives, specify detailed requirements and design for the impacted ONAP components, including (a) real time inventory of virtual network resources, (b) orchestration, control and activation of virtual network functions and services, (c) application control and policy management for closed loop control, and (d) network data collection and analytics, implement, and deploy these services onto the customer network. **Required Skills, Certification, Experience, and Education** + 5+ years of software development experience + Proficiency with one or more programming languages such as Python, Java, Javascript, C/C++ Experience developing and interfacing with RESTful APIs + Ability to work in a Linux environment including shell scripting. + Knowledge of basic networking principles such as the IP protocol suite (e.g. TCP, UDP, ICMP, etc.) + Exposure to OpenStack, Docker, Kubernetes and containers + Exposure to virtualization, SDN, Cloud technologies such as TOSCA, YANG, NETCONF, OpenStack, VMware, OpenDaylight + Bachelor’s degree or higher in Computer Science or related technical disciplines **Required Clearance:** Must be a US Citizen and possess Secret Clearance **Desired:** + Experience with NFV and experience in production/operationalization of VNF’s/onboarding on NFVi (Network Function Virtualization infrastructure) platforms. + Exposure to ONAP or AT&T ECOMP platforms with basic understanding of the major components including: Multi-VIM/Cloud, SDN Controller, Application Controller, Virtual Function Controller, A&AI, DCAE/Correlation engine, Service Orchestration, and the ability to apply “policy” across these components in the delivery of a service + Experience working with vCPE (virtual CPE) solutions (e.g. AT&T Flexware) for managed services and software-defined WAN + Some exposure to a commercial network automation platform such as Ciena Blue Planet or Cisco NSO (Network Services Orchestrator) + Experience with automation technologies such as Ansible, Puppet, Chef, Jenkins. + Experience working with open source communities and software AT&T is an Affirmative Action/Equal Opportunity Employer and we are committed to hiring a diverse and talented workforce. EOE/AA/M/F/D/V
          (USA-NJ-Middletown) Software Engineer/Developer      Cache   Translate Page   Web Page Cache   
This position is for a senior software developer to design and implement SDN (Software Defined Networking) and NFV (Network Function Virtualization) components and interfaces for AT&T Global Public Sector customers in order to intelligently scale network services and enable rapid deployment of advanced cutting-edge features and services, and provide industry leading security, performance and reliability. The network services will be built on the Open Network Automation Platform (ONAP), an open source initiative based on the AT&T ECOMP (Enhanced Control, Orchestration, Management & Policy) project as well as the Open-O (Open Orchestrator) project to bring the capabilities for designing, creating, orchestrating and handling of the full lifecycle management of Virtual Network Functions, Software Defined Networks, and the services that all of these pieces enable. As a developer on an agile Scrum team, you will contribute to technical analysis of AT&T Global Public Sector customer network initiatives, specify detailed requirements and design for the impacted ONAP components, including (a) real time inventory of virtual network resources, (b) orchestration, control and activation of virtual network functions and services, (c) application control and policy management for closed loop control, and (d) network data collection and analytics, implement, and deploy these services onto the customer network. **Required Skills, Certification, Experience, and Education** + 5+ years of software development experience + Proficiency with one or more programming languages such as Python, Java, Javascript, C/C++ Experience developing and interfacing with RESTful APIs + Ability to work in a Linux environment including shell scripting. + Knowledge of basic networking principles such as the IP protocol suite (e.g. TCP, UDP, ICMP, etc.) + Exposure to OpenStack, Docker, Kubernetes and containers + Exposure to virtualization, SDN, Cloud technologies such as TOSCA, YANG, NETCONF, OpenStack, VMware, OpenDaylight + Bachelor’s degree or higher in Computer Science or related technical disciplines **Required Clearance:** Must be a US Citizen and possess Secret Clearance **Desired:** + Experience with NFV and experience in production/operationalization of VNF’s/onboarding on NFVi (Network Function Virtualization infrastructure) platforms. + Exposure to ONAP or AT&T ECOMP platforms with basic understanding of the major components including: Multi-VIM/Cloud, SDN Controller, Application Controller, Virtual Function Controller, A&AI, DCAE/Correlation engine, Service Orchestration, and the ability to apply “policy” across these components in the delivery of a service + Experience working with vCPE (virtual CPE) solutions (e.g. AT&T Flexware) for managed services and software-defined WAN + Some exposure to a commercial network automation platform such as Ciena Blue Planet or Cisco NSO (Network Services Orchestrator) + Experience with automation technologies such as Ansible, Puppet, Chef, Jenkins. + Experience working with open source communities and software AT&T is an Affirmative Action/Equal Opportunity Employer and we are committed to hiring a diverse and talented workforce. EOE/AA/M/F/D/V
          (USA-MA-Bedford) Data Scientist - must be software savvy      Cache   Translate Page   Web Page Cache   
**Data Scientist \- must be software savvy** **Description** MITRE is different from most technology companies\. We are a not\-for\-profit corporation chartered to work for the public interest, with no commercial conflicts to influence what we do\. The R&D centers we operate for the government create lasting impact in fields as diverse as cybersecurity, healthcare, aviation, defense, and enterprise transformation\. We're making a difference every day—working for a safer, healthier, and more secure nation and world\. Join the Data Analytics team where you will provide software development, algorithm development, and data analytics \(to include big data analytics, data mining, and data science\) to enable data\-driven decisions and insights\. Experience with analytic techniques and methods \(e\.g\., supervised and unsupervised machine learning, link analysis, and text mining\) as well as software languages is a must\. Software languages and big data technologies needed include: Java, Python, R, C\#, C, SAS, analytic engines, Hadoop, parallelized analytic algorithms, and NoSQL and massively parallel processing databases\. The successful candidate will have the ability to formulate problems, prototype solutions, and to analyze results\. * Formulate data analytic problems * Get and cleanse data * Employ analytic methods and techniques * Develop analytic algorithms * Analyze data **Qualifications** Required Qualifications Must be a US citizen able to obtain and maintain a DoD clearance Completed BS degree in Computer Science, Data Science, or similar technical degree\. New grads must have strong academic record of 3\.0 GPA\. Experience will include: * Hands\-on software development skills \(Java, R, C , C\#, python, JavaScript\) with analytic applications and technologies\. * Capture and cleansing data raw data, data storage and retrieval \(relational and NoSQL\), data analytics and visualization, and cloud\-based technologies\. * Proficiency with the Map Reduce programming model and technologies such as Hadoop, Hive, and Pig is a plus\. \ * \ * \ * \ * Preference given to candidates with active clearances\. **Job** SW Eng, Comp Sci & Mathematics **Primary Location** United States\-Virginia\-McLean **Other Locations** United States\-Massachusetts\-Bedford **This requisition requires a clearance of** Secret **Travel** Yes, 10 % of the Time **Job Posting** Aug 9, 2018, 11:05:43 AM **Req ID:** 00050915
          (USA-VA-McLean) Data Scientist - must be software savvy      Cache   Translate Page   Web Page Cache   
**Data Scientist \- must be software savvy** **Description** MITRE is different from most technology companies\. We are a not\-for\-profit corporation chartered to work for the public interest, with no commercial conflicts to influence what we do\. The R&D centers we operate for the government create lasting impact in fields as diverse as cybersecurity, healthcare, aviation, defense, and enterprise transformation\. We're making a difference every day—working for a safer, healthier, and more secure nation and world\. Join the Data Analytics team where you will provide software development, algorithm development, and data analytics \(to include big data analytics, data mining, and data science\) to enable data\-driven decisions and insights\. Experience with analytic techniques and methods \(e\.g\., supervised and unsupervised machine learning, link analysis, and text mining\) as well as software languages is a must\. Software languages and big data technologies needed include: Java, Python, R, C\#, C, SAS, analytic engines, Hadoop, parallelized analytic algorithms, and NoSQL and massively parallel processing databases\. The successful candidate will have the ability to formulate problems, prototype solutions, and to analyze results\. * Formulate data analytic problems * Get and cleanse data * Employ analytic methods and techniques * Develop analytic algorithms * Analyze data **Qualifications** Required Qualifications Must be a US citizen able to obtain and maintain a DoD clearance Completed BS degree in Computer Science, Data Science, or similar technical degree\. New grads must have strong academic record of 3\.0 GPA\. Experience will include: * Hands\-on software development skills \(Java, R, C , C\#, python, JavaScript\) with analytic applications and technologies\. * Capture and cleansing data raw data, data storage and retrieval \(relational and NoSQL\), data analytics and visualization, and cloud\-based technologies\. * Proficiency with the Map Reduce programming model and technologies such as Hadoop, Hive, and Pig is a plus\. \ * \ * \ * \ * Preference given to candidates with active clearances\. **Job** SW Eng, Comp Sci & Mathematics **Primary Location** United States\-Virginia\-McLean **Other Locations** United States\-Massachusetts\-Bedford **This requisition requires a clearance of** Secret **Travel** Yes, 10 % of the Time **Job Posting** Aug 9, 2018, 11:05:43 AM **Req ID:** 00050915
          (USA-CA-Pleasanton) Data Scientist
We are looking for a **Data Scientist** to be a key contributor responsible for designing, developing and maintaining aerospace operational models specific to the Panasonic Avionics Corporation product suite, including inflight consumer engagement platforms and inflight connectivity systems.

**Major responsibilities include:**

**Data Science**

+ Advance the team's capability to bring vision to life, support roadmap development and prioritization, which may include items like a strategic KPI framework, customer portfolio optimization and planning, strategic site analysis (segments and pathing), measurement and attribution, AI platform evaluation and journey analytics
+ Continuously innovate by staying abreast of and bringing recommendations on the latest tools and techniques (and evaluate options) associated with consumer personalization, AI/machine learning, real-time decisioning, and digital analytics.
+ Iterate quickly in an agile development process.
+ Support projects from start to finish and produce data-driven results with appropriate techniques to answer key business questions
+ Use machine learning and predictive modeling to develop data-driven solutions that drive substantial business value in key PAC product areas.
+ Program and support analytic solutions to improve and optimize business performance and minimize risk.
+ Program and support machine learning algorithms for model training and deployment.
+ Lead junior team members to develop solutions.

**Knowledge/skill requirements**

+ Understands the department's mission and vision and has the ability to execute on that vision
+ Able to define the correct data, analysis and interpretation to achieve complex design and marketing initiatives.
+ Experience with cloud solutions for products and services.
+ Experience with designing, building and managing large-scale ML and analytics platforms
+ Proven technical ability with a variety of tools including SQL, Python and R. Commanding knowledge of statistics and/or machine learning techniques. Applications in the game industry a plus.
+ Knowledge of advanced statistical techniques suitable for analysis of highly skewed populations
+ Proven experience in predictive analytics, segmentation, experimental design and related areas
+ Experience designing, deploying and maintaining cloud-based big data technology stacks (Amazon Redshift experience preferred)
+ Experience with traditional Business Intelligence relational database modeling, tools and processes
+ Familiarity with the design and implementation of data telemetry systems

**Education/experience requirements**

+ BS in Statistics, Operations Research, Economics or similar degree with a focus on statistical methodology
+ 10+ years of experience in a data science and analytics environment; should include experience in consumer/CRM analytics methods, measurement, attribution, test planning and rapid testing, strong knowledge of media analytics and addressable media measurement and testing, digital analytics, some B2B2C experience or knowledge, and an omni-channel lifecycle marketing orientation
+ Strong communication and collaboration skills with the ability to build consensus and drive cross-functional teams forward to execution against project goals and timelines
+ Experience with statistical modeling, machine learning, digital analytics, media analytics.
+ In-depth specialization in mathematical analysis methods, predictive modeling, statistical analysis, machine learning, and technologies like Python, R and Hadoop, or equivalent experience.
+ Experience with time-series models, Bayesian modeling, Generalized Linear Models and/or Limited Dependent Variables
+ Expertise with R, SAS, Python, Hadoop, DMPs, and digital platforms
+ Proven ability to design and code new algorithms from scratch
          SQLCLR vs SQL Server 2017, Part 8: Is SQLCLR Deprecated in Favor of Python or R (sp_execute_external_script)?

With the additional (and annoying) configuration step required to get SQLCLR Assemblies to load starting in SQL Server 2017, some people have been wondering what is going on with SQLCLR. Considering that this new restriction is the only real change to SQLCLR since SQL Server 2012 (three versions ago), then with (limited) support for languages such as R (starting in SQL Server 2016) and Python (starting in SQL Server 2017) being added, it might even look like SQLCLR is being deprecated (i.e. phased-out) in favor of these new languages.

Could this be true? There is no official indication, but could it be unofficially / “effectively” deprecated? Well, let’s take a look.

Investigation

How do we know / determine that a feature or product is deprecated?

Officially Deprecated

According to the Deprecated Database Engine Features in SQL Server 2017 documentation:

When a feature is marked deprecated, it means:

  • The feature is in maintenance mode only. No new changes will be done, including those related to inter-operability with new features.
  • We strive not to remove a deprecated feature from future releases to make upgrades easier. However, under rare situations, we may choose to permanently remove the feature from SQL Server if it limits future innovations.
  • For new development work, we do not recommend using deprecated features.

For example, in that same documentation, in the Features deprecated in a future version of SQL Server section, you can see that “Extended stored procedure programming” is officially deprecated. And, interestingly enough, the Replacement is:

Use CLR Integration instead.

And, in the Database Engine Extended Stored Procedures – Reference documentation, it has a notice at the top stating:

Important
This feature will be removed in a future version of Microsoft SQL Server. Do not use this feature in new development work, and modify applications that currently use this feature as soon as possible. Use CLR integration instead.

So, “officially deprecated” means that it is marked as such in the documentation, and where applicable, will show up in a performance counter, such as SQLServer:Deprecated Features.

Effectively Deprecated

There are also features that are not marked as deprecated in the documentation (in the “Deprecated Features” list) and do not show up in the SQLServer:Deprecated Features performance counter, yet still do meet one, or both, of the criteria noted above, namely:

  1. The feature is in maintenance mode only.
  2. The official recommendation is to not use the feature in new development.

Take, for example, the OLE Automation Stored Procedures (i.e. the sp_OA* procs, such as sp_OACreate, sp_OAMethod, etc). These are not in the “Deprecated Features” list, nor does the documentation for them recommend not using them (though, hopefully, you will never find someone who recommends using them). However, they are not being updated (they do not support any data types added in SQL Server 2005 or newer: MAX types, XML, etc), and they increment the SQLServer:Deprecated Features performance counter (for instance_name = 'XP_API' , at least as of SQL Server 2017). Instead of using OLE Automation stored procedures, you should either use SQLCLR, or handle the functionality outside of SQL Server.

Another example of an “effectively” deprecated feature is the SQL Server Collations (i.e. Collations having names starting with SQL_ ). These are hold-overs to provide compatibility with pre-SQL Server 2000 (which introduced the Windows Collations: names not starting with SQL_ ). These are also not in the “Deprecated Features” list, nor do they increment the SQLServer:Deprecated Features performance counter. However, they are not being updated, and the documentation for SQL Server Collation Name recommends against using them:

SQL Server collations are still supported for backward compatibility, but should not be used for new development work.

And yes, if at all possible, do not use Collations with names starting with SQL_ 1.

Not Deprecated

One feature that does not fit the description of “deprecated” is SQLCLR:

  1. SQLCLR is still being used internally for some newer built-in functions, such as the following (both starting in SQL Server 2016):
  2. The sp_execute_external_script stored procedure, used to execute R and Python scripts, cannot be a replacement for SQLCLR because it cannot do the following:
    1. Use the data types:
      • XML
      • DATETIME2
      • DATETIMEOFFSET
      • TIME
      • SQL_VARIANT
      • HierarchyID
      • Geometry
      • Geography
      • custom SQLCLR UDTs
    2. Create User-Defined Aggregates (UDA)
    3. Create User-Defined Types (UDT)
    4. Create Scalar Functions / UDF
    5. Create Table-Valued Functions (TVF)
    6. Create Triggers (as a stored procedure, sp_execute_external_script can be executed within a Trigger, but it most likely doesn’t have access to the inserted and deleted tables, while a SQLCLR Trigger does)
    7. Return results with named columns (results set columns from sp_execute_external_script are all unnamed, so if you need the result set columns to have names, you need to use the WITH RESULT SETS clause, which cannot be dynamic unless you put the entire call into Dynamic SQL, limiting your ability to return named columns if the columns and their datatypes are not known ahead of time)
    8. Access the current SPID and transaction via the in-process connection (i.e. access to local temporary objects, CONTEXT_INFO, SESSION_CONTEXT, etc ; context connection = true; )
    9. Impersonate the caller’s Windows Account (when accessing external resources ; at least it is highly unlikely that this would be possible)
    10. Restrict access to certain methods / code (SQLCLR methods are exposed through T-SQL objects which have their own permissions, whereas sp_execute_external_script is just like xp_cmdshell and the OLE Automation stored procedures in that you cannot restrict what code people pass into them).
       
  3. Finally, as we saw in the “Officially Deprecated” section above, SQLCLR is the recommended replacement for the Extended Stored Procedure API. It is also the unofficially recommended replacement for the OLE Automation stored procedures (i.e. sp_OA* ).

Conclusion

Microsoft is a company like most others. There is always more to do than there are resources available to accomplish everything. There are budgets, time constraints, priorities, and so on. SQLCLR is a feature like most others. Some people like it, some people don’t, and some have never even heard of it. It is great for some scenarios, not so great for others. It has been used to solve complex problems rather efficiently, and in other cases it has been horribly misused to create slow, convoluted technical debt that is used by anti-SQLCLR curmudgeons as validation of their opinion.

Sure, there are features that only survive for a version or two (e.g. vardecimal, introduced in SQL Server 2005, and then deprecated in the following version, 2008). But, there are plenty of features (even entire products, I would assume) that are valid and useful yet have not been improved nearly as much as some would like. While this is certainly frustrating, it does not indicate / imply the death of the feature (or product).

Also, just because a company comes out with a new product / feature that can do some of the same things as an existing product / feature does not necessarily mean that something is being replaced. This is especially true if the new product / feature does not do all (or most) of the same things. I remember back in 2011 or 2012 Microsoft either came out with something to do JavaScript on the server, or there was at least talk of such a thing (perhaps TypeScript?). And sure enough, there were folks who were predicting the end of C# / .NET, completely ignoring reality and the implications of replacing it.

So, while there are things that can certainly be improved with SQLCLR, and while it is frustrating that no resources are being devoted to it, there is no evidence to suggest that SQLCLR is being deprecated, even unofficially.


  1. Instead of using SQL_Latin1_General_CP1_CI_AS, use:
    * Latin1_General_CI_AS (if you are on SQL Server 2005)
    * Latin1_General_100_CI_AS (if you are on SQL Server 2008 or 2008 R2)
    * Latin1_General_100_CI_AS_SC (if you are on SQL Server 2012 or newer) 

          3ds max exporter - "Couldn't authorize: The request was aborted: Could not create SSL/TLS secure channel."

Clearly the problem is associated with Server Name Indication (SNI) not being supported on Python 2 versions below 2.7.7 (Ubuntu 14.04 ships 2.7.6).

None of the workarounds worked for me, so I moved to Python3 to get it working again.
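(Not part of the original posts, but for anyone stuck on Python 2: one commonly suggested workaround is to check whether the ssl module supports SNI and, if not, let urllib3 route TLS through pyOpenSSL. The calls below are the standard ones for that workaround; whether they fix this particular exporter is an assumption.)

# Sketch only: checking SNI support and enabling the pyOpenSSL backend on Python 2.
# Assumes `pip install pyopenssl ndg-httpsclient pyasn1` has been run first.
import ssl

print(getattr(ssl, "HAS_SNI", False))  # False usually means SNI-less handshakes

try:
    import urllib3.contrib.pyopenssl
    urllib3.contrib.pyopenssl.inject_into_urllib3()  # pyOpenSSL supports SNI on old Pythons
except ImportError:
    pass  # fall back to the stock ssl module

import requests
print(requests.get("https://api.sketchfab.com").status_code)  # simple connectivity check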


          3ds max exporter - "Couldn't authorize: The request was aborted: Could not create SSL/TLS secure channel."

Just tried your latest Python code from API v2 and forced upgrade of my requests python library. Still no go, but longer error report:

Uploading ...
/usr/local/lib/python2.7/dist-packages/urllib3/util/ssl_.py:369: SNIMissingWarning: An HTTPS request has been made, but the SNI (Server Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  SNIMissingWarning
/usr/local/lib/python2.7/dist-packages/urllib3/util/ssl_.py:160: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
  InsecurePlatformWarning
An error occured: HTTPSConnectionPool(host='api.sketchfab.com', port=443): Max retries exceeded with url: /v2/models (Caused by SSLError(SSLError(1, '_ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure'),))
Start polling processing status for model None
Try polling processing status (attempt #0) ...
/usr/local/lib/python2.7/dist-packages/urllib3/util/ssl_.py:160: InsecurePlatformWarning (same warning as above)
Try failed with error HTTPSConnectionPool(host='api.sketchfab.com', port=443): Max retries exceeded with url: /v2/models/None/status?token=b209f14fb01f468bb19ccc9e4010142b (Caused by SSLError(SSLError(1, '_ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure'),))
[attempts #1 through #9 repeat the same InsecurePlatformWarning and the same "Try failed" error]
Stopped polling after too many retries or too many errors


          3ds max exporter - "Couldn't authorize: The request was aborted: Could not create SSL/TLS secure channel."

Python exporter stopped working too.

This is the error I get:

An error occured: [Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
Traceback (most recent call last):
  File "./send_sketchfab.py", line 223, in <module>
    x = model_url.find("models")
AttributeError: 'NoneType' object has no attribute 'find'


          Pick a serverless fight: A comparative research of AWS Lambda, Azure Functions ...

The saturation point is nowhere to be seen in the serverless discussion, with tons of news coming online every day and numerous reports trying to take the pulse of one of the hottest topics out there.

This time, however, we are not going to discuss any of the above. This article is going to be a bit more…academic!

During the last USENIX Annual Technical Conference ’18, which took place in Boston, USA in mid-July, an amazingly interesting piece of academic research was presented.

The paper “Peeking Behind the Curtains of Serverless Platforms” is a comparative research and analysis of the three big serverless providers AWS Lambda, Azure Functions and Google Cloud Functions. The authors (Liang Wang, Mengyuan Li, Yinqian Zhang, Thomas Ristenpart, Michael Swift) conducted the most in-depth (so far) study of resource management and performance isolation in these three providers.

SEE ALSO: The state of serverless computing: Current trends and future prospects

The study systematically examines a series of issues related to resource management including how quickly function instances can be launched, function instance placement strategies, and function instance reuse. What’s more, the authors examine the allocation of CPU, I/O and network bandwidth among functions and the ensuing performance implications, as well as a couple of exploitable resource accounting bugs .

Did I get your attention now?

In this article, we have an overview of the most interesting results presented in the original paper.

Let’s get started!

Methodology

First things first. Let’s have a quick introduction to the methodology of this study.

The authors conducted this research by integrating all the necessary functionalities and subroutines into a single function that they call a measurement function .

According to the definition found in the paper, this function performs two tasks:

* Collect invocation timing and function instance runtime information
* Run specified subroutines (e.g., measuring local disk I/O throughput, network throughput) based on received messages
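(As a rough illustration only, not code from the paper: a measurement function of this kind might look like the sketch below. The handler signature matches AWS Lambda's Python runtime, and the specific probes, such as reading /proc/uptime and a small disk write, are assumptions.)

# Hypothetical sketch of a "measurement function" in the spirit of the paper.
import json
import os
import time
import uuid

INSTANCE_ID = str(uuid.uuid4())  # stays constant while the function instance is warm

def read_vm_uptime():
    # /proc/uptime reveals how long the underlying VM has been running
    with open("/proc/uptime") as f:
        return float(f.read().split()[0])

def disk_probe(size_mb=1):
    # crude local-disk throughput probe
    start = time.time()
    with open("/tmp/probe.bin", "wb") as f:
        f.write(os.urandom(size_mb * 1024 * 1024))
    return size_mb / (time.time() - start)

def handler(event, context):
    result = {
        "instance_id": INSTANCE_ID,       # detects instance reuse
        "vm_uptime_s": read_vm_uptime(),  # detects pre-provisioned VMs
        "timestamp": time.time(),
    }
    if event.get("run_io_probe"):
        result["disk_mb_per_s"] = disk_probe()
    return json.dumps(result)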

In order to have a clear overview of the specifications for each provider, the following table provides a comparison of function configuration and billing in the three services.



The authors examined how instances and VMs are scheduled in the three serverless platforms in terms of instance coldstart latency, lifetime, scalability, and idle recycling and the results are extremely interesting.

Scalability and instance placement

One of the most intriguing findings, in my opinion, is on the scalability and instance placement of each provider. There is a significant discrepancy among the three big services with AWS being the best regarding support for concurrent execution :

AWS:“3,328MB was the maximum aggregate memory that can be allocated across all function instances on any VM in AWS Lambda. AWS Lambda appears to treat instance placement as a bin-packing problem, and tries to place a new function instance on an existing active VM to maximize VM memory utilization rates.”

Azure: Despite the fact that Azure documentation states that it will automatically scale up to at most 200 instances for a single Nodejs-based function and at most one new function instance can be launched every 10 seconds, the tests of Nodejs-based functions performed by the authors showed “at most 10 function instances running concurrently for a single function”, no matter how the interval between invocations was changed.

Google:Contrary to what Google claims on how HTTP-triggered functions will scale to the desired invocation rate quickly, the service failed to provide the desired scalability for the study. “In general, only about half of the expected number of instances, even for a low concurrency level (e.g., 10), could be launched at the same time, while the remainder of the requests were queued.”

Interesting fact: More than 89% of VMs tested achieved 100% memory utilization.

Coldstart and VM provisioning

Concerning coldstart (the process of launching a new function instance) and VM provisioning, AWS Lambda appears to be on the top of its game :

AWS:Two types of coldstart events were examined: “a function instance is launched (1) on a new VM that we have never seen before and (2) on an existing VM. Intuitively, case (1) should have significantly longer coldstart latency than (2) because case (1) may involve starting a new VM.” However, the study shows that “case (1) was only slightly longer than (2) in general. The median coldstart latency in case (1) was only 39 ms longer than (2) (across all settings). Plus, the smallest VM kernel uptime (from /proc/uptime) that was found was 132 seconds, indicating that the VM has been launched before the invocation.” Therefore, these results show that AWS has a pool of ready VMs! What’s more, concerning the extra delays in case (1), the authors argue that they are “more likely introduced by scheduling rather than launching a VM.”

Azure:According to the findings, it took much longer to launch a function instance in Azure, despite the fact that their instances are always assigned 1.5GB memory. The median coldstart latency was 3,640 ms in Azure.

Google:“The median coldstart latency in Google ranged from 110 ms to 493 ms. Google also allocates CPU proportionally to memory, but in Google memory size has a greater impact on coldstart latency than in AWS.”

SEE ALSO: What do developer trends in the cloud look like?

Additional to the tests described above, the research team “collected the coldstart latencies of 128 MB, python 2.7 (AWS) or Nodejs 6.* (Google and Azure) based functions every 10 seconds for over 168 hours (7 days), and calculated the median of the coldstart latencies collected in a given hour.” According to the results, “the coldstart latencies in AWS were relatively stable, as were those in Google (except for a few spikes). Azure had the highest network variation over time, ranging from about 1.5 seconds up to 16 seconds.” Take a look at the figure below:



Source: “Peeking Behind the Curtains of Serverless Platforms”, Figure 8, p. 139

Instance lifetime

The research team defines instance lifetime as "the longest time a function instance stays active."

Keeping in mind that users prefer the longer lifetimes, the results depict Azure winning this one since Azure functions provide significantly longer lifetimes than AWS and Google, as you can see in the figures below:



Source: “Peeking Behind the Curtains of Serverless Platforms”, Figure 9, p.140

Idle instance recycling

Instance maximum idle time is defined by the authors as “the longest time an instance can stay idle before getting shut down.” Specifically for each service provider, the results show:

AWS:An instance could usually stay inac
          GraalVM in Detail (3): Polyglot

This time we look at GraalVM as a polyglot runtime for multiple languages. We use the latest version at the time of writing, 1.0.0-rc5.

http://www.graalvm.org/downloads/

Note that starting with rc5 the Community Edition supports macOS (until now only the Enterprise Edition did). That is the edition used here.

$ java -version
openjdk version "1.8.0_172"
OpenJDK Runtime Environment (build 1.8.0_172-20180626105433.graaluser.jdk8u-src-tar-g-b11)
GraalVM 1.0.0-rc5 (build 25.71-b01-internal-jvmci-0.46, mixed mode)

Polyglot Shell

$ $JAVA_HOME/bin/polyglot --shell
GraalVM MultiLanguage Shell 1.0.0-rc5
Copyright (c) 2013-2018, Oracle and/or its affiliates
  JavaScript version 1.0
Usage: 
  Use Ctrl+L to switch language and Ctrl+D to exit.
  Enter -usage to get a list of available commands.
js> 

Passing the shell option to the polyglot command starts a REPL. Ctrl+L is supposed to switch languages, but pressing it does nothing and the language never changes.

That is because JavaScript is the only language bundled with GraalVM by default. Other languages have to be installed yourself.

Even as-is, JavaScript and Java can interoperate. Interop requires the --jvm option.

$ $JAVA_HOME/bin/polyglot --shell --jvm
js> var BigInteger = Java.type('java.math.BigInteger');
js> BigInteger.valueOf(2).pow(100).toString(16)
10000000000000000000000000

Installing languages with the gu command

GraalVM ships with the gu (Graal updater) command, which installs languages as components. Let's see which languages are offered.

$ $JAVA_HOME/bin/gu available
Downloading: Component catalog
ComponentId              Version             Component name
----------------------------------------------------------------
python                   1.0.0-rc5           Graal.Python
R                        1.0.0-rc5           FastR
ruby                     1.0.0-rc5           TruffleRuby

python, R and ruby are available. Let's install ruby.

$ $JAVA_HOME/bin/gu install ruby
Downloading: Component catalog
Processing component archive: Component ruby
Downloading: Component ruby
Installing new component: TruffleRuby (org.graalvm.ruby, version 1.0.0-rc5)
...

$ $JAVA_HOME/bin/gu list
ComponentId              Version             Component name
----------------------------------------------------------------
ruby                     1.0.0-rc5           TruffleRuby

ruby is now listed.

Interoperating between Ruby, Java and JavaScript

Start the polyglot shell and press Ctrl+L to switch to ruby.

$ $JAVA_HOME/bin/polyglot --shell --jvm
ruby>

Just like JavaScript, Ruby can interoperate with Java.

ruby> BigInteger = Java.type('java.math.BigInteger')
ruby> BigInteger.valueOf(2).pow(100).toString(16)
"10000000000000000000000000"

Let's call Ruby from JavaScript.

js> var array = Polyglot.eval("ruby", "[1,2,42,4]")
js> array[2]
42

And JavaScript from Ruby.

ruby> array = Polyglot.eval('js', '[1,2,42,4]')
<foreign>
ruby> array[2]
undefined local variable or method `array' for main:Object

The polyglot shell does not seem to handle return values from other languages well. Written like this, it works.

ruby> array = Polyglot.eval('js', '[1,2,42,4]'); array[2]
42

When written in an .rb file it works correctly.

Open question

How to reference a caller-side variable inside eval.

js> var s = "100";
js> Polyglot.eval('ruby', 's.to_i')
undefined local variable or method `s' for main:Object (NameError)

For example, here I want to reference the JavaScript variable s inside the Ruby eval. I looked through the Javadoc but could not find a way.

Running from a source file

Save the following as test.rb.

array = Polyglot.eval('js', '[1,2,42,4]'); 
puts array[2]

The ruby command lives at graalvm-ce-1.0.0-rc5/Contents/Home/jre/languages/ruby/bin/ruby.

$ graalvm-ce-1.0.0-rc5/Contents/Home/jre/languages/ruby/bin/ruby -v
truffleruby 1.0.0-rc5, like ruby 2.4.4, GraalVM CE Native [x86_64-darwin]

Run it. Interoperating with other languages requires the --jvm option.

$ graalvm-ce-1.0.0-rc5/Contents/Home/jre/languages/ruby/bin/ruby --jvm test.rb
42

Debugger

GraalVM has a debugger that works across the polyglot environment. Remarkably, it uses Chrome DevTools. Add the --inspect option.

$ graalvm-ce-1.0.0-rc5/Contents/Home/jre/languages/ruby/bin/ruby --jvm --inspect test.rb
Debugger listening on port 9229.
To start debugging, open the following URL in Chrome:
    chrome-devtools://devtools/bundled/js_app.html?ws=127.0.0.1:9229/4cc77c2e-56cbc17f25aee

Enter chrome-devtools://devtools/bundled/js_app.html?ws=127.0.0.1:9229/4cc77c2e-56cbc17f25aee into Chrome as is.

You can also set breakpoints.

Other debugger features are available as well, such as inspecting the contents of variables.



          Remove Python list item

I have two lists:

l1 = [1, 2, 3, 4, 5, 6]
l2 = [3, 2]

What I want is to remove the elements of list l1 which are in l2. For that I have done something like this:

for x in l1:
    if x in l2:
        l1.remove(x)

it gives output like

[1, 3, 4, 5, 6]

but the output should be like

[1, 4, 5, 6]

Can anyone shed light on this?

This is easily explained like this.

consider the first array you have:

| 1 | 2 | 3 | 4 | 5 | 6 |

Now you start iterating

| 1 | 2 | 3 | 4 | 5 | 6 |
  ^

Nothing happens, iterator increments

| 1 | 2 | 3 | 4 | 5 | 6 |
      ^

2 gets removed

| 1 | 3 | 4 | 5 | 6 |
      ^

iterator increments

| 1 | 3 | 4 | 5 | 6 |
          ^

And voila, 3 is still there.

The solution is to iterate over a copy of the list, e.g.

for x in l1[:]:        # <- the slice copies the entire list
    if x in l2:
        l1.remove(x)

or to iterate in reverse:

for x in reversed(l1):
    if x in l2:
        l1.remove(x)

Which acts like this:

| 1 | 2 | 3 | 4 | 5 | 6 |
                      ^
| 1 | 2 | 3 | 4 | 5 | 6 |
          ^
| 1 | 2 | 4 | 5 | 6 |
          ^
| 1 | 2 | 4 | 5 | 6 |
      ^
| 1 | 4 | 5 | 6 |
      ^
| 1 | 4 | 5 | 6 |
  ^
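(Not part of the original answer, but worth adding: if in-place mutation isn't actually required, a list comprehension avoids the iterate-while-removing problem entirely.)

l1 = [1, 2, 3, 4, 5, 6]
l2 = [3, 2]

# build a new list instead of removing items while iterating
l1 = [x for x in l1 if x not in l2]
print(l1)   # [1, 4, 5, 6]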


          Codementor: Better with Python: Collections

So you want to get better with python? One of the best things you can do to practice is get to know the Python standard library. This is the collection of modules that come packaged with the language and it includes practically everything you need to build applications. Whether it's servers, parsers, or email applications, it's got you covered.

One of my favorite modules from the standard library is collections . It's got a couple of key tools for common, everyday Python programming. It provides you with specialized dictionaries, lists and tuples.

Today, I'm going to talk about namedtuple s.

When I first learned about them, I was building endpoints for an API for my team. The problem I was trying to solve was this: how do I create specialized object types to aid with organizing my code -- without creating new classes?

The namedtuple provided a very elegant solution. Here's a simple example of how you might use them:

from collections import namedtuple

ApiRequest = namedtuple('ApiRequest', ['status', 'method', 'details'])
response = ApiRequest(200, 'GET', 'Request was awesome!')
response.status  # => 200

The equivalent class code would be:

class ApiResponse:
    def __init__(self, status, method, details):
        self.status = status
        self.method = method
        self.details = details

response = ApiResponse('201', 'GET', 'Document created.')

As you can see, the namedtuple is something of a shorthand for creating classes of objects, but it's got some great features out of the box that ordinary classes don't have. Some benefits I was looking for:

* Immutable: I needed objects that would never be altered.
* Identifiers: I needed objects whose attributes were simple to access.
* Reusable: It had to be something others on my team could quickly re-use and understand at a glance.

The namedtuple is very Pythonic; it makes your code more readable, efficient and brief. You'll also hear the term "self-documenting" used to describe them, which is what makes them easy to understand at a glance. You and your team don't have to go to a different file and read class definitions to know what the code is doing or how to use it.

Now, I could've used a dictionary to handle the data instead, but I'd have to use bracket notation or dictionary methods like dict.get() . That's not a great hassle, but it's not as elegant as the namedtuple and it doesn't provide immutability, identifiers or ease of use.

Plus, a namedtuple can be easily converted to a dictionary if needed with a handy helper method: namedtuple._asdict()
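A quick illustration of that helper (my example, not the post's); on Python 3.7 and earlier the result is an OrderedDict, on 3.8+ a plain dict:

request = ApiRequest(200, 'GET', 'Request was awesome!')
request._asdict()
# => OrderedDict([('status', 200), ('method', 'GET'), ('details', 'Request was awesome!')])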

In addition to these benefits, there's more. A namedtuple can be subclassed, meaning you could add methods to them.

class CustomRequest(ApiRequest):
    def is_success(self):
        if self.status == 200:
            return True
        return False

response = CustomRequest(200, 'GET', 'The request had no problems.')
response.is_success()  # => True

The next time you're building something with Python or refactoring code, think about how you might use the collections.namedtuple to make your code more pythonic, efficient and clean.

Resources: Official documentation
          This Python anti-crawler-cracking example helped me grow, and you will appreciate it too


Anti-crawler mechanisms that use JS to generate random strings locally are a problem you will often run into when writing crawlers in Python. I hope this walkthrough gives you an approach you can reuse, so that the next time you hit this kind of problem you know how to solve it. (If you are short of learning materials, some are provided at the end of the original post!)

Breaking Youdao Translate's anti-crawler mechanism

The web version of Youdao Translate used to be directly crawlable. In other words, once you found its interface you could call it for translations as much as you liked without paying anything. Since Youdao launched its paid API service, it has added an anti-crawler mechanism to this interface. That mechanism is a classic technique in the crawling world. So what is the principle behind it, and how do we break it? Let's find out.


1. The normal crawling process:

In the screenshot we can see that many network requests are sent; click the first request to inspect it:



Let's explain a few of the more important fields:



The other fields are not that important for now; they are fixed values that we can simply write as-is later in the code. At this point we can already write a simple crawler that calls Youdao Translate's interface. The HTTP library used here is urllib, which ships with Python 3; the code is shown below:
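(The original post showed this code only as a screenshot. The sketch below is a stand-in for it: the endpoint path and form-field names are assumptions based on what the browser's network panel typically shows for this interface, so treat them as placeholders rather than the article's exact code.)

import json
import urllib.parse
import urllib.request

url = 'http://fanyi.youdao.com/translate_o?smartresult=dict&smartresult=rule'  # placeholder endpoint
data = {
    'i': 'hello',      # the text to translate
    'doctype': 'json',
    'from': 'AUTO',
    'to': 'AUTO',
}
req = urllib.request.Request(url, data=urllib.parse.urlencode(data).encode('utf-8'))
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read().decode('utf-8')))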


2. Breaking the anti-crawler mechanism:

Then copy the pretty-printed JS, open it in Sublime Text or PyCharm (either works), and search for "salt"; you will find the relevant code:



Once we know how salt and sign are generated, we can write Python code to talk to the interface; the relevant code follows:
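(Again, the original code was an image. The sketch below only shows the general shape of such a request: salt is typically the current timestamp in milliseconds plus a random digit, and sign is an MD5 over a client name, the text, the salt and a constant suffix taken from the site's JS. The exact constant strings change over time, so the ones below are placeholders you must read out of the formatted JS yourself.)

import hashlib
import json
import random
import time
import urllib.parse
import urllib.request

def youdao_translate(text):
    # Placeholders: read the real client name and secret suffix from the site's formatted JS.
    client = 'fanyideskweb'
    secret = 'REPLACE_WITH_CONSTANT_FROM_JS'
    salt = str(int(time.time() * 1000) + random.randint(0, 9))
    sign = hashlib.md5((client + text + salt + secret).encode('utf-8')).hexdigest()
    data = {
        'i': text, 'from': 'AUTO', 'to': 'AUTO', 'smartresult': 'dict',
        'client': client, 'salt': salt, 'sign': sign, 'doctype': 'json',
    }
    url = 'http://fanyi.youdao.com/translate_o?smartresult=dict&smartresult=rule'
    headers = {
        'Referer': 'http://fanyi.youdao.com/',
        'User-Agent': 'Mozilla/5.0',
        # Some versions of the site also check a Cookie; copy it from the browser if needed.
    }
    req = urllib.request.Request(url, urllib.parse.urlencode(data).encode('utf-8'), headers)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode('utf-8'))

print(youdao_translate('hello'))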



          8 Python Machine Learning Algorithms You Must LEARN
1. Objective

Previously, we discussed the techniques of machine learning with Python. Going deeper, today we will talk about and implement 8 top Python machine learning algorithms.

Let’s begin the journey of Machine Learning Algorithms in Python Programming.



2. Python Machine Learning Algorithms

The following are the Python machine learning algorithms:

a. Linear Regression

Linear regression is one of the supervised Python Machine learning algorithms that observes continuous features and predicts an outcome. Depending on whether it runs on a single variable or on many features, we can call it simple linear regression or multiple linear regression.

This is one of the most popular Python ML algorithms and often under-appreciated. It assigns optimal weights to variables to create a line ax+b to predict the output. We often use linear regression to estimate real values like the number of calls or the cost of houses based on continuous variables. The regression line is the best line that fits Y=a*X+b to denote a relationship between independent and dependent variables.

Let’s plot this for the diabetes dataset.

>>> import matplotlib.pyplot as plt
>>> import numpy as np
>>> from sklearn import datasets, linear_model
>>> from sklearn.metrics import mean_squared_error, r2_score
>>> diabetes = datasets.load_diabetes()
>>> diabetes_X = diabetes.data[:, np.newaxis, 2]
>>> diabetes_X_train = diabetes_X[:-30]    # splitting data into training and test sets
>>> diabetes_X_test = diabetes_X[-30:]
>>> diabetes_y_train = diabetes.target[:-30]    # splitting targets into training and test sets
>>> diabetes_y_test = diabetes.target[-30:]
>>> regr = linear_model.LinearRegression()    # Linear regression object
>>> regr.fit(diabetes_X_train, diabetes_y_train)    # Use training sets to train the model
LinearRegression(copy_X=True, fit_intercept=True, n_jobs=1, normalize=False)
>>> diabetes_y_pred = regr.predict(diabetes_X_test)    # Make predictions
>>> regr.coef_
array([941.43097333])
>>> mean_squared_error(diabetes_y_test, diabetes_y_pred)
3035.0601152912695
>>> r2_score(diabetes_y_test, diabetes_y_pred)    # Variance score
0.410920728135835
>>> plt.scatter(diabetes_X_test, diabetes_y_test, color='lavender')
<matplotlib.collections.PathCollection object at 0x0584FF70>
>>> plt.plot(diabetes_X_test, diabetes_y_pred, color='pink', linewidth=3)
[<matplotlib.lines.Line2D object at 0x0584FF30>]
>>> plt.xticks(())
([], <a list of 0 Text xticklabel objects>)
>>> plt.yticks(())
([], <a list of 0 Text yticklabel objects>)
>>> plt.show()

Python Machine Learning Algorithm: Linear Regression

b. Logistic Regression

Logistic regression is a supervised classification Python machine learning algorithm that finds its use in estimating discrete values like 0/1, yes/no, and true/false. This is based on a given set of independent variables. We use a logistic function to predict the probability of an event and this gives us an output between 0 and 1.

Although it says ‘regression’, this is actually a classification algorithm. Logistic regression fits data into a logit function and is also called logit regression . Let’s plot this.

>>> import numpy as np
>>> import matplotlib.pyplot as plt
>>> from sklearn import linear_model
>>> xmin, xmax = -7, 7    # Test set; straight line with Gaussian noise
>>> n_samples = 77
>>> np.random.seed(0)
>>> x = np.random.normal(size=n_samples)
>>> y = (x > 0).astype(np.float)
>>> x[x > 0] *= 3
>>> x += .4 * np.random.normal(size=n_samples)
>>> x = x[:, np.newaxis]
>>> clf = linear_model.LogisticRegression(C=1e4)    # Classifier
>>> clf.fit(x, y)
>>> plt.figure(1, figsize=(3, 4))
<Figure size 300x400 with 0 Axes>
>>> plt.clf()
>>> plt.scatter(x.ravel(), y, color='lavender', zorder=17)
<matplotlib.collections.PathCollection object at 0x057B0E10>
>>> x_test = np.linspace(-7, 7, 277)
>>> def model(x):
...     return 1 / (1 + np.exp(-x))
...
>>> loss = model(x_test * clf.coef_ + clf.intercept_).ravel()
>>> plt.plot(x_test, loss, color='pink', linewidth=2.5)
[<matplotlib.lines.Line2D object at 0x057BA090>]
>>> ols = linear_model.LinearRegression()
>>> ols.fit(x, y)
LinearRegression(copy_X=True, fit_intercept=True, n_jobs=1, normalize=False)
>>> plt.plot(x_test, ols.coef_ * x_test + ols.intercept_, linewidth=1)
[<matplotlib.lines.Line2D object at 0x057BA0B0>]
>>> plt.axhline(.4, color='.4')
<matplotlib.lines.Line2D object at 0x05860E70>
>>> plt.ylabel('y')
Text(0,0.5,'y')
>>> plt.xlabel('x')
Text(0.5,0,'x')
>>> plt.xticks(range(-7, 7))
>>> plt.yticks([0, 0.4, 1])
>>> plt.ylim(-.25, 1.25)
(-0.25, 1.25)
>>> plt.xlim(-4, 10)
(-4, 10)
>>> plt.legend(('Logistic Regression', 'Linear Regression'), loc='lower right', fontsize='small')
<matplotlib.legend.Legend object at 0x057C89F0>
>>> plt.show()

Machine Learning Algorithm: Logistic Regression

c. Decision Tree

A decision tree falls under supervised machine learning in Python and is used for both classification and regression, although mostly for classification. This model takes an instance, traverses the tree, and compares important features with a determined conditional statement. Whether it descends to the left child branch or the right depends on the result. Usually, more important features are closer to the root.

This Python machine learning algorithm can work on both categorical and continuous dependent variables. Here, we split a population into two or more homogeneous sets. Let's see the algorithm for this:

>>> from sklearn.cross_validation import train_test_split
>>> from sklearn.tree import DecisionTreeClassifier
>>> from sklearn.metrics import accuracy_score
>>> from sklearn.metrics import classification_report
>>> def importdata():    # Importing data
...     balance_data = pd.read_csv('https://archive.ics.uci.edu/ml/machine-learning-'
...                                'databases/balance-scale/balance-scale.data',
...                                sep=',', header=None)
...     print(len(balance_data))
...     print(balance_data.shape)
...     print(balance_data.head())
...     return balance_data
>>> def splitdataset(balance_data):    # Splitting data
...     x = balance_data.values[:, 1:5]
...     y = balance_data.values[:, 0]
...     x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.3, random_state=100)
...     return x, y, x_train, x_test, y_train, y_test
>>> def train_using_gini(x_train, x_test, y_train):    # Training with giniIndex
...     clf_gini = DecisionTreeClassifier(criterion =
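The original listing is cut off at this point. As a stand-in (my own minimal sketch, not the article's missing code), here is one way to finish training and evaluating a Gini-based tree on the same balance-scale data; the hyperparameters are assumptions.

import pandas as pd
from sklearn.model_selection import train_test_split  # sklearn.cross_validation in older versions
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

url = ('https://archive.ics.uci.edu/ml/machine-learning-'
       'databases/balance-scale/balance-scale.data')
balance_data = pd.read_csv(url, sep=',', header=None)
x = balance_data.values[:, 1:5]
y = balance_data.values[:, 0]
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.3, random_state=100)

# Hyperparameters below are illustrative choices, not the article's.
clf_gini = DecisionTreeClassifier(criterion='gini', random_state=100,
                                  max_depth=3, min_samples_leaf=5)
clf_gini.fit(x_train, y_train)
y_pred = clf_gini.predict(x_test)
print('Accuracy:', accuracy_score(y_test, y_pred))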
          Building Cuckoo Sandbox from Scratch (3)

With the previous article finished, we have Cuckoo Sandbox set up and tuned. In this chapter we mainly look at Cuckoo's behavioral signatures.

Behavioral signatures directly determine how many malicious behaviors Cuckoo can flag, and even whether it can directly label the malware family a sample belongs to. We will not discuss how to write rules for any specific malware family here; we will only cover a few things to watch out for when writing and using behavioral signatures.

According to the Yara documentation (http://yara.readthedocs.io/en/latest/modules/cuckoo.html), Yara does have a Cuckoo module that can build detection rules from Cuckoo's behavioral records, and Cuckoo can also match Yara rules during processing. However, the Cuckoo module can only use part of the behavioral records Cuckoo provides (files / registry / mutexes / network), so I personally prefer Cuckoo's built-in behavioral signatures.

Here is the official documentation on behavioral signatures: https://cuckoo.readthedocs.io/en/latest/customization/signatures/. After reading it my only thought was: "Is that really all?" Also, looking at the source code, the Signature class has no member function called "list_signatures", so the code snippet at the end of that page is broken.

With official documentation like that, we can only rely on ourselves. Although this is the last part of the series, it will not be too long.

A gripe

First of all, I must thank the Cuckoo community signature repository. It really saved me a lot of effort, since I did not have to start from scratch, and quite a few of the signatures in it are very well written, including some with fairly complex logic.

That said, a number of the rules have puzzling problems. Spend some time reading them and you will find plenty, for example regular expressions used without "regex=True" (rat_fynloski.py). There are also rules that do not help much, some of which target the detection of a few of our well-known Chinese "specialties".

Here we will only mention one rule: creates_null_reg_entry.py. Its purpose is to detect registry entries whose first byte is 0x00 (a trick commonly used by malware to hide registry traces; search the web for details). If you watch the console window where behavioral records are processed, you will notice that this Python script sometimes triggers a Python error, specifically an encoding error.

Without further ado, the screenshot below (an image in the original post) shows my fix:



Signature matching order

If a behavioral signature you write depends on another signature (for example, signature A should only match once signature B has matched), you need to think about this issue.

Cuckoo's behavioral signatures are executed in a definite order, and the on_* functions inside each signature script run at different times: on_call (runs while Cuckoo processes each API record), on_signature (runs when some signature matches), and on_complete (the last function run for each signature; most of the code that decides whether the signature matches lives here).

If you watch the debug output Cuckoo prints on the command line, you can easily see that Cuckoo reads the rules in the order of Windows file-name sorting.

Just remember this: a signature's file name is for you, and its "name" attribute is also for you; only the signature's class name matters to Cuckoo Sandbox (Cuckoo sorts the class names using the ordering mentioned above and reads and matches them one by one).

If your signature A sorts before signature B by class name, and A requires B to have matched already (that is, on_signature checks whether the matched signature's name equals B's name and uses that as a necessary condition for the whole match), then A will never trigger.

Available functions

All the functions available inside a behavioral signature can be found in [Python install dir]\Lib\site-packages\cuckoo\common\abstracts.py, in the Signature class.

One thing worth mentioning: inside signatures we often use the check_file function to determine whether the program touched a particular file.

As the screenshot (in the original post) shows, when the actions parameter is None, Cuckoo covers six cases for us. But that is not complete coverage; there are two more file operations, file_created (file creation) and file_recreated (re-creating a file). If you only want to know whether a particular file shows up in the records at all, leaving actions as None may let some cases slip through. If you want to know whether a particular file was created, just set actions to "file_created".

The get_files function has the same problem, so be careful when using it.
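As a hedged illustration (not from the original post), a minimal signature that flags creation of a specific file might look like the sketch below; the pattern and metadata are made up, and the check_file keyword arguments are the ones described above (assumed API, so adjust for your Cuckoo version).

# Minimal sketch of a custom Cuckoo behavioral signature.
from cuckoo.common.abstracts import Signature

class CreatesSuspiciousFile(Signature):
    name = "creates_suspicious_file"     # shown in reports; not used for ordering
    description = "Creates an executable in the user's temp directory"
    severity = 2
    categories = ["persistence"]
    authors = ["you"]
    minimum = "2.0"

    def on_complete(self):
        # actions=["file_created"] restricts the match to creation events only
        match = self.check_file(pattern=r".*\\Temp\\.*\.exe$",
                                regex=True,
                                actions=["file_created"])
        if match:
            self.mark_ioc("file", match)
            return True
        return False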

Severity values and how the score is computed

As the official documentation says, severity is usually 1-3, and the vast majority of signatures in the community repository fall in that range (antivirus_virustotal.py is an exception). In practice you can go outside that range (it just has to be an integer), and you can also tune this value for existing signatures.

For every analyzed sample, Cuckoo shows a score on the final report page. So how is that score computed? The answer is in [Python install dir]\Lib\site-packages\cuckoo\core\plugins.py.

It is pretty obvious: Cuckoo adds up the severity of every matched signature and divides by 5. However, if a malware configuration is extracted, the score is pinned to 10 (Cuckoo seems to aim for a 10-point scale, but there is actually no cap; after adding my own signatures, many samples exceeded this value, and some reached 20).

That is the end of this series. Finally, I wish you all good health and happy analyzing, and see you next time.


          How many android patterns are there?

This is a post I've had kicking around in my drafts folder for just over 2 years now so I've decided to publish it as a partly complete problem.

One of my favorite pastimes, when I'm bored, is solving the puzzles on Project Euler . I'm not very far through but I've solved 56 at the time of this writing. It's as much about writing the code and learning the language, in my case python, as it is about actually solving the problems.

The questions are good because they look like there must be a simple way to calculate the answer but it's not immediately obvious. An interesting question that I think is worthy of Project Euler is:

How many possible Android lock screen patterns are there? And how would you calculate it for arbitrarily sized grids?

Let's examine the standard size first, initially if we are just working with a 3 x 3 grid it might help to think of the positions as numbers from 1 to 9.



Initially, we might (incorrectly) think, there are 9 possible starting positions, then 8 remaining moves, then 7 and so on. So it would be 9!

>>> import math
>>> math.factorial(9)
362880

But then we can't use a pattern of fewer than 4 positions, so if we remove all the three-digit options

>>> math.factorial(9) - (9 * 8 * 7)
362376

However this is not right for a few reasons, first 9! (362880) is only the number of combinations of length 9 so to get all the possible combinations

>>> import itertools
>>> positions = [1, 2, 3, 4, 5, 6, 7, 8, 9]
>>> len(list(itertools.permutations(positions, 9)))  # 362880
>>> len(list(itertools.permutations(positions, 8)))  # 362880
>>> len(list(itertools.permutations(positions, 7)))  # 181440
>>> len(list(itertools.permutations(positions, 6)))  # 60480
>>> len(list(itertools.permutations(positions, 5)))  # 15120
>>> len(list(itertools.permutations(positions, 4)))  # 3024
# Total 985824

Second now we know how many combinations there are, we see that not all combinations are valid, for example while we can have 1234.



We can't have 1324 because there is no way to get from 1 to 3 without going through 2, even if you try to avoid it the line snaps to any positions it passes through.



I found a few incorrect solutions online which simply had a list of invalid moves such as from 1 to 3, from 7 to 9 and so on, but this is not correct either. We can't simply say that moving from 1 to 3 is always invalid because once a position has been used we can jump over it so we can have 2413 as a valid pattern which does go from 1 to 3.



This might be obvious, but just to clearly state it: while you can't jump over an unchecked position, you don't need to move to an adjacent position. For example, knight's moves are valid, so we can have 1834.



But just when we think we are getting a handle on things, LineageOS (previously CyanogenMod) throws a spanner in the works by allowing grids up to 6 x 6. For a larger grid, I think it's easier to switch to a coordinate system instead of numbered positions.


This brings in a whole new range of moves, for example [(0,3), (5,0), (2,5), (2,4), (2,3), (2,2), (2,1), (2,0), (5,5), (0,2)]

and it brings some new invalid moves, we can't go from [(0,0), (4,2)] without passing through (2,1)

After banging my head on a wall for a while, I searched online for a solution and the best answer I found was a 3 x 3 grid has 389112 possible patterns.

That's great, but every single correct solution I could find involved a brute force approach. Trying every possible combination and then discarding the invalid ones.

When it's just a simple 3 x 3 grid with only 985824 combinations to check brute force is not a bad way to go.
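For what it's worth, here is a rough sketch of that brute-force check for the 3 x 3 grid (my own illustration, not code from the post); the only subtle part is that a move may pass over a dot only when that dot has already been used.

from itertools import permutations

# Dots numbered 1-9 as above; (a, b) -> the dot that lies directly between a and b.
BETWEEN = {(1, 3): 2, (4, 6): 5, (7, 9): 8, (1, 7): 4,
           (2, 8): 5, (3, 9): 6, (1, 9): 5, (3, 7): 5}
BETWEEN.update({(b, a): m for (a, b), m in list(BETWEEN.items())})  # make it symmetric

def valid(pattern):
    used = set()
    for a, b in zip(pattern, pattern[1:]):
        used.add(a)
        mid = BETWEEN.get((a, b))
        if mid is not None and mid not in used:
            return False  # jumping over an unused dot is not allowed
    return True

total = sum(1 for length in range(4, 10)
            for p in permutations(range(1, 10), length)
            if valid(p))
print(total)  # 389112, matching the figure quoted below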

With a 4 x 4 grid (16 positions, over 4,000,000,000,000 combinations to check) brute force becomes incredibly hard but still within the realms of modern computers. By the time we get to 6 x 6 grids (36 positions, more than 2^128 combinations to check) it's downright impossible on current hardware.

There are some things we can do to speed things up though, for example the last two lengths (e.g. on the 3 x 3 grid, the combinations of length 8 and 9) will always have the same number of possible combinations because every combination of 8 positions has exactly one corresponding combination of 9 positions.

So the problem that I haven't been able to crack is, can we design an efficient algorithm that can calculate the number of possible moves on an arbitrarily sized grid? not just square grids, what about 3 x 9 for example.

All pictures generated with Lock Pattern Generator

If you are a maths genius and you have a solution please get in touch. I'd love to know and I'll update this post with a link to your solution, michael at hybr dot id dot au


          Want a faster crawler? Get to know aiohttp

This article was originally published on the WeChat public account 极客猴 (Geek Monkey); follow it to get new original posts first.

For a crawler program we usually care a lot about efficiency. Several factors affect it: whether multiple threads are used, I/O operations, and whether execution is synchronous. Of these, I/O and synchronous execution hurt crawler efficiency the most.

As everyone knows, Requests is an excellent HTTP library that makes issuing HTTP requests very simple. However, every network request it performs is synchronous. When the crawler process gets a CPU time slice while the program is doing I/O (for example downloading an image), the CPU sits idle for the duration of that I/O and its computing power is wasted.

If the CPU could make use of that waiting time, the crawler would get faster. That means reworking the program to turn synchronous I/O into asynchronous I/O. This article introduces a powerful asynchronous I/O library: aiohttp.

1 Introducing aiohttp

To talk about aiohttp we first have to talk about asyncio. asyncio is a standard library introduced in Python 3.4. It runs as single-threaded concurrency and performs I/O cooperatively through coroutines. asyncio's programming model is a message loop: we grab an EventLoop reference from the asyncio module and throw the coroutines we want to run into the EventLoop, and that gives us asynchronous I/O.

An example that implements an asynchronous function hello() with asyncio:

import asyncio

@asyncio.coroutine  # decorator, equivalent to asyncio.coroutine(hello())
def hello():
    print("Hello world!")
    # asynchronously call asyncio.sleep(1):
    r = yield from asyncio.sleep(1)
    print("Hello again!")

# get the EventLoop:
loop = asyncio.get_event_loop()
# run the coroutine
loop.run_until_complete(hello())
loop.close()

aiohttp, in turn, is an HTTP framework built on asyncio. Its full name is "Async http client/server framework". As the name says, aiohttp is split into a server side and a client side, both dedicated to handling HTTP requests asynchronously.

2 Installing aiohttp

aiohttp can be installed with pip; just run the install command in a terminal.

pip install aiohttp

3 async/await syntax

We saw asynchronous I/O above, but declaring an asynchronous function that way is clumsy and depends on the yield syntax. Python 3.5 introduced the async/await keywords, which make asynchronous code far more intuitive and pleasant to write.

Adding the async keyword in front of def marks the function as asynchronous; it replaces the @asyncio.coroutine syntax. A concrete example:

async def hello():
    print("Hello World!")

Similarly, await replaces yield from and marks that part of the operation as asynchronous.

async def hello():
    print("Hello World!")
    r = await asyncio.sleep(1)
    print("Hello again!")

Finally, to run the asynchronous function we still need an EventLoop reference and we run the coroutine on it. The final code looks like this:

import asyncio

async def hello():
    print("Hello world!")
    r = await asyncio.sleep(1)
    print("Hello again!")

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    tasks = [hello(), ]
    loop.run_until_complete(asyncio.wait(tasks))
    loop.close()

The output is as follows:

Hello world!
>> pauses for one second
Hello again!

4 Basic usage of aiohttp

We will use aiohttp to send a GET request to httpbin.org. Because aiohttp handles HTTP requests asynchronously, we must follow Python's asynchronous function syntax, that is, use async/await.

Issuing an HTTP request with aiohttp breaks down into the following steps:
1) define an asynchronous function with async
2) obtain a session object via aiohttp.ClientSession
3) use that session to request pages with GET, POST, PUT, and so on
4) finally get an EventLoop reference and run the asynchronous function.

import asyncio
import aiohttp

# define the asynchronous function main()
async def main():
    # get a session object
    async with aiohttp.ClientSession() as session:
        # request httpbin with GET
        async with session.get('http://httpbin.org/get') as response:
            print(response.status)
            print(await response.text())

loop = asyncio.get_event_loop()
loop.run_until_complete(main())

aiohttp supports custom headers, timeouts, proxies, custom cookies, and more.

import asyncio
import aiohttp

url = 'http://httpbin.org/post'
headers = {
    'User-agent': "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.36",
}
data = {
    'data': 'person data',
}

# define the asynchronous function main()
async def main():
    # get a session object
    async with aiohttp.ClientSession() as session:
        # request httpbin with POST
        async with session.post(url=url, headers=headers, data=data) as response:
            print(response.status)
            print(await response.text())

loop = asyncio.get_event_loop()
loop.run_until_complete(main())

For more aiohttp usage, read the official documentation yourself. To be honest, aiohttp is used much the same way as Requests: if you have already learned the Requests library, you will pick up aiohttp very quickly.
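One thing the article does not show (this sketch is mine, with placeholder URLs): the efficiency gain only really appears when several requests run concurrently, which asyncio.gather makes easy.

import asyncio
import aiohttp

urls = ['http://httpbin.org/get?i=%d' % i for i in range(10)]  # placeholder URLs

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def main():
    async with aiohttp.ClientSession() as session:
        # schedule all requests at once; the event loop overlaps their I/O waits
        results = await asyncio.gather(*(fetch(session, url) for url in urls))
        print(len(results))

loop = asyncio.get_event_loop()
loop.run_until_complete(main())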


          An image padding function implemented with Pillow

Pillow, i.e. PIL, the Python Imaging Library. Pillow is the standard image-processing library on the Python platform; it is very powerful and its API is easy to use.

This article shares an image padding function implemented with Pillow, pad_image, used to preprocess image datasets. In object-detection algorithms, the input image has to be converted to the size the model expects while keeping the aspect ratio unchanged, with the remaining area filled with gray.

The function works as follows:

Compute the width and height of the scaled image, shrinking or enlarging proportionally; call resize() to change the image size; create a new() image of the target size target_size; call paste() to place the scaled original image into the target image.

The implementation:

from PIL import Image  # import needed by the snippet below

def pad_image(image, target_size):
    iw, ih = image.size  # size of the original image
    w, h = target_size   # size of the target image
    scale = min(float(w) / float(iw), float(h) / float(ih))  # smallest scaling ratio
    # make sure at least one of width/height matches the target size
    nw = int(iw * scale)
    nh = int(ih * scale)
    image = image.resize((nw, nh), Image.BICUBIC)  # scale the image
    # image.show()
    new_image = Image.new('RGB', target_size, (128, 128, 128))  # create a gray image
    # // is integer division, used to compute the paste position
    new_image.paste(image, ((w - nw) // 2, (h - nh) // 2))  # paste the scaled image centred, with gray bars on the sides
    # new_image.show()
    return new_image

Test:

def main():
    img_path = 'xxxx.jpg'
    image = Image.open(img_path)
    size = (416, 416)
    pad_image(image, size)  # pad the image

if __name__ == '__main__':
    main()
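A small usage sketch (not in the original post) for preprocessing a whole folder of images before training; the directory names are made up:

import os
from PIL import Image

src_dir = 'raw_images'      # hypothetical input folder
dst_dir = 'padded_images'   # hypothetical output folder
os.makedirs(dst_dir, exist_ok=True)

for name in os.listdir(src_dir):
    img = Image.open(os.path.join(src_dir, name)).convert('RGB')
    padded = pad_image(img, (416, 416))   # the function defined above
    padded.save(os.path.join(dst_dir, name))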

Original image:

[Figure: the original image]

After padding:

[Figure: the padded image, centred with gray bars]

OK, that's all! Enjoy it!

For more algorithm tips, follow the WeChat public account 深度算法 (ID: DeepAlgorithm).


          Good news for developers! Google Cloud App Engine now supports Python 3.7

Report by 新智元 (AI Era). Source: Google Cloud. Authors: Stewart Reichling, Matthew Soldo. Translated by 三石.

[Editor's note] Today Google Cloud announced a major platform upgrade: the second-generation App Engine standard runtimes, with beta support for Python 3.7. Developers can now easily run web applications using the latest versions of the language, frameworks and libraries.

Python is very happy today!

Google Cloud has launched the second-generation App Engine standard runtimes, a major upgrade to the platform that lets developers easily run web applications built with the latest versions of popular languages, frameworks and libraries. If you want to try it out, hit the link below!

http://cloud.google.com/appengine/docs/standard/python3/

Key point: portable web applications are now supported!

The original version of App Engine predates Google Cloud Platform, and it bundled a rich set of services for modern web apps. When App Engine first launched, GCP services such as Cloud Datastore and Firebase Authentication did not yet exist, so App Engine APIs were built for common web-app features such as storage, caching and queueing. That helped developers write applications with minimal setup, but it also reduced code portability.

The new Python 3.7 runtime supports the Google Cloud client libraries, so developers can integrate GCP services into their applications and run them on App Engine, Compute Engine or any other platform. At the moment, the original App Engine APIs are the only thing not yet available in the second-generation runtimes (including Python 3.7).

Why Python 3.7?

Python 3.7 is one of the new second-generation runtimes announced at Cloud Next. Built on gVisor, a lightweight container-sandbox technology, these second-generation runtimes remove many of App Engine's previous restrictions, letting developers write portable web applications and microservices while still benefiting from App Engine's unique auto-scaling, built-in security and pay-per-use billing model.

For more than a decade developers have chosen App Engine for its fully managed developer experience, and Python has always been a key part of that experience: Google launched App Engine in 2008 with Python 2.5.2. The introduction of the Python 3.7 runtime lets Google keep up with the latest advances in the language community, and as a second-generation runtime it enables faster, continuous runtime updates.

The new runtime lets developers take advantage of Python's ecosystem of open-source libraries and frameworks. While the Python 2 runtime only allowed a whitelist of specific library versions, Python 3 supports arbitrary third-party libraries, including those that rely on C code and native extensions. Just add Django 2.0, NumPy, scikit-learn or the library of your choice to your requirements.txt file; when you deploy your application, App Engine installs those libraries in the cloud.
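As a concrete illustration (mine, not from the original article): a minimal Python 3.7 service on App Engine standard consists of an app.yaml containing the single line "runtime: python37", a requirements.txt listing your dependencies (for example Flask), and a WSGI app such as the sketch below. The route and message here are made up.

# main.py - minimal WSGI app for the App Engine Python 3.7 standard runtime
# assumes app.yaml contains:  runtime: python37
# and requirements.txt lists Flask
from flask import Flask

app = Flask(__name__)

@app.route('/')
def index():
    # placeholder response for illustration
    return 'Hello from Python 3.7 on App Engine!'

if __name__ == '__main__':
    # local development server; in production App Engine serves the "app" object
    app.run(host='127.0.0.1', port=8080, debug=True)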


          A Go-language version of forgery

If you have used Python you may have used forgery_py, a tool for generating fake data. It can fake many kinds of commonly needed data, which is very handy during development and for demos. But there was no Go version, so I decided to roll up my sleeves and build one.

Starting from the source code

The forgery_py page on PyPI shows a sample session:

>>> import forgery_py
>>> forgery_py.address.street_address()
u'4358 Shopko Junction'
>>> forgery_py.basic.hex_color()
'3F0A59'
>>> forgery_py.currency.description()
u'Slovenia Tolars'
>>> forgery_py.date.date()
datetime.date(2012, 7, 27)
>>> forgery_py.internet.email_address()
u'brian@zazio.mil'
>>> forgery_py.lorem_ipsum.title()
u'Pretium nam rhoncus ultrices!'
>>> forgery_py.name.full_name()
u'Mary Peters'
>>> forgery_py.personal.language()
u'Hungarian'

From these calls we can see that forgery_py contains a set of *.py files, each providing methods that implement a particular feature. Let's analyse forgery_py's source to see how it works.

# top level of the ForgeryPy package
├── dictionaries            # the fake-data sources: a directory of plain-text files
├── dictionaries_loader.py  # script that loads those files
├── forgery                 # main package implementing the fake-data functions (Python files)
├── __init__.py

Now look at a script under the forgery directory:

$ cat name.py
import random

from ..dictionaries_loader import get_dictionary

__all__ = [
    'first_name', 'last_name', 'full_name', 'male_first_name',
    'female_first_name', 'company_name', 'job_title', 'job_title_suffix',
    'title', 'suffix', 'location', 'industry'
]

def first_name():
    """Random male of female first name."""
    _dict = get_dictionary('male_first_names')
    _dict += get_dictionary('female_first_names')
    return random.choice(_dict).strip()

__all__ lists the methods that can be called from outside.

first_name() is a typical forgery_py fake-data method; analysing it alone is enough to understand how forgery_py works.

The method is short: _dict = get_dictionary('male_first_names') and _dict += get_dictionary('female_first_names') fetch and merge the data, and the final return random.choice(_dict).strip() returns a random entry. The key piece is get_dictionary(), so we need to look at dictionaries_loader.py, where it lives.

$ cat dictionaries_loader.py
import random

DICTIONARIES_PATH = abspath(join(dirname(__file__), 'dictionaries'))

dictionaries_cache = {}

def get_dictionary(dict_name):
    """
    Load a dictionary file ``dict_name`` (if it's not cached) and return its
    contents as an array of strings.
    """
    global dictionaries_cache

    if dict_name not in dictionaries_cache:
        try:
            dictionary_file = codecs.open(
                join(DICTIONARIES_PATH, dict_name), 'r', 'utf-8'
            )
        except IOError:
            None
        else:
            dictionaries_cache[dict_name] = dictionary_file.readlines()
            dictionary_file.close()

    return dictionaries_cache[dict_name]

That is all of dictionaries_loader.py with its comments stripped. The idea: define a global dict dictionaries_cache as a cache, plus a get_dictionary() method that fetches the source data. Every time a method under forgery is called it first checks the cache; on a hit the data is returned directly, otherwise the corresponding file under dictionaries is read, written into the cache, and then returned.

In short, forgery_py works like this: a method call reads the in-memory cache; if the data is there it is returned directly, otherwise it is read from the corresponding text file and cached. A random entry from the returned data is then chosen as the result.

Implementing it in Go

Having understood how forgery_py works, we can implement the same thing in Go.

# basic layout of forgery
$ cat forgery
├── dictionaries            # data sources
│   ├── male_first_names
├── name.go                 # the concrete features
└── loader.go               # data loading

Following the Python version, we create the corresponding directories.

Loading and caching the data:

// forgery/loader.go
package forgery

import (
    "os"
    "io"
    "bufio"
    "math/rand"
    "time"
    "strings"
)

// global cache map
var dictionaries map[string][]string = make(map[string][]string)

// return a random, trimmed entry from the loaded data
func random(slice []string) string {
    rand.Seed(time.Now().UnixNano())
    n := rand.Intn(len(slice))
    return strings.TrimSpace(slice[n])
}

// the main data-loading function
func loader(name string) (slice []string, err error) {
    slice, ok := dictionaries[name]
    // cache hit: return directly
    if ok {
        return slice, nil
    }
    // otherwise read the corresponding file
    file, err := os.Open("./dictionaries/" + name)
    if err != nil {
        return slice, err
    }
    defer file.Close()
    rd := bufio.NewReader(file)
    for {
        line, err := rd.ReadString('\n')
        slice = append(slice, line)
        if err != nil || io.EOF == err {
            break
        }
    }
    dictionaries[name] = slice
    return slice, nil
}

// shared error handling
func checkErr(err error) (string, error) {
    return "", err
}

The concrete feature:

// forgery/name.go

// Random male of female first name.
func FirstName() (string, error) {
    slice, err := loader("male_first_names")
    checkErr(err)
    slice1, err := loader("female_first_names")
    checkErr(err)
    slice = append(slice, slice1...)
    return random(slice), nil
}

With that, the Python forgery_py has been reimplemented in Go.

Finally

The above only touches on the working principle; the full source code is at https://github.com/xingyys/fo... Many thanks to https://github.com/tomekwojci... , who provided the overall approach and the data sources. I basically just did the translation work.


          [Translation] Building an NLP pipeline with Python: from the idea to concrete code, all in one article

Intended readers: people interested in NLP who want to learn the reasoning behind each processing step and practise on example code.

Reading time: the full text is about 2,000 words.

Give someone a fish and you feed them for a day; teach them to fish and you feed them for a lifetime. Today's article is translated and published with the authorization of its author, Adam Geitgey. It not only gives concrete code, it also walks through the principles and reasoning step by step, so whether you work on English or Chinese text processing, it offers genuinely transferable guidance.

Adam Geitgey graduated from Georgia Tech and was previously director of software engineering at the group-buying site Groupon. He now works as a software engineering and machine learning consultant, course author, and instructor for LinkedIn Learning.

How do computers understand human language?

Getting machines to understand human language is very hard. Computers excel at structured data, but human language is complex, fragmented, loose, sometimes illogical, and people often say one thing while meaning another.

If a straight guy cannot figure out why his girlfriend is angry, a computer certainly cannot tell whether A calling B 孙子 ("grandson") is naming a relative, hurling an insult, or just joking between friends.

Faced with humans, the computer is the ultimate clueless literalist.

Yet advances in artificial intelligence keep convincing us that one day computers will understand what people say, perhaps even converse like a real person. So let's begin this beautiful tutorial.

Building an NLP pipeline

London is the capital and most populous city of England and the United Kingdom. Standing on the River Thames in the south east of the island of Great Britain, London has been a major settlement for two millennia. It was founded by the Romans, who named it Londinium.

-- Wikipedia

Step 1: Sentence segmentation

The passage about London above can be split into three sentences:

1. London is the capital and most populous city of England and the United Kingdom.

2. Standing on the River Thames in the south east of the island of Great Britain, London has been a major settlement for two millennia.

3. It was founded by the Romans, who named it Londinium.

Step 2: Tokenization

Because Chinese tokenization follows different rules from English, we keep working with the original English text here, one sentence at a time, starting with the first:

"London", "is", "the", "capital", "and", "most", "populous", "city", "of", "England", "and", "the", "United", "Kingdom", "."

Tokenizing English is relatively simple: whatever sits between two spaces can be treated as a word, and since punctuation carries meaning too, each punctuation mark is also treated as a token.

Step 3: Part-of-speech tagging

We need to work out the role each word plays in the sentence: is it a noun, a verb, or a preposition? We use a part-of-speech (POS) tagging model that has been pre-trained on millions of English sentences:

[Figure: tokens being fed into the POS-tagging model]

One thing to remember: the model only assigns tags based on statistics. It does not understand what a word actually means, which is completely different from how humans understand words.

The result:

[Figure: POS tags for the first sentence]

We can see two nouns: London and capital. London is a unique name and capital is a generic term, so we can infer that this sentence is probably talking about London.

Step 4: Lemmatization

Many alphabet-based languages such as English, French and German inflect their words, for example for number and tense. For instance:

I had a pony

I have two ponies

Both sentences are really about a pony. "Ponies" and "pony", "had" and "have" are just different forms of the same words, but since the computer does not know what they mean, they look like entirely different things to it.

Teaching the computer this is called lemmatization. Lemmatizing the first sentence of the London passage gives the figure below:

[Figure: lemmatized tokens for the first sentence]

Step 5: Identifying stop words

Stop words: in information retrieval, to save storage space and improve search efficiency, certain words are automatically filtered out before or after processing natural-language data (or text); these are called stop words. Stop-word lists are curated by hand rather than generated automatically, and no single list suits every tool; some tools even deliberately avoid removing stop words so that phrase search still works.

-- Wikipedia

Back to the first sentence:

[Figure: the first sentence with stop words greyed out]

The greyed-out words only serve to connect and support the rest of the sentence. To the computer they are mostly noise, so we want to identify them.

As Wikipedia notes, plenty of stop-word lists exist, but you must configure them for your own situation. "The" is usually a stop word in English, yet many band names contain it: The Doors, The Who, even a band simply called The The! In those cases it cannot be treated as a stop word.

Step 6: Dependency parsing

Next we parse the dependency relations between the words in the sentence and build a dependency tree. The root of the tree is the main verb; starting from it, every word in the sentence is linked together.

[Figure: dependency tree for the first sentence]

From the tree, the subject is London, and it is connected to capital through "be". The computer now knows that London is a capital, and by continuing in this way it is trained to accumulate more and more information.

Because human language is ambiguous, the model still cannot handle every case, but the more training we give it, the more accurate the NLP model becomes. Demo address:

https://explosion.ai/demos/displacy?utm_source=AiHl0

We can also choose to merge related words into groups, for example a noun plus the adjectives that modify it into a noun phrase. This step is optional and depends on the task at hand.

[Figure: noun phrases grouped together]

Step 7: Named entity recognition

With the work above done, we can hand the nouns to an existing named entity recognition (NER) system to label them. For example, we can pick out the geographic names in the first sentence:

[Figure: named entities highlighted in the first sentence]

You can also try it online via the link below: paste in any English text and it will identify which categories of nouns it contains:

https://explosion.ai/demos/displacy-ent?utm_source=AiHl0

Step 8: Coreference resolution

Human language is complex, yet in everyday use we constantly simplify and omit. He, it, this, that, the former, the latter... plus abbreviations: 北京大学 (Peking University) is usually shortened to 北大, and 中华人民共和国 is simply called 中国. This is the phenomenon of coreference.

In context a human can effortlessly tell whether "it" refers to a cow or a phone, but a computer needs coreference resolution to know that in the sentence

"It was founded around AD 50 by the Romans, who named it Londinium"

the word "it" refers to London, not to Rome, and certainly not to anything that merely sounds similar.

[Figure: coreference links between pronouns and entities]

Coreference resolution is, relatively speaking, the hardest of all the stages in the pipeline we are building.

Coding

Good, the ideas are all covered; now for the coding part. First let's recap the workflow:

[Figure: the complete NLP pipeline]

Note: the steps above are just the standard flow; in real work you should arrange them according to the specific needs and constraints of your project.

Installing spaCy

We assume you already have Python 3 installed; if not, you know what to do. Next, install spaCy:

[Image: the installation commands]

Once it is installed, use the following code:

[Image: the example code]
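The commands and code above appear only as images in the original post. As a rough reconstruction (based on spaCy 2.x conventions, so the exact snippets in the article may differ): install with "pip install spacy", download the large English model with "python -m spacy download en_core_web_lg", and then run something like this:

import spacy

# load the large English model downloaded beforehand with:
#   python -m spacy download en_core_web_lg
nlp = spacy.load('en_core_web_lg')

text = ("London is the capital and most populous city of England and "
        "the United Kingdom. Standing on the River Thames in the south east "
        "of the island of Great Britain, London has been a major settlement "
        "for two millennia. It was founded by the Romans, who named it Londinium.")

# run the whole spaCy pipeline (tokenizer, tagger, parser, NER) on the text
doc = nlp(text)

# print every named entity together with its label (GPE, FAC, DATE, NORP, PERSON, ...)
for entity in doc.ents:
    print(entity.text, entity.label_)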

The result looks like this:

[Image: the entities and labels found in the text]

GPE: geographic or political entities (place names)

FAC: facilities, buildings

DATE: dates

NORP: nationalities, religious or political groups (countries, regions)

PERSON: people's names

We can see that because Londinium is not a common place name, spaCy makes a bold guess that it might be a person's name.

Let's go one step further and build a data scrubber. Suppose you have been handed a national hotel check-in register and you want to find and replace the guests' names without touching nouns such as hotel names and place names. You can do it like this:

[Image: the scrubbing code]
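Again, the code is shown only as an image in the post; it most likely resembles the following spaCy 2.x sketch (the sample sentence is my own):

def replace_name_with_placeholder(token):
    # replace any token that was tagged as part of a person's name
    if token.ent_iob != 0 and token.ent_type_ == 'PERSON':
        return '[REDACTED] '
    return token.string

def scrub(text):
    doc = nlp(text)
    # merge multi-word entities into single tokens so whole names are replaced
    for ent in doc.ents:
        ent.merge()
    tokens = map(replace_name_with_placeholder, doc)
    return ''.join(tokens)

print(scrub("In 1950, Alan Turing published his famous article "
            "'Computing Machinery and Intelligence'."))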
Every token labelled [PERSON] is replaced with REDACTED. The final result:

[Image: the text with all person names redacted]
Extracting detailed information

Using the nouns that spaCy has identified and located, textacy can pull the information out of an entire article. We copy the whole Wikipedia introduction to London into the following code:

[Image: the extraction code]
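The hidden code is, in all likelihood, a call to textacy's semi-structured statement extraction; a sketch along those lines (check your textacy version, as this API has moved around between releases):

import textacy.extract

# pull out "semi-structured statements": things the text asserts about London
statements = textacy.extract.semistructured_statements(doc, 'London')

for statement in statements:
    subject, verb, fact = statement
    print(' -', fact)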

You will get results like the following:

[Image: a long list of facts extracted about London]

With this much useful information, we can apply it in many scenarios, for example related suggestions in search results:

[Image: a search results page showing facts about London]

We can achieve the effect in the picture above with the method below:

[Image: the code for generating the suggestions]

Because of the WeChat platform's restrictions, the code was turned into images. If you want the plain-text code, copy the link below into your browser:

http://t.cn/RgCITGj?utm_source=AiHl0

          Python Programming Bible

Python Programming Bible
Description

Featured on: Aug 9, 2018

Python programming. Application programming. Semantics of general programming. The format of coding applications. All the functions of Python programming. Gain a good understanding of the following concepts with this course:
What Python is
How to program in the Python language
Features of the Python programming language
Coding semantics
Website programming
Design practices of applications
Application programming
GUI programming
CGI programming
Network programming
Email programming
XML programming
          Object-oriented Game of Life in Python
Intro

I have been planning to build a superior GUI alternative to NetLogo for a long time. When I finally started working on a project, I decided to test the basic implementations in Python. Sure enough, the first app was Conway's Game of Life, a great special case of cellular automata.

Imagine my surprise when I couldn't find an object-oriented implementation of the game in Python: everything was written either in Java or in imperative Python.

So, I decided to build the model. I used two classes: Person, which corresponds to each cell, and Game, which controls the system dynamics. A great article by Giorgio Sironi also advised creating a third class, Generation, but I didn't find it necessary.

Model

So, first, I imported three libraries that I used:

from math import ceil, floor, sqrt
import random
from matplotlib import pyplot as plt

Then I defined a Person class with (x,y) coordinates and an empty vector of alive/dead statuses for every time period. Every Person object has a neighbours method which returns adjacent cells, plus kill and resurrect methods to change the life status:

class Person:
    people = []

    def __init__(self, x, y, alive):
        self.x = x
        self.y = y
        self.alive = alive
        Person.people.append(self)
        return

    def kill(self, t):
        self.alive[t] = False
        return

    def resurrect(self, t):
        self.alive[t] = True
        return

    def neighbours(self):
        a = []
        people = Person.people
        a += [z for z in people if z.x == self.x and z.y == self.y+1]
        a += [z for z in people if z.x == self.x and z.y == self.y-1]
        a += [z for z in people if z.x == self.x-1 and z.y == self.y]
        a += [z for z in people if z.x == self.x-1 and z.y == self.y+1]
        a += [z for z in people if z.x == self.x-1 and z.y == self.y-1]
        a += [z for z in people if z.x == self.x+1 and z.y == self.y-1]
        a += [z for z in people if z.x == self.x+1 and z.y == self.y+1]
        a += [z for z in people if z.x == self.x+1 and z.y == self.y]
        return a

    def alive_neighbours(self, t):
        a = [z for z in self.neighbours() if z.alive[t]]
        return a

Then I defined a Game class with a setup method, which randomly draws the initial state, and a stage method, which executes the system dynamics:

class Game:
    def __init__(self, n, m, t):
        self.n = n
        self.m = m
        self.t = t
        return

    def setup(self):
        for i in range(self.n):
            for j in range(self.m):
                a = [False]*self.t
                a[0] = random.choice([True, False])
                Person(i, j, a)
        return

    def stage(self, t):
        for person in Person.people:
            if person.alive[t-1]:
                if len(person.alive_neighbours(t-1)) < 2:
                    person.kill(t)
                if len(person.alive_neighbours(t-1)) in [2, 3]:
                    person.resurrect(t)
                if len(person.alive_neighbours(t-1)) > 3:
                    person.kill(t)
            else:
                if len(person.alive_neighbours(t-1)) == 3:
                    person.resurrect(t)
        return

    def play(self):
        self.setup()
        for i in range(1, self.t):
            self.stage(i)
        return

    def results(self):
        people = Person.people
        a = [[z.x, z.y, z.alive] for z in people]
        print(a)
        return a

So, this was it. Now you just have to input the number of rows and columns (x, y) and the number of time periods t, and play the game. The resulting plots will be exported to your working directory:

# game:
a = Game(x, y, t)
a.play()
a.results()

# plot:
for j in range(t):
    b = []
    for i in range(x):
        b.append([z[2][j] for z in a.results() if z[0] == i])
    plt.spy(b)
    plt.savefig(f"time{j}.png")

For presentation purposes I animated the resulting graphs:


[Animation: "Game of life" dynamics]

Conclusion

Here we go, I have built and shared the very basic object-oriented implementation of the very basic game in the most popular language, and somehow I ended up being the first one to do it. So, I will just leave it here as a starting point for future learners.

Be sure to check my other posts in ravshansk.com/blog and to subscribe to my Twitter @ravshansk .


          Why can't I iterate over a Python Counter?

Why is it that when I try the code below, I get "need more than 1 value to unpack"?

for key,value in countstr:
    print key,value

    for key,value in countstr:
ValueError: need more than 1 value to unpack

However this works just fine:

for key,value in countstr.most_common():
    print key,value

I don't understand, aren't countstr and countstr.most_common() equivalent?

EDIT: Thanks for the below answers, then I guess what I don't understand is: If countstr is a mapping what is countstr.most_common() ? -- I'm really new to python, sorry if I am missing something simple here.

No, they're not. Iterating over a mapping (be it a collections.Counter or a dict or ...) iterates only over the mapping's keys.

And there's another difference: iterating over the keys of a Counter delivers them in no defined order. The order returned by most_common() is defined (sorted in reverse order of value).
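To make the difference concrete, here is a short illustration of my own (not part of the original answer; written for Python 3, and the sample string is arbitrary):

from collections import Counter

countstr = Counter("abracadabra")

# iterating the Counter itself yields only keys, like a dict
for key in countstr:
    print(key, countstr[key])

# key/value pairs require .items() ...
for key, value in countstr.items():
    print(key, value)

# ... while most_common() yields (key, count) tuples sorted by count
for key, value in countstr.most_common():
    print(key, value)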


          Change a single item in the list to float or other transformation

Let's say I have this data:

[[u'alpha',u'0.1'],[u'bravo',u'0.2']]

What I want to achieve is to change the 2nd element of each item to float.

Here is my code. It is able to produce the output that I want, but I'm looking for something simpler and more generic. For example, the element might be the 99th out of 100, or I might want to change the first element to title case.

I'm exploring map and lambda but cannot see how to use it in this case.

#!/bin/env python
data = [[u'alpha',u'0.1'],[u'bravo',u'0.2']]
print data

tgb=[]
for item in data:
    rfv=[]
    for x,elem in enumerate(item):
        if x == 1:
            rfv.append(float(elem))
        else:
            rfv.append(elem)
    tgb.append(rfv)
print tgb

Output:

[[u'alpha', u'0.1'], [u'bravo', u'0.2']]
[[u'alpha', 0.1], [u'bravo', 0.2]]

You could do the following:

def convert(x):
    try:
        return float(x)
    except ValueError, e:
        return x.title()

data = [[u'alpha',u'0.1'], [u'bravo',u'0.2'], [u'charlie', u'0.1', u'0.2', u'0.3', u'0.4']]
data = [[convert(element) for element in entry] for entry in data]
print data

This will attempt to convert all items to floats, but leave them as strings if they cannot be converted in title format. This would display the following output:

[[u'Alpha', 0.1], [u'Bravo', 0.2], [u'Charlie', 0.1, 0.2, 0.3, 0.4]]

This would also work for 100 elements. Note, it would not be possible to use a lambda for the convert() function as it would need to be a single expression.

Alternatively, map() could be used to give the same results:

data = [map(convert, entry) for entry in data]
          Using Python and C together

Python and C are two of the most popular and greatest languages to come out of programming. Many languages can communicate with each other very easily, but with Python and C it's a little tricky. First, though, let's start with why we would want this.

There's no doubt that both of these languages are powerful and incredibly useful. It sometimes pays off to have the raw performance of C inside a Python project; it can certainly help with things such as reducing response and processing times.

What we'll need

Python, and C. That's it.

The code

In this example I'll be using a simple Fibonacci function to demonstrate it all.

#include <Python.h>

// create the function like you normally would in C
int CFib(int n){
    if(n < 2)
        return n;
    else
        return CFib(n - 1) + CFib(n - 2);
}

// this function binds our Python version and our C version together
// it takes one and only one non-keyword argument
static PyObject* fib(PyObject* self, PyObject* args)
{
    int n;
    if(!PyArg_ParseTuple(args, "i", &n))
        return NULL;
    return Py_BuildValue("i", CFib(n));
}

In the code we can see that we require the Python.h header file; it contains all the relevant methods, functions and attributes we need to let the two languages work together. We first create the function normally in C, then, using the tools from the Python header file, we create it again with a few more arguments.

As you can see, we're using builders and parsers inside the function. These communicate between the two languages to create the Python-callable version of the function(s) written in C.

Additionally, we need a small setup script written in Python:

from distutils.core import setup, Extension

setup(name='ModuleName',
      version='1.0',
      ext_modules=[Extension('ModuleName', ['Fib.c'])])

Most of it speaks for itself. We're importing from a library built into Python 3+ that lets us run setup and extension scripts; both are compatible with C and the Python header file.

In order to run this and save it as our own project we need to run these two commands

python setup.py build
python setup.py install

This will then allow you to call your module from any other Python project.

And now for the grand finale

import ModuleName  # really should've chosen a better name
ModuleName.CFib(2)

prints out 1
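A quick way to see the payoff (not in the original post) is to time the C extension against a pure-Python equivalent; the module and function names match the example above, and the input size is arbitrary:

import timeit

import ModuleName  # the C extension built above

def py_fib(n):
    # pure-Python version, for comparison only
    if n < 2:
        return n
    return py_fib(n - 1) + py_fib(n - 2)

print(timeit.timeit('ModuleName.CFib(25)', globals=globals(), number=100))
print(timeit.timeit('py_fib(25)', globals=globals(), number=100))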

This is available on my GitHub if you wish to fork it

Click me!

In addition this is a very basic version, you can follow up on how to go more indepth over at The Python tutorial site

Any errors or suggestions feel free to let me know <3


          VLOOKUP and SUMIF: Replicate in Python

Oftentimes, a new user of Python will wish to replicate analysis previously done in Excel. Two major instances of this are the VLOOKUP and SUMIF commands.

VLOOKUP: combining data through a common index
SUMIF: summing up values by category

Let's take a look at how we can replicate these commands in Python. You will see that the pandas library offers quite a degree of flexibility when it comes to summarizing and wrangling data in Python in this way.

Firstly, we will import our libraries as standard, set our file path, and import the relevant datasets:

import pandas as pd
import numpy as np
import os; os.getcwd()

path='/home/michaeljgrogan/Documents/a_documents/computing/data science/datasets'
os.chdir(path)

sales=pd.read_csv('sales.csv')
sales
customers=pd.read_csv('customers.csv')
customers

Note that the datasets “customers.csv” and “sales.csv” are available on the “Datasets” page.

VLOOKUP with merge

In the sales dataset, you will notice that we have Date, ID and Sales figures. In the customers dataset, we also have a corresponding ID variable, but we do not have a date column.

Suppose that we wish to combine the “Date” variable in the sales dataset to the rest of the data in the customers dataset. Ordinarily, we would use VLOOKUP or INDEX-MATCH in Excel to do this. However, let us see how this can be done in Python:

#VLOOKUP
sales.merge(customers, on='ID', how='right')

Once we have done this, you will see that we have the two datasets merged, with the Date variable on the left and the rest of the data on the right:

>>> sales.merge(customers, on='ID', how='right')
          Date  ID   Sales  Age                       Country
0   2014-02-12  49  113769   23           Trinidad and Tobago
1   2014-02-14  57  122965   46                     Singapore
2   2014-03-18   2  164556   28                       Lao PDR
..         ...  ..     ...  ...                           ...
98  2016-12-06  32  126092   33  Saint Vincent and Grenadines
99  2016-12-27  45  117126   47     Iran, Islamic Republic of

SUMIF with groupby

Oftentimes, an Excel user will use the SUMIF function to sum up different values by category. This can also be replicated in Python.

Let's firstly create a new dataframe in pandas. Note that the numeric columns are created as integers here; if they were stored as strings, "summing" them would concatenate the text instead of adding the numbers:

df1 = pd.DataFrame({'names': ['John', 'Elizabeth', 'Michael', 'John', 'Elizabeth', 'Michael'],
                    'webvisits': [24, 32, 40, 71, 65, 63],
                    'minutesspent': [20, 41, 5, 6, 48, 97]},
                   index=[0, 1, 2, 3, 4, 5])

Essentially, what we wish to do here is group by name, and get the total "minutesspent" for each person.

#SUMIF
df1.groupby("names")["minutesspent"].sum()

Now, we have our results:

names
Elizabeth     89
John          26
Michael      102
Name: minutesspent, dtype: int64

Further Reading
Dataquest: Pandas Cheat Sheet
Python for Data Science
Data Cleaning and Wrangling in R

Thanks for viewing this short tutorial, and please leave any questions in the comments below!


          Python Community Interview With Mike Driscoll

Welcome to the first in a series of interviews with members of the Python community.

If you don’t already know me, my name is Ricky, and I’m theCommunity Manager here at Real Python. I’m a relatively new developer, and I’ve been part of the Python community since January, 2017, when I first learned Python.

Prior to that, I mainly dabbled in other languages (C++, php, and C#) for fun. It was only after I fell in love with Python that I decided to become a “serious” developer. When I’m not working on Real Python projects, I make websites for local businesses.

This week, I’m talking to Mike Driscoll of Mouse Vs Python fame. As a long-time Python advocate and teacher, Mike shares his story of how he came to be a Python developer and an author. He also shares his plans for the future, as well as insight into how he would use a time machine…

Let’s get started.

Ricky: I’d like to start by learning how you got into programming, and how you came to love Python?


Python Community Interview With Mike Driscoll

Mike:I decided to be some kind of computer programmer when I went to college. I started out in computer science and then somehow ended up with an MIS degree due to some confusing advice I received long ago from a professor. Anyway, this was back right before the internet bubble burst, so there were no jobs in tech when I graduated. After working as the sole member of an I.T. team at an auction house, I was hired by the local government to be a software developer.

The boss at that place loved Python, and I was required to learn it because that was what all new development would be done in. Trial by fire! It was a stressful couple of months of turning Kixtart code into Python code for our login scripts. I also was challenged to find a way to create desktop user interfaces in Python so we could migrate away from these truly awful VBA applications that were created on top of MS Office.

Between my boss loving Python and me having so much fun learning it and using it on the job, I ended up loving it too. We made GUIs with wxPython, reports with ReportLab, web applications with TurboGears, and much more with just vanilla Python.

Ricky: You’ve been writing on your blog, Mouse Vs Python, for over 10 years now. How have you kept so consistent and motivated to write each week?

Mike:I’m not always consistent. There have been some gaps where I didn’t write much at all. There was a year where I had stopped writing for the most part for several months. But I noticed that my readership had actually grown while I was taking a break. I actually found that really motivating because there were so many people reading old posts, and I wanted my blog to continue to stay fresh.

Also, my readers have always been pretty supportive of my blog. Because of their support, I have been committed to writing on the blog whenever I can or at least jot down some ideas for later.

Ricky: You’ve also authored five books to date, with Python Interviews: Discussions with Python Experts being released earlier this year. Having spoken with so many highly prominent developers in the Python community, what tips or wisdom have you personally taken away from the book that have helped you develop (either professionally or personally)?

Mike:I really enjoyed speaking with the developers while working on the Python Interviews book. They were quite helpful in fleshing out the history of Python and PyCon USA as well as the Python Software Foundation.

I learned about where some of the core developers think Python might go in the future and also why it was designed the way it was in the past. For example, I hadn’t realized that the reason Python didn’t have Unicode support built-in at the beginning was that Python actually pre-dates Unicode by several months.

I think one of the lessons learned is how big data science and education are for Python right now. A lot of people I interviewed talked about those topics, and it was fun to see Python’s reach continue to grow.

Ricky: I’ve noticed you’ve started creating YouTube videos again for your Python 101 series. What made you decide to start creating video content again?

Mike:The Python 101 screencast was something I put together as an offshoot of the Python 101 book. While a lot of publishers say that video content is growing in popularity, my experience has been the opposite. My screencast series never had a lot of takers, so I decided to just share it with my readers on YouTube. I will be posting most or all of the series there and probably discontinue it as a product that I sell.

I think I need more experience creating video training, so I also plan to do more videos on other topics in Python and see how they are received. It’s always fun to try out other methods of engagement with my audience.

Ricky: Not only do you do so much for the online community, but you also founded and run your local Python user group. What advice would you give to someone (like me) who might be looking to go to their first local user group meeting?

Mike:Pyowa, the local Python group that I founded, now has several organizers, which is really nice. But back to your question. If you want to go to a group, the first thing to do is to find out where and if one exists near you. Most groups are listed on the Python wiki .

Next, you need to look up their website or Meetup and see what their next meeting is about. Most of the meetings I have been to in Iowa have some form of social time at the beginning, or end, or both. Then they have a talk of some sort or some other activity like mob programming or lightning talks. The main thing is to come prepared to talk and learn about Python. Most of the time, you will find that the local user groups are just as welcoming as the people who attend PyCon are.

Ricky: If you could go back in time, what would you change about Python? Is there something you wish the language could do? Or maybe there’s something you’d like to remove from the language, instead?

Mike:I wish Guido had been able to convince Google’s Android engineering department to include Python as one of the languages used natively in Android. As it is, we currently don’t have much in the way of writing applications for mobile besides Toga and Kivy. I think both of these libraries are pretty neat, but Toga is still pretty beta, especially on Android, and Kivy doesn’t look native on anything that it runs on.

Ricky: I love celebrating the wins in life, big and small. What has been your proudest Python moment so far?

Mike:Personally, I am proud of writing about Python in book and blog form and having so many readers who have found my ramblings helpful. I am also proud to know so many great people in the community who will help each other in many meaningful ways. It’s like having a network of friends that you haven’t even necessarily met. I find this unique to the Python community.

Ricky: I’m curious to know what other hobbies and interests you have, aside from Python? Any you’d like to share and/or plug?

Mike:Most of my spare time is spent playing with my three-year-old daughter. However, I also enjoy photography. It can be challenging to get the shot you want, but digital photography also makes it a lot easier since you can get instant feedback and adjust if you messed it up, assuming your subject is willing.

If you’d like to follow Mike’s blog or check out any of his books, head over to his website . You can also message Mike to say “Hi” on Twitter and YouTube .

Is there someone you’d like us to interview in the community? Leave their name below, and they just might be next.


          Python scraped 1.213 million Dianping reviews to tell you where the crayfish is truly outstanding!

As the saying goes, in the dog days of summer you fish at dawn and dusk. And wouldn't beer with crayfish be just perfect? This article is a foodie's guide that digs into where the truly outstanding crayfish can be found, for all the crayfish lovers out there.

[Cover image source: 金渡广告摄影]

Getting the data

For this analysis we crawled every restaurant on Dianping (大众点评) tagged with "crayfish" (小龙虾).

[Figure: a Dianping restaurant listing page showing price, reviews, recommended dishes and ratings]

As the screenshot shows, for each restaurant we can obtain the average spend per person, the number of reviews, the recommended dishes and the ratings (taste, environment, service) for later analysis. In total we crawled 225 cities, 6,758 restaurants and 1.213 million reviews.

Here is part of the core code:

def find_city_page(path):
    data = pd.read_excel(path)
    city_lobster_page = pd.DataFrame()
    driver = webdriver.Chrome()
    for i in range(0,len(data)):
        try:
            js='window.open("'+data['city_lobster_url'][i]+'")'
            driver.execute_script(js)
            bsObj = BeautifulSoup(driver.page_source,'html.parser')
            bs = bsObj.find_all('a',attrs={'class':'PageLink'})
            this_city_lobster={'city_name':data['city_name'][i], 'page_num':max([int(l.text ) for l in bs])}
            city_lobster_page = city_lobster_page.append(this_city_lobster,ignore_index=True)
        except:
            continue
    return city_lobster_page


Comparing cities

The first thing to analyse is each city's crayfish "heat". Using the total number of reviews of crayfish-tagged restaurants as the basis of comparison, the TOP 20 cities are:

[Figure: TOP 20 cities by total review count]

Shanghai leads by a wide margin, probably for two reasons: a) Shanghai has a large number of crayfish restaurants and a sizeable consumer base of its own; b) Dianping is headquartered in Shanghai, so more merchants there have joined the platform. Interested readers can dig deeper.
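The aggregation behind the TOP 20 chart is not shown in the post; a typical pandas sketch of how such a ranking is computed (the file and column names are made up):

import pandas as pd

# hypothetical review-level data: one row per review, with the restaurant's city
reviews = pd.read_csv('lobster_reviews.csv')   # assumed columns: city, shop_id, review_id

top20 = (reviews.groupby('city')['review_id']
                .count()
                .sort_values(ascending=False)
                .head(20))
print(top20)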

Having fixed the TOP 20 cities, let's first look at the average spend per person on crayfish in each of them:

[Figure: average spend per person, TOP 20 cities]

In this ranking the Yangtze-delta "free shipping zone" takes the leading positions, reflecting both its enthusiasm for crayfish and its spending power. Note also that Zhuzhou's average spend is close to half of Shanghai's, so if you get a chance to travel in Hunan, consider stopping in Zhuzhou for excellent, inexpensive crayfish.

Next, the taste, environment and service scores of the TOP 20 cities:

[Figures: taste, environment and service scores for the TOP 20 cities]

We find that the service ranking and the environment ranking are identical; the two are strongly correlated, which matches common intuition. Across the three scores, the four northern cities Tianjin, Xi'an, Beijing and Qingdao all sit near the top, with Tianjin first in both service and environment.

Set against the national crayfish heat map below, this seems to run somewhat counter to popular perception.

[Figure: national crayfish heat map]

From this we can conclude that in regions where crayfish is hugely popular, people's expectations rise accordingly in every respect, whereas in regions with lower overall heat, reviewers are relatively lenient. We also see that Haikou ranks last on every indicator and has some adjusting to do.

Exploring the crayfish itself

Having looked at the cities, let's look at some fun facts about the crayfish itself, starting with flavours. We took the recommended dishes containing "crayfish" from every restaurant and, after word segmentation, obtained the TOP 20 flavours:

[Figure: TOP 20 crayfish flavours]

"Thirteen-spice", garlic and mala (numbing-spicy) occupy the top three spots, which in the author's experience basically matches most people's tastes. Salted-egg-yolk and plain-boiled in the TOP 20 are relatively unfamiliar to the author; readers who have tried them are welcome to share what they were like.

After the flavours, a look at the crayfish's best companions on the table:

[Figure: dishes most often ordered together with crayfish]

River snails, clams and edamame take the TOP 3; apparently when eating crayfish people like to order food they can enjoy without taking their gloves off, since pulling them on and off all the time is a chore.

A crayfish "persona"

Internet companies routinely run user-persona analyses; borrowing the concept, we drew a dedicated portrait of the crayfish. The two pictures below are the word cloud and the template image it was drawn on:

[Figures: the crayfish word cloud and the template image]

Part of the word-cloud code:

# imports assumed by this excerpt (not shown in the original):
#   from wordcloud import WordCloud, ImageColorGenerator
#   from scipy.misc import imread
#   from matplotlib import pyplot as plt

# parse the crayfish template image
back_color = imread('小龙虾.jpg')   # read the template image

# configuration
wc = WordCloud(background_color='white',   # background colour
               max_words=300,              # maximum number of words
               mask=back_color,            # draw the cloud in the shape of this image; width/height are ignored when mask is set
               max_font_size=100,          # largest font size displayed
               font_path="C:/Windows/Fonts/simhei.ttf",  # fixes garbled CJK glyphs; swap in another font from C:/Windows/Fonts/ if needed
               random_state=4,             # return a PIL colour for each word
               # width=2000,               # image width
               # height=1860               # image height
               )

# generate the word cloud from a frequency counter
wc.generate_from_frequencies(word_counts)

# generate colours based on the template image
image_colors = ImageColorGenerator(back_color)

# draw the word cloud
plt.figure()
plt.imshow(wc.recolor(color_func=image_colors))
plt.axis('off')
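The post does not show how word_counts is built; a typical sketch using jieba for word segmentation and collections.Counter (the dataframe and column name are made up):

import jieba
from collections import Counter

# hypothetical dataframe column holding the recommended dishes that contain "龙虾"
dishes = data['recommend_dish'].dropna().tolist()

words = []
for dish in dishes:
    # keep segments longer than one character to drop particles and stray characters
    words += [w for w in jieba.cut(dish) if len(w) > 1]

word_counts = Counter(words)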


"Special" (read: dubious) crayfish

To finish, here are a few special crayfish flavours we came across during word segmentation; perhaps the next viral crayfish dish is among them.

[Figures: a few unusual crayfish dishes]

Are any of these crayfish flavours to your liking?

Author: 徐麟 (Xu Lin), author of the Zhihu column of the same name, currently working at Vipshop's product technology centre in Shanghai; a Columbia statistics graduate doing data mining and analysis who enjoys playing with out-of-the-ordinary data in R and Python. Personal WeChat public account: 数据森麟 (ID: shujusenlin).

Note: this article was contributed by its author; the copyright belongs to them.



          The “Over-65 Vote” May Be “Funkier” Than National Expects.
We Didn't Die Before We Grew Old: It is sobering to realise that by 2020 roughly half of the Baby Boom Generation will be drawing a pension. The “Over-65 Vote” will no longer be composed overwhelmingly of what Colin James dubbed “The RSA Generation”. More and more of these older voters will cherish youthful memories of sex, drugs and rock-n-roll.

PICTURE THIS. It’s a just a few weeks before the 2020 general election and social media is smoking. A superb piece of digital fakery has the National Party leader, Simon Bridges, inhaling enthusiastically. Over a Pink Floydesque soundtrack, Bridges exhales an impressive cloud of marijuana smoke. “My party is opposed to legalising pot” he explains, grinning broadly and winking knowingly. “But, if the people of New Zealand vote yes to dope in the forthcoming referendum, then a new National Party government will honour their decision and end cannabis prohibition within its first 100 days.” The clip ends with a rather glassy-eyed Bridges flashing his viewers the peace sign. The video’s tag line flashes up on the screen: Simon says, VOTE YES – AND NATIONAL.

Now, the prospect of a “funky” National Party mobilising the “Head Vote” will no doubt  strike many readers as a most unlikely proposition. For a start, the staunchly conservative Mr Bridges would certainly not take kindly to being portrayed as some sort of peace, love and mungbeans hippie. Less certain, however, is whether his campaign team would be all that bothered by such a clever piece of guerrilla advertising. Not all fake news is bad news.

It is, similarly, important to realise that by 2020 roughly half of the Baby Boom Generation will be drawing a pension. The “Over-65 Vote” will no longer be composed overwhelmingly of what Colin James dubbed “The RSA Generation”.

More and more of these older voters will cherish youthful memories of sex, drugs and rock-n-roll.

On a darker note, their personal experience will have confirmed over and over again the brute reality that alcohol is capable of inflicting immeasurably more harm on families, friends and workmates than cannabis sativa.

Their children will point out the absurdity of preserving the market for increasingly deadly iterations of synthetic cannabis by prohibiting the cultivation and use of the real thing – a substance with no known fatalities to its credit.

The idea that the careers of their grandchildren may be jeopardised by engaging in what is, essentially, a harmless habit, will fill them with a mixture of exasperation and dread.

What’s more, as the Baby Boomers’ bodies begin to fail them and the aches and pains of old age make themselves known with ever-increasing intensity, the analgesic and stress-relieving qualities of cannabis will recommend themselves with ever-increasing force. Why should the law be interested in the consumption of a slice of hashish-infused chocolate-cake to relieve arthritis?

These are the considerations that National’s campaign strategists will be inviting Simon Bridges and his conservative colleagues to consider. Active Christian worship is now very much a minority sport. Likewise the misogyny and homophobia of those involuntarily celibate keyboard warriors who daily defile the Internet. The overwhelming majority of New Zealanders are men and women of good will and good humour. Those responsible for developing National’s election manifesto would do well to remember that.

Good will and good humour does not, however, signal soft-headedness. Sixty-five years and more on this earth has a habit of exposing the weaknesses of youthful propositions concerning human nature. Monty Python mercilessly satirised the notion that all individual failings could be laid at the door of “Society” by offering to “book them too”.

The explanation for the rock-solid character of National’s massive electoral support owes a great deal to older New Zealanders’ reluctant acceptance that many of the wounds which their less fortunate fellow citizens are expecting them to heal have almost certainly been self-inflicted. For the past forty years, doubt has been growing steadily in “Middle New Zealand” about the Welfare State’s capacity to improve the lives of either its “clients” or the society in which they live.

Bill English recognised this growing doubt and attempted to address it by means of his “Social Investment” initiatives. Much more work on these is required before they are ready to be rolled-out as the replacement for the First Labour Government’s “Social Security” model. There is, however, the whiff of the future about English’s ideas, so, if Simon Bridges is as wise as he is ambitious, then social investment will be the project into which he and his caucus colleagues hurl themselves in the run-up to 2020.

Bridges simple message to Middle New Zealand could be: “National’s not hard-hearted – just clear-headed”.

Except, of course, when it’s stoned.

This essay was originally published in The Otago Daily Times and The Greymouth Star of Friday, 3 August 2018.

          With Linux's dig or nslookup, an FQDN that starts with a hyphen is treated as an option...
With Linux's dig and nslookup commands, entering an FQDN whose first character is a hyphen causes it to be interpreted as an option. For example, "-ddsdfa23a2sol.ddns.com" (a made-up name). Because of the leading hyphen it is recognised as an option, and the tool just tells you to look at the help for usage. How can I work around this? I tried wrapping the domain name in double quotes and prefixing it with "--", but neither has any effect with dig or nslookup; it is still parsed as an option. If all I need is name resolution, then perhaps in Python...
          Adding MAVLINK_DATA_STREAM bricks px4 on boot-up

So the critical thing is that the GCS know about the message.

In SITL, for example, MAVProxy needs to know about your new message type.
The way I typically accomplish this is to reinstall pymavlink (which
MAVProxy is based on):

pbarker@bluebottle:~/rc/pymavlink(master)$
MDEF=$HOME/rc/ardupilot/modules/mavlink/message_definitions python
setup.py build install --user --force
running build
running build_py
Using message definitions from
/home/pbarker/rc/ardupilot/modules/mavlink/message_definitions
Building
/home/pbarker/rc/ardupilot/modules/mavlink/message_definitions/v1.0/ualberta.xml
for protocol 1.0
.
.

After that, your tlog should contain your new message.


          I need a Python programmer
I need a python programmer to code a simple python (Budget: $10 - $30 USD, Jobs: Python)
          Lead Data Scientist (Python, R, SQL, Redshift)
Anson McCade - The City, London - Lead Data Scientist (Python, R, SQL, Redshift) London Currently recruiting for a highly publicised start up Fin Tech organisation... is looking for experienced Lead Data Scientists (Python, R, SQL, Redshift) to help with essential projects aiding to the growth of both the product and the business...
          Data Scientist (Python, R, SQL, Redshift)
Anson McCade - The City, London - Data Scientist (Python, R, SQL, Redshift) London Currently recruiting for a highly publicised start up Fin Tech organisation... is looking for experienced Data Scientists (Python, R, SQL, Redshift) to help with essential projects aiding to the growth of both the product and the business...
          How to use the hottest Google image downloader on GitHub: google-images-download

Everyone runs into the need to batch-download images sooner or later. As a lecturer in information management and information systems, I often need large numbers of images to train models, so the need is even stronger for me.

Usually, when searching for images we use Google Images. Type in "Walle", for instance, and you get the protagonist of Pixar's animated film WALL·E:

[Screenshot: Google Images results for "Walle"]

The search results are exactly what we want. Google not only gives us the images, the labelling is effectively already done; all that remains is to download them in bulk.

But the bulk-download step is where many people get stuck. Like you, my first thought was to look for a ready-made Chrome extension that could save images in bulk, only to find that none of them was ideal in both accuracy and usability.

Just when I was about to give up, I found a great GitHub project called google-images-download.

[Screenshot: the google-images-download repository page]

In only five months since its release it has already passed 2,000 stars, so it is clearly very popular.

google-images-download is a Python script, but using it requires no coding knowledge: a single command performs the Google image search and the batch download. It is also cross-platform, supporting Linux, Windows and macOS. A real blessing for the lazy.

Installation

Installing google-images-download is easy. On macOS, for example, just run the following in a terminal:

pip install google_images_download

That's it for installation. Of course, this assumes Python is already installed on your system. If it isn't, or you are not comfortable with the terminal, see my article "How to install the Anaconda Python environment (video tutorial)" to learn how to download and install Anaconda and work on the command line.

Running it

What shall we try downloading this time? I remembered that the actress 谭卓 (Tan Zhuo) gave a good performance in Dying to Survive (《我不是药神》), except that at first I mistook her for 郝蕾 (Hao Lei). Let's try downloading some pictures of Tan Zhuo.

[Figure: poster of the film Dying to Survive]

First, specify where the images should be downloaded; I chose the Downloads folder:

cd ~/Downloads

Then run the following in the terminal:

googleimagesdownload -k "谭卓" -l 20

In this command:

  • googleimagesdownload is the command name; it tells the system which command to run.
  • -k stands for "keyword", so it is followed by the keyword, here "谭卓"; note that the keyword must be wrapped in straight double quotes.
  • -l stands for "limit" and sets the number of images to download; here we download 20.

Here is what running it looks like:

[Screenshot: command output]

The final Error: 1 means that one error occurred during downloading, but the program still completed the download process normally. Let's look at the result:

[Screenshot: the downloaded files]

The downloaded images are all stored under ~/Downloads/downloads/谭卓; google-images-download thoughtfully creates a subdirectory for us.

Basically, this one command covers the ordinary batch-download use case.

Going further

Sometimes, however, we need far more than 20 images. For instance, after staring at photos for a while I still could not quite tell Hao Lei and Tan Zhuo apart, so to settle it once and for all I decided to download 200 photos of Hao Lei.

Following the earlier command, run:

googleimagesdownload -k "郝蕾" -l 200

Then you will see an error:

[Screenshot: the error message]

Don't panic. Read the error message carefully and note the keyword that appears in it: chromedriver. What is that?

Go back to the google-images-download GitHub page and search for chromedriver; you will immediately find the following:

[Screenshot: the README section about chromedriver]

It turns out that when more than 100 images are requested, the program has to use Selenium and chromedriver. It doesn't matter if you don't know what they are; we just need to install them.

Selenium was installed together with google-images-download, so now we only need to download chromedriver; the download link is here.

When downloading, pick the version that matches your operating system:

[Screenshot: the chromedriver download page]

I chose the macOS version. The archive contains a single file; unpack it and put it under the ~/Downloads directory.

Now we can batch-download more than 100 images. Run the following command:

googleimagesdownload -k "郝蕾" -l 200 --chromedriver="./chromedriver"

Notice the extra --chromedriver parameter; it tells google-images-download where the unpacked chromedriver lives. This time the machine dutifully downloads Hao Lei's photos for us:

[Screenshot: the download in progress]

After it finished there were some errors and a few images failed to download, but that barely affects the overall result. To be safe, set the download count a little higher than you need and give yourself a margin.

Finally, open the download directory ~/Downloads/downloads/郝蕾 and take a look:

[Screenshot: the downloaded photos]

Now, can you tell the two of them apart?

More parameters

A project rated this highly naturally has more than the two or three options shown above. If you are interested, use this link to see the full list of available parameters.

I counted 39 in total. For reasons of space I won't list them all, but a few distinctive ones are worth pointing out, because you may well find them useful in practice.

  • --format: choose the image format, e.g. jpg, png, gif or svg;
  • --usage_rights: choose the image licence, e.g. labeled-for-nocommercial-reuse. If you are building an image library for your own publications, this option keeps you clear of copyright traps and the hefty bills that come with them;
  • --size: choose the image size. If you need high-resolution images, >10MP downloads only images with more than 10 megapixels;
  • --type: choose the image type, e.g. photo for photographs only, or animated for animation images;
  • --time: restrict results by when the image was indexed, e.g. past-7-days for the last week;
  • --specific_site: restrict results to a given website domain;

Finally there is --safe_search, which turns on safe search so that nothing unsuitable shows up in the results.
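Besides the command line, the project also exposes a Python interface, which is handy when the download is part of a larger script. A sketch based on my recollection of the project's README (double-check the argument names against the current documentation):

from google_images_download import google_images_download

response = google_images_download.googleimagesdownload()

arguments = {
    "keywords": "谭卓,郝蕾",   # comma-separated search terms
    "limit": 20,               # per keyword; above 100 you still need chromedriver
    "print_urls": True,        # echo each image URL as it is fetched
}
paths = response.download(arguments)   # returns the local paths of the saved files
print(paths)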


          PYTHON APPLICATIONS DEVELOPER - Givex - Toronto, ON
We are seeking technically oriented application developers who are passionate about coding and relentless in the pursuit of excellence. Daily responsibilities...
From Givex - Fri, 03 Aug 2018 07:39:22 GMT - View all Toronto, ON jobs
          WEB PROGRAMMER ANALYST II
PA-Philadelphia, Title : Web Programmer Analyst(Python) Duration : 6 Months Contract Location : Philadelphia, PA Job Description: Strong understanding of Python Development (2 - 5 years experience is a MUST) and the Django Framework. Experience with PHP development or a similar style such as Microsoft ASP. In-depth understanding of Relational Database Management software, preferably PostgreSQL, MySQL, and MS SQL S
          mmotm 2018-08-09-20-10 uploaded
akpm@linux-fo ... writes: (Summary) This mmotm tree contains the following patches against 4.18-rc8: (patches marked "*" will be included in linux-next) origin.patch i-need-old-gcc.patch * maintainers-gdb-update-e-mail-address.patch * lib-ubsan-remove-null-pointer-checks.patch * mm-bugfix-check-return-value-of-ioremap_prot.patch * zram-remove-bd_cap_synchronous_io-with-writeback-feature.patch * zram-remove-bd_cap_synchronous_io-with-writeback-feature-v2.patch * zram-remove-bd_cap_synchronous_io-with-writeback-feature-v2-checkpatch-fixes.patch * arm-arch-arm-include-asm-pageh-needs-personalityh.patch * dax-remove-vm_mixedmap-for-fsdax-and-device-dax.patch * prctl-add-pr_et_pdeathsig_proc.patch * firewire-use-64-bit-time_t-based-interfaces.patch * ufs-use-ktime_get_real_seconds-for-sb-and-cg-timestamps.patch * ntfs-use-timespec64-directly-for-timestamp-conversion.patch * hpfs-extend-gmt_to_local-conversion-to-64-bit-times.patch * spdxcheck-work-with-current-head-licenses-directory.patch * scripts-add-python-3-compatibility-to-spdxcheckpy.
          Full Stack Engineer - Workbridge Associates - San Clara, MB
30% Python/ Java. Strong expertise in Python, Java. The ultimate goal is to give life insurance coverage for Health-Conscious groups such as training athletes,... $120,000 - $160,000 a year
From Workbridge Associates - Thu, 02 Aug 2018 01:23:52 GMT - View all San Clara, MB jobs
          Software Developer - Varian Medical Systems - Winnipeg, MB
Java, JavaScript/TypeScript, Angular, Python. Specialization in Java or other open source Web Application stack....
From Varian Medical Systems - Fri, 03 Aug 2018 06:07:59 GMT - View all Winnipeg, MB jobs
          PHP developer
Pienza meeti g de colombia sas - Bogotá DC - Develop applications with advanced knowledge of: PHP, MySQL, HTML, JavaScript, AJAX, XML. Administer systems/servers: Linux/Windows/Apache/IIS. Develop web applications on platforms such as PHP, Python, WordPress and Moodle that use responsive web design. With...
          8/10/2018: /KINO: TERRY GILLIAM: THE JINXED MAN IN THE HAWAIIAN SHIRT

UNDER OUR PATRONAGE Terry Gilliam and the word "ordinary" have never gone together. Monty Python co-founder John Cleese once said of him that even against the rest of the troupe he seems exceptionally disturbed (Gilliam claims he must have been talking about himself)....
          Senior Data Analyst - William E. Wecker Associates, Inc. - Jackson, WY
Experience in data analysis and strong computer skills (we use SAS, Stata, R and S-Plus, Python, Perl, Mathematica, and other scientific packages, and standard...
From William E. Wecker Associates, Inc. - Sat, 23 Jun 2018 06:13:20 GMT - View all Jackson, WY jobs
          Hiring Solution Architect with Python & Machine Learning - RR Donnelley India Outsource Private Limited - Chennai, Tamil Nadu
Graduates / Post graduates in Mathematics/Statistics/Data science / Actuarial science or any other degree that is considered suitable to perform the required...
From Monster IN - Tue, 07 Aug 2018 14:34:59 GMT - View all Chennai, Tamil Nadu jobs
          awscli (1.15.75)
The AWS CLI is an open source tool built on top of the AWS SDK for Python (Boto) that provides commands for interacting with AWS services.

          mrworkserver added to PyPI
A python work server written in C
          netpyne-py3 added to PyPI
A Python package to develop, simulate and analyse biological neuronal networks in NEURON.
          azure-mgmt-kusto added to PyPI
Microsoft Azure Kusto Management Client Library for Python
          dd-aliyun-python-sdk-cdn added to PyPI
The cdn module of Aliyun Python sdk.
          laylib added to PyPI
A 2-D game engine for Python
          pymanifold added to PyPI
Python-Manifold is a Python implementation of Derek Rayside's Manifold microfluidic simulation tool
          How Do You Start In The Tech Sector?

How Do You Start In The Tech Sector?
Career

August 9th, 2018

The tech sector, if you know what you're doing, is easier than most fields to get started in. However, you do have to know what you're doing. In this post, I'm going to step through a series of ways to get started, in case you're not sure.

Sounds easy, right? Well, nothing worthwhile's easy. Now, to be fair, I don't mean " if you know what you're doing " in any patronising or condescending way.

What I mean is that, unlike say being a GP , dentist , civil engineer , corporate lawyer , Queen's Council (QC) , etc., you don't need to have years of formal training.

What’s more, you don’t need to be registered with an industry group/board before you're allowed to work. These can include the Institute of Chartered Accountants , the Queensland Law Society , or the Queensland Bar Association .

In IT, however, most people whom I've spoken to over the years care far more for what you can do, rather than what a piece of paper says you could do.

Let's Say You Want to Write Code
How Do You Start In The Tech Sector?

If you want to write code, then start by learning the basics of a software development language. I'm not going to get into a flame war about one language or another, whether one's better than another or not.

That's for people with too much time on their hands, and for people who are too emotionally invested in their language(s) of choice ― or dare I say, just a bit insecure.

There are a host of languages to choose from, readily available on the three major operating systems (Linux, macOS, and Windows). Some of the most common, where you'll find the most help and documentation, are PHP, Perl, C/C++, Java, Go, Ruby, Python, Haskell, and Lisp. Grab yourself an editor, or an IDE, learn it inside out, and get started learning to write code.

I've linked to a host of excellent online resources for each at the end of the article.

For my part, I prefer any language borne out of C/C++. I've written code in Visual Basic and Cobol and didn't come away from either experience positively.

Once you've learned the basics, start contributing to an open source project! You don't need to be overly ambitious, so the project doesn't need to be a big one.

It could be a small library, such as VIM for Technical Writers that I maintain every so often. It could, however, be the Linux Kernel too, if that's your motivation and you are feeling particularly ambitious.

Regardless of what you choose, by contributing to these projects you'll learn far faster and better than you likely could in any other way. Why?

Because you're working on real projects and have the opportunity to be mentored by people who have years of hands-on experience. You'll get practical, guided experience, the kind you'd likely take years to acquire on your own.

They'll help teach you good habits, best practices, patterns, techniques, and so much more; things you'd likely take ages to hear about, let alone learn.

What's more, you'll become part of a living, breathing community where ― hopefully ― you're encouraged to grow and appreciate the responsibilities and requirements of what it takes to ship software.

But I'd Rather Be a Systems Administrator?
How Do You Start In The Tech Sector?

The same approach can be broadly applied. Here’s my suggestion. Install a copy of Linux , BSD , or Microsoft Windows on an old PC or laptop. As you're installing it, have a look around at the tools that are available for it

hint:open source provides a staggering amount of choice. #justsayin .

Get to know how it's administered, whether via GUI tools (and the Power Shell) on Windows, or via the various daemons and their configuration files and command-line tools on Linux and BSD.

Server administration's a pretty broad topic, so it's hard ― if not downright impossible ― to suggest a specific set of tools to learn. I'm encouraging you at this point to get a broad understanding.

Later, if you're keen, you can specialise in a particular area. However, for now, get a broad understanding of:

  • Networking
  • User and Group Management
  • Installation Options and Tooling
  • Service/Daemon Configuration
  • Disk Management
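
If you want to poke at a few of those areas from code as you learn them, a short script makes the exploration concrete. This is only an illustrative sketch, assuming standard-library Python on a Linux- or BSD-style machine:

# sysadmin_peek.py - a quick look at disk usage, the current user, and network identity.
# Illustrative only; run it on the machine you're learning to administer.
import getpass
import shutil
import socket

def main():
    # Disk management: how full is the root filesystem?
    usage = shutil.disk_usage("/")
    print(f"Root filesystem: {usage.used / usage.total:.0%} used "
          f"({usage.free // 2**30} GiB free)")

    # User management: which account is this running as?
    print(f"Current user: {getpass.getuser()}")

    # Networking: what does this host call itself, and does the name resolve?
    hostname = socket.gethostname()
    try:
        address = socket.gethostbyname(hostname)
    except socket.gaierror:
        address = "unresolved"
    print(f"Hostname: {hostname} ({address})")

if __name__ == "__main__":
    main()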

Whether you're on Linux, BSD, or Windows, I've linked to a host of resources at the bottom of the article to help get you started.

Now that you've learned the fundamentals, do something where people can critique you and hold you accountable, such as hosting a website of your own through a provider such as DigitalOcean or Linode.

The web server you use, whether Apache, NGINX, Lighttpd, or IIS, doesn't matter. Just use one that works well on your OS of choice.

Once you've got it up and running, start building up the day-to-day tasks required to keep it running nicely. Once you've grown some confidence, move on to learning how to improve the site's security, performance, and deployment process.

This can include:

  • Optimising the web server, filesystem, and operating system configuration settings for maximum throughput
  • Setting up an intrusion detection system (IDS)
  • Dockerising your site
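
One of those day-to-day tasks is simply knowing that the site is up and responding quickly, ideally before your visitors notice that it isn't. As a purely illustrative sketch (the URL is a placeholder, and cron or a systemd timer would be one way to schedule it), a small Python check could look like this:

# site_check.py - request a site over HTTP and report status plus response time.
# Illustrative sketch; swap in your own URL and alerting of choice.
import sys
import time
from urllib.request import urlopen
from urllib.error import URLError

SITE = "https://www.example.com/"  # placeholder: your own site goes here
TIMEOUT_SECONDS = 10

def check(url):
    started = time.monotonic()
    try:
        with urlopen(url, timeout=TIMEOUT_SECONDS) as response:
            elapsed = time.monotonic() - started
            print(f"{url} -> HTTP {response.status} in {elapsed:.2f}s")
            return 0
    except URLError as error:
        print(f"{url} -> FAILED: {error.reason}")
        return 1

if __name__ == "__main__":
    sys.exit(check(SITE))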

To Go Open Source or Microsoft?

By now you've got a pretty good set of knowledge. However, stop for just a moment, because it's time to figure out whether you're going to specialise in open source (Linux/UNIX/BSD) or focus on Microsoft's tools and technologies.

You can become knowledgeable in both, and most developers and systems administrators that I know do have a broad range of knowledge in both. However, I'd suggest that it's easier to build your knowledge in one rather than attempting to learn both.

Depending on the operating system you've been using up until now, it's likely that you've already made your choice. However, it's good to stop and deliberately think about it.

What Do You Do Next?

Now, let's get back to building your skills. What do you do next? If you want to be a sys admin, start looking around for opportunities to help others with their hosting needs.

Don't go all in ― yet. There's no need to rush. Keep stepping up gradually, building your confidence and skills.

If you're not sure of who might need help, have a think about:

  • What clubs are you involved in?
  • Do you have friends with small businesses that might need support?
  • Do you know others who want to learn what you have and need a mentor?

I'm sure that as you start thinking, you'll be able to uncover other ideas and possibilities. Now you have to get out of your comfort zone, contact people, and ask them if they need help.

Worst case scenario, they say no. Whatever! Keep going until you find someone who does want help and is willing to take you on.

Regardless of the path that you take, you should feel pretty confident in your foundational skills, because they're based on practical experience.

So, it's time to push further. To do that, I'd suggest contacting a University, a bank, or an insurance provider, if you want to cut your teeth on big installations.

Sure, many other places have big server installations. However, these three are the first that come to mind.

If you are focused on software development, here are a few suggestions:

  • Contact software development companies (avoid "digital agencies") and see if they’re hiring.
  • Talk to your local chamber of commerce and industry and let them know you’re around and what you do.
  • Find the local business networking groups and go to the networking breakfasts.
  • Get involved in your local user groups (this goes for sys admins too, btw).
  • Start a user group if there isn’t one for what you want to focus on.

In Conclusion

I could go on and on. The key takeaway I'm trying to leave you with is that, if you have practical experience, you'll increase the likelihood of gaining employment.

Any employer I've had of any worth values hands-on experience over a piece of paper any day.

Don't get me wrong; there's nothing wrong with degrees or industry certifications. And for complete transparency:

  • I have a Bachelor of Information Technology
  • I'm LPIC-1 certified
  • I’m a Zend (PHP 5) Engineer

However, university qualifications and industry certifications should only reinforce what you already know, and not be something that is used to get your start.

With all that said, I want to encourage you to go down the Open Source path, not Microsoft. But I’m biased, as I’ve been using Linux since 1999.

Regardless, have a chew on all that, and let me know what you think in the comments. I hope that, if you’re keen to get into IT, that this helps you do so, and clears up one or more questions and doubts that you may have.

Further Reading

Open Source
          C++ Python Software Developers (PySide2 Qt Mission Systems)      Cache   Translate Page   Web Page Cache   
VA-Dulles, C++ Python Software Developers (PySide2 Qt Mission Systems) Aerospace Dulles, VA (6 mos open contract) *OOP C++ Python PySide2 GUI Graphic User Interface Qt JavaScript JQuery Angular HTML5 CSS3 Node.js REST RESTful APIs Make CMake Visual Studio OpenSceneGraph SQL DoD Aerospace Spacecraft Ground Systems Mission Systems UAVs* Please send your updated resume to: Bob.Russ@InSourceTechnical.com ASAP befo
          ERP on Odoo platform      Cache   Translate Page   Web Page Cache   
We are importing containers of beef from overseas to China. And distribute to restaurants and retailers. We need to build ERP on Odoo platform. Control orders, stocks, finance and other issues. (Budget: $1500 - $3000 USD, Jobs: ERP, PHP, Python, SAP, Software Architecture)
          STEM Scratchers      Cache   Translate Page   Web Page Cache   
looking for Scratchers those interested in teaching kids (Budget: $15 - $25 USD, Jobs: Coding, Photoshop, Programming, Python, Website Design)
          Offer - SAP TM TRAINING ONLINE INDIA - USA      Cache   Translate Page   Web Page Cache   
SAP TM TRAINING ONLINE INDIASOFTNSOL is a Global Interactive Learning company started by proven industry experts with an aim to provide Quality Training in the latest IT Technologies. SOFTNSOL offers SAP TM Online Training. Our trainers are highly talented and have Excellent Teaching skills. They are well experienced trainers in their relative field. Online training is your one stop & Best solution to learn SAP TM Online Training at your home with flexible Timings.We offer SAP TM Online Trainings conducted on Normal training and fast track training classes.SAP TM ONLINE TRAINING We offer you :1. Interactive Learning at Learners convenience time2. Industry Savvy Trainers3. Learn Right from Your Place4. Advanced Course Curriculum 5. 24/7 system access6. Two Months Server Access along with the training 7. Support after Training8. Certification Guidance We have a third coming online batch on SAP TM Online Training.We also provide online trainings on SAP ABAP,SAP WebDynpro ABAP,SAP ABAP ON HANA,SAP Workflow,SAP HR ABAP,SAP OO ABAP,SAP BOBI, SAP BW,SAP BODS,SAP HANA,SAP HANA Admin, SAP S4HANA, SAP BW ON HANA, SAP S4HANA,SAP S4HANA Simple Finance,SAP S4HANA Simple Logistics,SAP ABAP on S4HANA,SAP Success Factors,SAP Hybris,SAP FIORI,SAP UI5,SAP Basis,SAP BPC,SAP Security with GRC,SAP PI,SAP C4C,SAP CRM Technical,SAP FICO,SAP SD,SAP MM,SAP CRM Functional,SAP HR,SAP WM,SAP EWM,SAP EWM on HANA,SAP APO,SAP SNC,SAP TM,SAP GTS,SAP SRM,SAP Vistex,SAP MDG,SAP PP,SAP PM,SAP QM,SAP PS,SAP IS Utilities,SAP IS Oil and Gas,SAP EHS,SAP Ariba,SAP CPM,SAP IBP,SAP C4C,SAP PLM,SAP IDM,SAP PMR,SAP Hybris,SAP PPM,SAP RAR,SAP MDG,SAP Funds Management,SAP TRM,SAP MII,SAP ATTP,SAP GST,SAP TRM,SAP FSCM,Oracle,Oracle Apps SCM,Oracle DBA,Oracle RAC DBA,Oracle Exadata,Oracle HFM,Informatica,Testing Tools,MSBI,Hadoop,devops,Data Science,AWS Admin,Python, and Salesforce .Experience the Quality of our Online Training. For Free Demo Please ContactSOFTNSOL : India: +91 9573428933USA : +1 929-268-1172WhatsApp: +91 9573428933Skype id : softnsoltrainingsEmail id: info@softnsol.comWebsite : http://softnsol.com/.
          Java Internship Program - Evolet Technologies - R. T. Nagar, Bengaluru, Karnataka      Cache   Translate Page   Web Page Cache   
Learn Digital / PHP / Python / Dot Net / Android / Analytics. Excited for Summer Internship?... ₹5,000 - ₹5,500 a month
From Indeed - Tue, 17 Jul 2018 11:57:45 GMT - View all R. T. Nagar, Bengaluru, Karnataka jobs
          Linux and DevOps engineer - Cricbuzz - Bengaluru, Karnataka      Cache   Translate Page   Web Page Cache   
Knowledge of scripting languages like Perl, Bash, Python etc. Alongside its extremely popular website and mobile site, Cricbuzz's mobile applications for...
From Cricbuzz - Thu, 09 Aug 2018 12:20:21 GMT - View all Bengaluru, Karnataka jobs
          Développeur Python - Alteo Recrutement Informatique - Montréal, QC      Cache   Translate Page   Web Page Cache   
Alteo is looking for a Python Developer for a permanent position based in Montréal (downtown). DEC / BAC in computer science, software engineering or...
From Alteo Recrutement Informatique - Tue, 07 Aug 2018 00:48:42 GMT - View all Montréal, QC jobs
          ciclo for – python (parabola concava hacia abajo)      Cache   Translate Page   Web Page Cache   
The problem posed consists of creating a program that generates consecutive values of "X" between 2 and 8 on a Cartesian plane and then plots points on the graph of a downward-concave parabolic equation given as [ Y = -(X^2) + 10X – 20 ]. The following Python program makes use … Continue reading ciclo for – python (parabola concava hacia abajo)
          How to search text using Regular Expression in Python      Cache   Translate Page   Web Page Cache   

Regular expressions are one of the important topics in any programming language. Using regular expressions, we can create patterns with which we can search for and capture text from data. This is the first part of my regular expression section; check out the full course here: https://bit.ly/2v7lT74 (Bengali version)
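
As a tiny illustration of that idea (my own example, not something taken from the linked course):

# Search for a pattern and capture pieces of the matched text.
import re

log_line = "2018-08-09 10:15:32 ERROR disk /dev/sda1 is 97% full"
match = re.search(r"ERROR disk (?P<device>\S+) is (?P<usage>\d+)% full", log_line)
if match:
    print(match.group("device"), match.group("usage"))  # /dev/sda1 97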

The post How to search text using Regular Expression in Python appeared first on Thinkdiff.net.


          Comment on من الصفر الى الاحتراف: Python دورة مكثفة للغة برمجة البايثون by Abo Hams      Cache   Translate Page   Web Page Cache   
I need this coupon.
          Programmer/Analyst (Research Data Management) - University of Saskatchewan - Saskatoon, SK      Cache   Translate Page   Web Page Cache   
Java, JavaScript, Python, PHP, HTML, YAML, CSS, Git, Angular, Ansible, Grunt, Jenkins, JIRA, Confluence, Docker, Django.... $62,850 - $98,205 a year
From University of Saskatchewan - Mon, 30 Jul 2018 18:22:24 GMT - View all Saskatoon, SK jobs
          Senior Embedded Software Developer - SED Systems - Saskatoon, SK      Cache   Translate Page   Web Page Cache   
Familiarity with Matlab, Python, JavaScript, Java, HTML5; The ability to obtain a Secret security clearance and meet the eligibility requirements outlined in...
From SED Systems - Sat, 30 Jun 2018 07:14:09 GMT - View all Saskatoon, SK jobs
          Test Leader - hexatier - Leader, SK      Cache   Translate Page   Web Page Cache   
Development experience with Python and Java. Experience working with cross-functional teams including engineering, support and senior management is required....
From hexatier - Fri, 20 Jul 2018 09:43:27 GMT - View all Leader, SK jobs
          Python Developer - MJDP Resources, LLC - Radnor, PA      Cache   Translate Page   Web Page Cache   
Assemble large, complex data sets that meet business requirements and power machine learning algorithms. EC2, Lambda, ECS, S3.... $30 - $40 an hour
From Indeed - Wed, 13 Jun 2018 13:41:07 GMT - View all Radnor, PA jobs
          Data Engineer - PYTHON - MJDP Resources, LLC - Devon, PA      Cache   Translate Page   Web Page Cache   
Assemble large, complex data sets that meet business requirements and power machine learning algorithms. EC2, Lambda, ECS, S3.... $100,000 - $120,000 a year
From Indeed - Tue, 31 Jul 2018 14:44:04 GMT - View all Devon, PA jobs
          Around The World In One Hour      Cache   Translate Page   Web Page Cache   
Video: Around The World In One Hour
Watch This Video!
Studio: Global Video Pro
WORLD MONTAGE, One minute of various images from around the world. DEAD SEA, ISRAEL, Float on the Dead Sea. Eight times more salt than the ocean. Visitors from worldwide come to seek wellness from the water and healing black mud.
SNAKE CHARMER OF MALAYSIA, A dying breed, these snake charmers risk their lives to entertain audiences. frequently bitten by cobras and pit vipers, they still play a dangerous game! Have you had a 22 foot long python coiled around your body lately???
DIVE PHILIPPINES, the Philippines is known for its spectacular dive sites. Explore the beautiful undersea world around Cebu Island, teeming with a vast array of exotic sea creatures, caves and cliffs.... LAS VEGAS PREVIEW, tour of Las Vegas, aerials, casinos and Hoover Dam, etc.
HAWAII KAYAK ADVENTURE, paddle through the Big Islands ten tunnels high in the Kohala Mountains, by kayak. Some tunnels one mile long. The ultimate eco-tourism adventure!
ELEPHANT SHOW, THAILAND, see elephants perform amazing feats in Phuket, Thailand. Dancing, playing music, tricks, headstands, playing soccer and carrying boy with his head in the elephant's mouth. Daring stuff!

          Full-stack python developer needed.      Cache   Translate Page   Web Page Cache   
I need a assistant senior python developer and he should be qualified following techs.. 1. Python - 5+ years 2. Django - 5+ years 3. Scrapy, Selenium 4. Celery and Flower 5. Ubuntu, Debian and Debian 6... (Budget: $2 - $8 USD, Jobs: Amazon Web Services, Django, Elasticsearch, NoSQL Couch & Mongo, Python)
          Database Administrator Junior (W2, Mountain View) - cPrime, Inc. - Mountain View, CA      Cache   Translate Page   Web Page Cache   
Mountain View (W2) Top 2-3 skills you look for when reviewing resume- Oracle experience (3-5 YOE) AWS/RWS is a plus Demonstrated Python experience 3-5 YOE...
From Dice - Sat, 21 Jul 2018 02:03:31 GMT - View all Mountain View, CA jobs
          Coding Raspberry Pi & Python Learn Coding Easily      Cache   Translate Page   Web Page Cache   



Coding: Raspberry Pi & Python: Learn Coding Easily Kindle Edition
by Kevin Spencer

English | 2018 | ISBN: 172047558X | 112 Pages | EPUB | 365 KB

          Distributed Tasks Demystified with Celery, SQS & Python      Cache   Translate Page   Web Page Cache   
Distributed Tasks Demystified with Celery, SQS & Python

Distributed Tasks Demystified with Celery, SQS & Python
MP4 | Video: AVC 1280x720 | Audio: AAC 44KHz 2ch | Duration: 4.5 Hours | Lec: 38 | 1.29 GB
Genre: eLearning | Language: English

          Python XML, JSON, and the Web      Cache   Translate Page   Web Page Cache   
Python XML, JSON, and the Web

Python: XML, JSON, and the Web
MP4 | Video: 720p | Duration: 1:38:54 | English | Subtitles: VTT | 278.4 MB

          Software Architecture With Python 1st Edition      Cache   Translate Page   Web Page Cache   
Anand Balachandran Pillai / Programming / 2017
          Developer - West, Inc. - Cheyenne, WY      Cache   Translate Page   Web Page Cache   
Leveraging .net framework, Java, Python, etc. C# is. Cheyenne or Laramie, WY....
From West, Inc. - Tue, 19 Jun 2018 10:23:47 GMT - View all Cheyenne, WY jobs
          IT Manager - Infrastructure - DISH Network - Cheyenne, WY      Cache   Translate Page   Web Page Cache   
Scripting experience in one or more languages (Python, Perl, Java, Shell). DISH is a Fortune 200 company with more than $15 billion in annual revenue that...
From DISH - Sun, 15 Jul 2018 05:30:30 GMT - View all Cheyenne, WY jobs
          Jr-Mid Level Software Engineer - IDEMIA - Morgantown, WV      Cache   Translate Page   Web Page Cache   
Knowledge or interest in multiple technology domains and languages e.g. Java, JavaScript, Go, Python, etc. As a software engineer for IDEMIA NSS, the successful...
From IDEMIA - Sun, 05 Aug 2018 08:52:20 GMT - View all Morgantown, WV jobs
          Electrical Engineer - 4D Tech Solutions, Inc. - Morgantown, WV      Cache   Translate Page   Web Page Cache   
Proficient in C, C++, Python, Java, and/or shell script. 4D Tech Solutions is seeking a highly motivated entry-level software/electrical/test engineer to join... $85,000 - $105,000 a year
From Indeed - Tue, 26 Jun 2018 16:33:55 GMT - View all Morgantown, WV jobs
          Create custom Sphinx LaTeX output based on simple specified design.      Cache   Translate Page   Web Page Cache   
Need help creating the custom Sphinx LaTeX output that's generated with Sphinx's xelatex or latexpdf commands. The PDF format I want isn't complex, but I'm unfamiliar with LaTeX and how to configure it to apply styles to the Sphinx generated documentation... (Budget: $10 - $30 USD, Jobs: LaTeX, PDF, Python, Sphinx)
          Vagas De Programador Php (Júnior) – Aracaju – Se      Cache   Translate Page   Web Page Cache   
PHP Programmer (junior) – Aracaju – SE. Requirements: Must have knowledge of PHP programming and MySQL databases. The Laravel framework, Git, and Python are a plus. Contract (PJ) work arrangement.
          Hopper Disassembler 4.3.27- - Binary disassembler, decompiler, and debugger. (Demo)      Cache   Translate Page   Web Page Cache   

Hopper Disassembler is a binary disassembler, decompiler, and debugger for 32- and 64-bit executables. It will let you disassemble any binary you want, and provide you all the information about its content, like imported symbols, or the control flow graph! Hopper can retrieve procedural information about the disassembled code like the stack variables, and lets you name all the objects you want.

Hopper is able to transform the assembly language into a pseudo-code that is easier to understand! You can use its internal Python scripting engine to analyze binaries the way you want (this feature works only with Lion)! Starting from version 2.0, Hopper can even use GDB to debug programs!

And, last but not least, unlike all other tools of its kind, Hopper is perfectly integrated into the OS X environment.



Version 4.3.27-demo:
  • Implements most ARM and AArch64 relocation for ELF files,
  • Fixes a slowness during the analysis of file containing big BSS sections,
  • Fixes function pointer signature editor on Linux.


  • OS X 10.9 or later



          Recommend Some Schools Please      Cache   Translate Page   Web Page Cache   
Demographics:Turkish Male will start 9th grade in September looking to transfer to a boarding school. Will not as for financial aid. Grades:Not sure what an A qualifies as but my weighted gpa is: 94.5 8th grade 93 7th grade We also are not allowed to pick our courses here so I took the same courseload as anyone else. Test Scores: Will take the sat in October but the first practice test I took was in the low 1300s(I haven't learned some of the stuff covered in math so probably will improve) World Scholar’s Cup(International Debate,Writing,Bowl,Challenge(Examination of 6 subjects).(2 years) -Qualified for the Tournament of Champions 2 years in a row. -Regional(Istanbul+Izmir),Global(Hanoi+Barcelano),ToC(at Yale) Digital Art(4 years) -Logo Design Work -Lead a club for 2 years(teaching 5-6th graders photoshop and illustrator) Tennis -Playing since 1st grade. -Attended many tournaments with many awards -Currently playing as a hobby as injuries and a long time overseas didn't allow me to play for a long time. Hour of Code(4 years) -Participated in the hour of code activities in hour school(2 years as a teacher and 2 as a learner) -Taught the basis of algorithms to elementary students, scratch for middle schoolers, python for high schoolers,C# for teachers. Enable The Future(Robotel) (1 year) -Volunteered in 3d printing prosthetic arms and hands for children in need. -Worked with students from the high school with a similar project as middle schoolers. MUN(2 years) -4 regional competition as delegates -1 competition as chair. -They didn't do awards in the conferences I attended Marching Band(4 years, drum major for 1 year) -Played snare drums for a year Arduino Club (3 years) -Built arduino powered circuits most notably made the device for engine control for our school's rocket team. Thank you for reading! Any advice and recommendations are welcome!
          (USA-NY-Rochester) Analyst/Programmer, RDIA Group CTSI      Cache   Translate Page   Web Page Cache   
Opening Full Time 40 hours Grade 053 Clin & Trans Science Institute Schedule 8 AM-5 PM Responsibilities **Organization:** Research Data Integration and Analytics (RDIA) group, Informatics Division of the Clinical and Translational Science Institute, School of Medicine and Dentistry, University of Rochester. The RDIA group provides state-of-the-art integrative biomedical research data management and analytic services for research programs within the University of Rochester Medical Center. **Position Summary:** This position involves the development, evaluation, and testing of web-based database applications, data integrations, and programming to manage workflows for clinical and experimental data and document procedures used. This includes integration with clinical and specimen metadata for several large research centers, as well performing data quality assurance. Tasks also include formatting data for submission to public genomics data repositories. The candidate will work under general supervision of the RDIA Technical Lead, with some latitude for independent judgment, working with a team of other developers and data managers. **Job duties:** Collects and analyzes user requirements and system capabilities: * Meets with principal investigators, lab personnel, and statisticians involved in research studies to understand volume, frequency, and format of data to be collected, data workflows to be supported, and analytic tools to be implement. Adapts existing data management applications to meet project requirements: * Where possible, uses features of LabKey Server system to implement, design, test, and track data collection, workflows, and analysis.This includes use of the wikis, file content, lab assay modules, study, issues, and query modules of LabKey Server. Custom, project specific programming (Java, JavaScript, SQL): * Primarily focused on custom programming for assay and experimental data management and integration. * Builds, evaluates, tests, and maintains custom web-based data collection forms using JavaScript, HTML, LabKey APIs, and in-house-developed libraries. * Builds custom reports using LabKey Server SQL. * Builds, evaluates, tests, and maintains custom LabKey Server modules using Java, LabKey Server APIs, and in-house developed libraries. Data management tasks; Data QC/QA and scripting: * Builds custom external programs using Java, Python, LabKey Client APIs, and SQL to automate data cleaning, data transformation, data QC and reporting. Attends project meetings, meets with supervisors, makes recommendations and attends educational seminars **Requirements:** * Bachelor’s degree in Software Engineering, computer science or related field (Master’s preferred) and 2-3 years of related experience; or an equivalent combination of education and experience. * Experience programming in Java and developing J2EE web applications required. * Experience programming in web technologies (HTML, CSS and JavaScript) required. * Experience with command line Linux environment; basics include directory and file management, file permissions, rsync, grep, awk, sed. * Experience with scripting data transformation in R and/or Python a plus. * Experience using relational databases.Working knowledge of SQL required.DBA experience (especially PostgreSQL) a plus. * Experience with High Performance Computing environment (SGE, PBS, SLURM) a plus. * Experience working with next generation sequencing data and related data formats a plus. * Experience with LabKey Server software a plus. 
* Excellent attention to details, ability to work and communicate well with multi-disciplinary team is required. * Must be able to work on-site. The University of Rochester is committed to fostering, cultivating and preserving a culture of diversity and inclusion. The University believes that a diverse workforce and inclusive workplace culture enhances the performance of our organization and our ability to fulfill our important missions. The University is committed to fostering and supporting a workplace culture inclusive of people regardless of their race, ethnicity, national origin, gender, sexual orientation, socio-economic status, marital status, age, physical abilities, political affiliation, religious beliefs or any other non-merit fact, so that all employees feel included, equal valued and supported. *EOE Minorities/Females/Protected Veterans/Disabled* *Job Title:* Analyst/Programmer, RDIA Group CTSI *Location:* School of Medicine & Dentistry *Job ID:* 210190 *Regular/Temporary:* Regular *Full/Part Time:* Full-Time
          (USA-CA-San Jose) Python Engineer Software 2      Cache   Translate Page   Web Page Cache   
At Northrop Grumman, our work with **cutting-edge technology** is driven by something **human** : **the lives our technologies protects** . It's the value of innovation that makes a difference today and tomorrow. Here you'll have the opportunity to connect with coworkers in an environment that's uniquely caring, diverse, and respectful; where employees share experience, insights, perspectives and creative solutions through integrated product & cross-functional teams, and employee resource groups. Don't just build a career, build a life at Northrop Grumman. The Cyber Intelligence Mission Solutions team is seeking an Engineer Software 2 to join our team in San Jose as we kick off a new 10 year program to protect our nation's security. You will be using your Python skills to perform advanced data analytics on a newly architected platform. Hadoop, Spark, Storm, and other big data technologies will be used as the basic framework for the program's enterprise. **Roles and Responsibilities:** + Python development of new functionality and automation tools using Agile methodologies + Build new framework using Hadoop, Spark, Storm, and other big data technologies + Migrate legacy enterprise to new platform + Test and troubleshoot using Python and some Java on Linux + Function well as a team player with great communication skills **Basic Qualifications:** + Bachelor Degree in a STEM discipline (Science, Technology, Engineering or Math)from an accredited institutionwith 2+ years of relevant work experience, or Masters in a STEM discipline with 0+ years of experience + 1+ years of Python experience in a work setting + Active SCI clearance **Preferred Qualifications:** + Machine learning / AI / Deep Learning / Neural Networks + Familiar withHadoop, Spark or other Big Data technologies + Familiar with Agile Scrum methodology + Familiar withRally, GitHub, Jenkins, Selenium applications Northrop Grumman is committed to hiring and retaining a diverse workforce. We are proud to be an Equal Opportunity/Affirmative Action-Employer, making decisions without regard to race, color, religion, creed, sex, sexual orientation, gender identity, marital status, national origin, age, veteran status, disability, or any other protected class. For our complete EEO/AA statement, please visit www.northropgrumman.com/EEO . U.S. Citizenship is required for most positions.
          (USA-VA-Fairfax) Systems Administrator 2/3 TS/SCI      Cache   Translate Page   Web Page Cache   
Northrop Grumman Mission Systems is seeking an experienced Ground Controller (GC) near the Dulles Technology Corridor to support an off-site System and Operation & Maintenance program operating across multiple geographic locations 24/7. The GC is responsible for monitoring satellite network usage and capacity. Identifies and recommends solutions for smooth continuing operation of network as capacity needs change. Makes corrections to network configurations to ensure efficient operation. Monitors and documents traffic analysis. Monitors all hardware components to ensure terrestrial and satellite network communication facilities are operating at optimum levels The ideal candidate will have demonstrated fundamental analytical skills, experience applying engineering discipline best practices, experience assessing complex systems and associated data, system development life-cycles, and electronic system operations. The candidates must obtain and maintain certifications for the specific positions they will be operating. Training for the certifications will be provided via Computer Based Training (CBT), formal classroom, lecture and practical application to on the job training (OJT). The candidates will be responsible for (but not limited to) the following tasks: Perform technical planning, system integration, verification and validation, risk and opportunity assessments, and supportability and effectiveness analyses for total systems. Perform functional analysis, timeline analysis, risk assessments, system data trending & performance analysis, requirements allocation and interface definition studies. Devise modeling and measuring techniques; utilize mathematics, statistical methods, engineering methods, operational mathematics techniques (linear programming, game theory, probability theory, symbolic language, etc.), and other principles and laws of scientific disciplines. Support the identification, development and improvement of system operation and maintenance processes/products Supporting system problem/anomaly resolution; investigate and diagnose unexpected system signatures Identify and implement new or improved tools and processes to increase team efficiency Candidates should be prepared to work rotating shifts in support of 24/7 operations. A shift differential premium may be applied towards shift work. **Basic Qualifications:** Basic qualifications for a level 2 Bachelor's Degree with 2 years of experience or a Master's Degree with 0 years of experience Additional experience may be considered in lieu of a degree. Basic qualifications for a Level 3 are a Bachelor's Degree with 5 years of experience, Master's Degree with 3 years of experience, Ph.D. with 0 years of experience. Additional experience may be considered in lieu of a degree. Knowledge of system life cycle and system engineering best practices; and applying engineering disciplines to real world activities/systems. Experience with high-level coding languages and scripting/interpreted languages, such as C/C++, Visual Basic, Java, JavaScript, Python, Perl, Shell Script, bash, and awk. Familiarity with Windows, UNIX, and/or Linux operating systems. Ability to work at a within a multi-contractor (badgeless) team environment. U.S. Citizenship required. A current/active TS/SCI clearance is required. Candidate must be willing to successfully complete a polygraph. 
**Preferred Qualifications:** Bachelor's or Master's Degree in a Science, Technology, Engineering or Mathematics (STEM) discipline from an accredited university Familiarity with software development environments and ISR systems Proficiency with MS Office applications (e.g., Outlook, Word, Excel, PowerPoint, Project, Access, Visio). Familiarity with Nagios, ClearQuest, Oracle, MySQL, MATLAB and SharePoint, or related applications. Basic understanding of Trigonometry Experience determining whether or not the system performance meets mission requirements. COMSEC experience with managing, maintaining and re-keying of hardware. Strong interpersonal skills; a positive, helpful, and professional demeanor. Strong verbal/written communication skills, along with the ability to work well within a team environment An active TS/SCI with polygraph clearance Northrop Grumman is committed to hiring and retaining a diverse workforce. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, creed, sex, sexual orientation, gender identity, marital status, national origin, age, veteran status, disability, or any other protected class. For our complete EEO/AA and Pay Transparency statement, please visit www.northropgrumman.com/EEO . U.S. Citizenship is required for most positions.
          (USA-MD-Annapolis Junction) Cyber DevOps Support Lead      Cache   Translate Page   Web Page Cache   
How do cyber terrorists get past the industries best? They don't. There are too many of us fighting virtual threats, protecting enterprises and entire countries from large-scale attacks. From creating a citywide wireless network for our first responders, to protecting our nation from cyber threats, to building software-defined radios that change how our military communications, our Information Systems team helps life run smoothly and safely. If you are the sort of person who loves a challenge and likes to be involved in serious organizational and software change -- Then what is happening in the Cyber and Intelligence Mission Solutions Division is the place for you. This is without a doubt one of the most interesting and exciting times to join an organization like ours. The culture is one of excellence; team work, learning, delivered value and people. We are looking for people who love to learn and take initiative to really make this happen. Northrop Grumman Mission Systems is seeking a Cyber DevOps Support Lead to join our team of qualified, diverse individuals located at Annapolis Junction, MD. **Roles and Responsibilities:** This individual would spend 50% of their time on an operations floor assisting operators with using solutions developed on the program in order to complete Cyber operation successfully. With the other 50% of their time, the person will have a variety of duties including software development, and being the engineering lead for the site. This would entail being responsible for a lot of customer coordination and communication, helping with interviewing, managing the logistics of the space, etc. **Basic Qualifications:** + 9+ years' of experience as a software engineer + C Programming + Ability to read and write Python & Bash scripts + Background supporting Cyber programs + Familiarity and understanding of networking and basic terminology surrounding how network communications work. **Preferred Qualifications:** * Experience with VM's OR Containerization **Education:** + Bachelor's Degree in Computer Science or related discipline from an accredited college or university is required. **Security Clearance:** An active TS/SCIclearance is required. Northrop Grumman is committed to hiring and retaining a diverse workforce. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, creed, sex, sexual orientation, gender identity, marital status, national origin, age, veteran status, disability, or any other protected class. For our complete EEO/AA and Pay Transparency statement, please visit www.northropgrumman.com/EEO . U.S. Citizenship is required for most positions.
          Comment on PyTorch Releases Major Update, Now Officially Supports Windows by Charles Rose      Cache   Translate Page   Web Page Cache   
Hi, This is my problem: I cannot install PYTORCH in my SONY VAIO Windows 10 laptop. I get error messages, e.g., conda not found, conda not recognized, etc. I have installed ANACONDA and PYTHON 3.7 on my laptop without problems. QUESTION: What is the fool-proof method to install PYTORCH in my machine? Thanks. P.S.- If you wish, you can send me an email to cbrose@rogers.com
          ‘Medically unfit’ Vikas Gupta quits Khatron Ke Khiladi, returns to India after snakebite      Cache   Translate Page   Web Page Cache   

TV producer and former Bigg Boss contestant Vikas Gupta has quit the Khatron Ke Khiladi and returned to India after being bitten by a python. Confirming the news, Vikas told an entertainment portal that he couldn’t continue on the reality show since he was declared medically unfit after the snakebite incident. “Yes, I am back to India. […]

‘Medically unfit’ Vikas Gupta quits Khatron Ke Khiladi, returns to India after snakebite


          (USA) Senior Software Engineer - C++/Java      Cache   Translate Page   Web Page Cache   
Senior Software Engineer - C++/Java Job Summary Apply Now + Job:19042-MKAI + Location:US-MA-Natick + Department:Product Development Would you like to enable technical professionals and researchers to spend more time on research and development and less time writing code? In particular, help develop and advance MATLAB’s ability to interface with external languages and object systems such as Java, C++, and Python. Leverage your experience in C++, Java and system level programming to enable seamless integration between MATLAB and commonly used libraries. Responsibilities + Contributing to all activities of software development including requirements analysis, design, implementation, integration, and testing. + Partner with technical marketing and cross functional teams to gather user requirements and assess opportunities. + Develop new product features and improve existing features as part of a strong development team. + Conduct design reviews with peers and advisors. + Work closely with Quality Engineering to develop testing strategies for new features + Use build and debug tools in Windows, Linux and OS X. Minimum Qualifications + A bachelor's degree and 7 years of professional work experience (or a master's degree and 5 years of professional work experience, or a PhD degree) is required. + Experience with C++ + Experience with Java Native Interface + Experience with Java Technologies Additional Qualifications + Core Java + A firm grasp of the Software Development Life Cycle: iterative development, high-quality maintainable code, unit tests. Nice to have + Experience with MATLAB Why MathWorks? It’s the chance to collaborate with bright, passionate people. It’s contributing to software products that make a difference in the world. And it’s being part of a company with an incredible commitment to doing the right thing – for each individual, our customers, and the local community. MathWorks develops MATLAB and Simulink, the leading technical computing software used by engineers and scientists. The company employs 4000 people in 16 countries, with headquarters in Natick, Massachusetts, U.S.A. MathWorks is privately held and has been profitable every year since its founding in 1984. Apply Now Contact usif you need reasonable accommodation because of a disability in order to apply for a position. The MathWorks, Inc. is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, and other protected characteristics. View TheEEO is the Law posterandits supplement. The pay transparency policy is availablehere. MathWorks participates in E-Verify. View the E-Verify postershere. Apply Now + Job:19042-MKAI + Location:US-MA-Natick + Department:Product Development
          (USA-CA-San Francisco) Sr Automation Test Engineer      Cache   Translate Page   Web Page Cache   
The Federal Reserve Bank of San Francisco’s Information Technology Services organization has an excellent opportunity for a Software Quality Automation Test Engineer. The individual will have successfully implemented QTP, Selenium and JUNIT based test frameworks as well as qualified software components as part of large, highly integrated, enterprise-wide development projects. The position requires strong systems thinking ability, integration test expertise, white and gray box analytical skills, attention to detail and significant experience working with databases and SQL. Job Responsibilities: As a member of the Application Delivery Services team, you will pursue a variety of tasks that include: * Build and improve automated frameworks including the coding of reusable components and functions * Design automated test strategies and guide decisions to successful implementation and completion * Develop, update and automate test cases and train others on standard methodologies for automated test case design * Craft detailed test designs, conditions and have them executed * Provide traceability from business requirements and technical specifications to test conditions and defect reports * Analyze and test user and system interface functionality * Participate in detailed review walkthrough sessions * Track issues to successful resolution * Collect metrics to measure test progress & quality outcomes * Participate in special projects as assigned * Developing and crafting process improvements * Documentation of workflows * Auditing of process compliance & report preparation Then you would: * have an understanding of software development processes and an excellent understanding of the Java technology stack * have broad test automation skills, capability to build from scratch automated test scripts * possess excellent written and verbal interpersonal skills * be a stickler for quality, attention to detail, and maintenance of high standards Job Requirement: * Be a true believer in agile development methodology, experience working with agile tools such as Version One, JIRA * Experience building automated, keyword, data-driven test framework from scatch * Unit Testing using Eclipse/RAD, Selenium, Junit framework * 3 years’ experience in descriptive programming * Authoritative level experience with either SQL or Oracle DBMS required * 5 years SQA experience in complex technical environments * API level testing with SOAPUI * Knowledge of process analysis understanding of workflows, process documentation and principles * Experience with Load testing tools such as LoadUI or JMeter * Degree in computer science or related discipline, or equivalent work experience and training Desired Qualification: * Authoritative level experience with Software QA experience * Java, Net or Python script certifications * Excellent Presentation skills * Web page or content management skills. * U.S. Citizenship, or hold a permanent resident/green card with intent to become a U.S. Citizen is required for this role. At the Federal Reserve Bank of San Francisco, we offer a wonderful benefits package including: Medical, Dental, Vision, Pre-tax Flexible Spending Account, Backup Child Care Program, Pre-tax Day Care Flexible Spending Account, Vacation Days, Sick Days, Paid Holiday’s, Pet Insurance, Matching 401(k), and an unheard-of Retirement / Pension. The Federal Reserve Bank of San Francisco is an Equal Opportunity Employer. Our people proudly reflect the diversity and ideas of the communities we serve. 
**Organization:** **Federal Reserve Bank of San Francisco* **Title:** *Sr Automation Test Engineer* **Location:** *CA-San Francisco* **Requisition ID:** *256679*
          (USA-MA-Cambridge) Training Manager      Cache   Translate Page   Web Page Cache   
Training Manager + Job Number: 16412 + Functional Area: Research - Other + Department: Abdul Latif Jameel Poverty Action Lab + School Area: Humanities, Arts, & Social Sciences + Employment Type: Full-Time + Employment Category: Exempt + Visa Sponsorship Available: No + Schedule: Email a Friend Save Save Apply with LinkedIn Apply Now Working at MIT offers opportunities, an environment, a culture – and benefits – that just aren’t found together anywhere else. If you’re curious, motivated, want to be part of a unique community, and help shape the future – then take a look at this opportunity. TRAINING MANAGER, Abdul Latif Jameel Poverty Action Lab (J-PAL)-Global-Research, Education, and Training (RET) Team, to take a lead role in managing J-PAL’s training portfolio and advising its worldwide offices on trainings. This includes managing the collection and analysis of data on the reach and impact of trainings and their contribution to J-PAL’s mission and working with senior management on strategic planning for the training group. Responsibilities include managing J-PAL Global’s efforts to train researchers, research staff, and policymakers in staff trainings, executive education programs, and online courses; overseeing the work of a team of training staff members who build out data analysis tools and new reference and teaching materials and act as teaching assistants/support for online courses; leading the design, organization, and implementation of in-person courses; representing J-PAL in interactions with high-level government/foundation/NGO policymakers and affiliate researchers; and working closely with regional offices and partner organizations. Job Requirements REQUIRED: master’s in economics, political science, education, public policy, international development, or related field and at least four years’ relevant experience or a Ph.D. and one year of experience (may include graduate school teaching experience); training in and strong technical understanding of randomized impact evaluations; comfort making presentations and preparing tailored teaching materials; experience in academic/non-academic teaching environments and/or conducting field research in developing countries; strong interest in poverty alleviation policy and research; knowledge of data management, data analysis, and econometrics; PowerPoint and Excel proficiency; and good command of Stata or knowledge of R, Python, or other analytics programs and ability/willingness to quickly learn Stata. Experience in management and collaboration on international teams a plus. International development experience or extensive experience with social policy in other settings desired. Job #16412For more information and additional application instructions, please visit https://www.povertyactionlab.org/careers/manager-research-education-and-training-focus-training-j-pal-global-102647. The position involves frequent travel liaise with partners and lead the implementation of J-PAL’s training strategy across the organization. 8/9/18 MIT is an equal employment opportunity employer. All qualified applicants will receive consideration for employment and will not be discriminated against on the basis of race, color, sex, sexual orientation, gender identity, religion, disability, age, genetic information, veteran status, ancestry, or national or ethnic origin.
          (USA-MA-Boston) Software Engineer (Node.js/Python)      Cache   Translate Page   Web Page Cache   
Software engineers on our team work closely with Machine Learning engineers to create a smarter, personalized learning journey for our users. You will be working on a cross-functional team with a Product Manager, UX Designer, DevOps Engineer, Machine Learning Engineers, and Software Engineers. You’ll be part of a team that is user focused, has a mentality for experimentation, and iterates quickly. *Ways we work:* * Autonomous & responsible teams - making their own product & dev choices (https://www.pluralsight.com/tech-blog/team-responsibilty) * Data first - we work with large volumes of data to build scalable solutions for our products * Software Craftsmanship - we want to be proud of our work o Pair programming - we value collaborative development o Test-driven development - we take responsibility for our code without QA engineers o Continuous delivery - teams independently ship code to production every day o Kanban & Lean - no more backlog grooming, no more T-shirt size estimating o Continual improvement - we hold weekly retrospectives in order to assess and improve system processes *What we create with:* * Backend - Node.js/Python * Testing - Mocha/Pytest * Declarative UI - React * Styling - CSS Modules <3 * Messaging - RabbitMQ * Database - PostgreSQL * Source Control - Github * Frameworks - Airflow/TensorFlow EOE Statement Be Yourself. Pluralsight is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.
          ROS Recording, Gazebo Simulation Data II      Cache   Translate Page   Web Page Cache   
Purpose of this project: Create a sample data set from the perspective of an autonomous car in simulation. Usually autonomous cars have lidar and camera sensors. Gazebo can simulate both of them and there is a prebuilt city, vehicles and sensors available... (Budget: €30 - €250 EUR, Jobs: Python, Robotics)
          Comment on Python Resurrects Dot Matrix Printing by cyberteque      Cache   Translate Page   Web Page Cache   
inkjet printers are not occasional use printers, you need to use the damn things a few times a week! laser printers can sit for weeks or months between prints, they don't dry out!
          VIDEO: Terrifying moment escaped python slithers out of grid      Cache   Translate Page   Web Page Cache   
A MOTHER is concerned for her child’s welfare after a snake had to be captured from outside her block of flats.
          Reading the NSA’s codebase: LemonGraph review–Part V–Query parsing      Cache   Translate Page   Web Page Cache   

I said before that I don’t want to get into the details of how LemonGraph is dealing with parsing the queries. Unfortunately, I can’t avoid that. There seems to be a lot of logic, magic and mystery in the MatchLGQL() class, which is critical to understanding how queries work.

The problem is that either my Python-fu is lacking or it is just really hard to figure out the behavior of a non-trivial codebase in a dynamic language like Python. I find it hard to figure out what data is stored where and how it is manipulated. Therefore, I decided to break with my usual custom and actually run the code in the debugger to try to follow what is going on there. I tried to run this on WSL, but it crashed horribly, so I had to spin up a VM and set up PyCharm on it. This is the first time that I'm actually using it, and the experience is pretty nice so far. Being able to inspect things directly means that it is much easier to figure out the behavior of the code.

In order to explore how queries work in LemonGraph, I created the following graph, which represents the relationships between my dogs:

image

Here is how this looks in code:

This tells us to find all the dogs that like each other. And it finds:

  • Arava –> Oscar
  • Oscar –> Arava
  • Oscar –> Pheobe
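
To make the shape of that result concrete without reproducing LemonGraph's actual API, here is a plain-Python sketch of the same traversal over this toy graph; the edge list is taken from the results above:

# A toy, pure-Python stand-in for the dog graph; this is not LemonGraph's API.
edges = [
    ("Arava", "likes", "Oscar"),
    ("Oscar", "likes", "Arava"),
    ("Oscar", "likes", "Pheobe"),
]

# The query walks node -> 'likes' edge -> node, which over this tiny
# edge list amounts to a filter on the edge type.
for src, kind, dst in edges:
    if kind == "likes":
        print(f"{src} -> {dst}")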

Now that we have a query that we can sink our teeth into, let's figure out how this works, shall we? Inside the dreaded MatchLGQL() class, there are all sorts of regular expressions running to parse this thing, but eventually we get to the partially processed parsed query:

image

This screen shot might explain why I wasn't happy with the code structure for figuring out what is going on without debugging. The number of tuples here is quite amazing, and they are used everywhere. This makes static analysis (as in, just reading the code) too hard for me. But with the debugger, that is much easier. If you are familiar with ASTs, this should be pretty easy to figure out.

Here is a piece of code that we already looked at (and criticized); this is the munge_obj() method, where it is deciding how to optimize the query:

image

This piece of code is critical for the performance of the system. And it is really hard to understand. Here is what is going on.

The accel array tells a later piece of code how to accelerate the query, using the type, or the type and value, to start from a particular source. The info is used to carry state about a particular clause in the query. Before this code runs, there is some code that builds the dictionary d, which is used to figure out the filters on the particular clause. This is fun, because it is using a missing-key lookup in the dictionary for control flow.

Let’s follow the logic?

  • Line 2 - If the clause operates on a node, rank it as 6. If it is an edge, rank it as 7.
  • Line 6 – If the clause has a type specified, rank it as 4 if it is a node, 5 if it is an edge. Otherwise, abort the optimization.
    • You might not see the “abort this optimization” in line 6, because it relies on the dictionary to throw if the key isn’t found. This is a common pattern in this code and something that I personally greatly dislike.
  • Line 8 – it uses the length of the type as a metric for secondary ranking. I’m not quite sure why this is the case. I guess the code needed a tie breaker, but I can’t imagine why the length of a type would have any impact on performance.
    • Unless, of course, the code assumes that shorter types are more common, and will therefore prefer to use the rarer longer types?
  • Line 10 – If there is a type and a value defined, that is even better. Note again the ranking of node (2) and edge (3), which I find non-obvious.

Here are the results of the matches after they have been munged; I have marked the ranking:

image

Looking at this, it seems very strange: the rank2 value is 1 in the second element, but I expected it to be the length of the string. As it turns out, this is not working directly on the string; it is working on the tuple of possible values, so the secondary ranking here is not based on the length of the type or the value, but on the number of possible types and values that were specified for each clause.
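
Putting the ranking scheme together as I read it, here is a plain-Python paraphrase of the behaviour described above; it is not the actual LemonGraph source:

# Paraphrase of the seed-ranking rules described above; lower is better.
def rank_clause(is_node, types=(), values=()):
    # No type specified: node -> 6, edge -> 7; nothing to tie-break on.
    rank, rank2 = (6 if is_node else 7), 0
    if types:
        # Type specified: node -> 4, edge -> 5; tie-break on candidate count.
        rank, rank2 = (4 if is_node else 5), len(types)
    if types and values:
        # Type and value specified: node -> 2, edge -> 3.
        rank, rank2 = (2 if is_node else 3), len(values)
    return rank, rank2

# e.g. a typed edge clause ranks (5, 1), which beats an untyped node clause at (6, 0).
print(rank_clause(is_node=False, types=("likes",)))  # (5, 1)
print(rank_clause(is_node=True))                     # (6, 0)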

The code judges that the best place to start this query is with the second entry, since it is the most specific option. This in turn takes us to the seeds() method that we previously covered. In this case, the code is going to hit this branch:

image

This means that it is going to iterate over all the edges of a particular type and filter them in Python code. This is strange, because the on-disk indexes actually support a direct query on (type, value), which would probably be much cheaper when you have many values for a particular edge type.

In fact, just that is implemented for querying nodes by (type, value):

image

I’m guessing that they either don’t have a lot of queries on (type, value) for edges, or don’t have a lot of different values for edge types that they could optimize in this manner.
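
To make the difference between the two seed strategies concrete, here is a small self-contained sketch. The dict-based indexes stand in for the on-disk ones and are purely illustrative; they are not LemonGraph's actual storage API.

# Toy stand-ins for the two kinds of on-disk indexes.
edges_by_type = {
    'likes': [('Arava', 'Oscar', 'strong'), ('Oscar', 'Arava', 'strong'), ('Oscar', 'Pheobe', 'weak')],
}
edges_by_type_value = {
    ('likes', 'weak'): [('Oscar', 'Pheobe', 'weak')],
}

def seeds_by_scan(edge_type, wanted_values):
    # What the edge branch above effectively does: walk every edge of the given
    # type and filter the values in Python.
    for edge in edges_by_type.get(edge_type, ()):
        if edge[2] in wanted_values:
            yield edge

def seeds_by_seek(edge_type, wanted_values):
    # What a (type, value) index allows (and what the node branch gets to do):
    # jump straight to the matching entries, with no Python-side filter needed.
    for value in wanted_values:
        yield from edges_by_type_value.get((edge_type, value), ())

print(list(seeds_by_scan('likes', {'weak'})))   # [('Oscar', 'Pheobe', 'weak')]
print(list(seeds_by_seek('likes', {'weak'})))   # [('Oscar', 'Pheobe', 'weak')]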

That is enough for now; I have a pretty good grasp of how queries are parsed and how they fetch data from the underlying storage. The next post will talk about how LemonGraph takes the seeds of the query and executes the actual graph operations on them. The code that does this is tight and will require a full post to explore properly.


          Comment on What are the responsibilities of pro-feminist men in the Michael Kimmel case? by Alienigena
Are men who claim to be feminist and have a large following just trying to expand the market for their brand (I prefer the term personality cult), that is, are they trying to recruit women as potential consumers? Because I do think a lot of these men are that cynical. Particularly those whose media empires expand to include feminist media outlets, or products. I am trying to not subsidize such men financially by not watching movies or tv shows they finance, produce, direct or act in that support an ideology of violence (gratuitous violence or pointless hetero sex (does not advance plot in any way) in action movies, horror movies, sci-fi, etc.) and not visiting websites owned by men or blogs written by men who supposedly have feminist cred. I don't have a PC imagination so will continue to find certain men humourous (e.g. Monty Python). I think women need to hold all men to account, especially those who make claims on our time, attention, and money (and to whom you are not related, married to, a caregiver of, or in a relationship with).
          Software development specialist - québec, qc - Icentia - Québec City, QC
Knowledge of several programming languages (Python, C, C#, JavaScript, C++, ...), as well as some operating systems (Windows, Unix, Linux ...)....
From Icentia - Thu, 02 Aug 2018 18:05:14 GMT - View all Québec City, QC jobs
          calculator using python
calculator using python

          python2-setuptools-scm 3.1.0-1 any
Handles managing your python package versions in scm metadata.
          python-setuptools-scm 3.1.0-1 any
Handles managing your python package versions in scm metadata.
          python2-stestr 2.1.1-1 any
A test runner runner similar to testrepository
          python-stestr 2.1.1-1 any
A test runner runner similar to testrepository
          SQL DBA Best Online Training - Indore, India
SQL School is one of the best training institutes for Microsoft SQL Server Developer Training, SQL DBA Training, MSBI Training, Power BI Training, Azure Training, Data Science Training, Python Training, Hadoop Training, Tableau Training, Machine Learni...
          The most popular programming languages: Python holds its ground ahead of C++ and C
Python has defended its top spot in the IEEE ranking of the most popular programming languages. C++ and C follow in the next places, while R continues to lose relevance. In the TIOBE index, Python is on the verge of breaking into the top 3 for the first time. The competing ranking of the most popular programming languages from the Institute of Electrical and Electronics Engineers (IEEE) has had Python in front since last year. This year the language defended its top spot, followed at a small distance by C++ and C, as well as Java and C#.

Programming languages: Python gains popularity

The ranking's editors see one reason for Python's rise in the fact that the language is now also listed as an embedded language. The decline of R is likewise credited for Python's growing popularity, as Heise Online reports. After fifth place in 2016 and sixth place last year, R now sits at only seventh place in the 2018 ranking. R and Python compete with each other in big data and machine learning. [Image: Python leads the IEEE programming language ranking. (Screenshot: Spectrum/t3n.de)] In the trending section of the programming language index, Google's language Go stands out, among others, having climbed from seventh to fifth place. Scala made an even bigger jump, from 15th to eighth place. Scala's sharply increased popularity could explain why Java dropped one place overall this year; Scala was developed as an optimization of Java. The likewise popular Java alternative Kotlin, by the way, is not represented in the ranking. For the annually published programming language ranking, the staff of the IEEE magazine Spectrum evaluate search trends, mentions in social media, magazine articles, and job postings, as well as GitHub repositories. Besides Google and Twitter, sources such as Stack Overflow, Reddit, and Hacker News are consulted.
          Validate your users' email with Masonite Framework and JWT

It is very useful to verify and confirm the email addresses of an application's users. Do you love Python & Laravel? This article shows how to implement this verification end to end with Masonite Framework, a framework that will make you enjoy web application development with Python even more.
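
As a rough illustration of the underlying idea, an email-confirmation token can be issued and checked with a generic JWT library such as PyJWT. The secret, expiry and function names below are made-up assumptions for the sketch, not necessarily what the article itself does.

import datetime
import jwt  # PyJWT

SECRET = "change-me"  # illustrative only; keep a real secret out of source code

def make_confirmation_token(email):
    # Short-lived token embedded in the confirmation link emailed to the user.
    payload = {
        "email": email,
        "exp": datetime.datetime.utcnow() + datetime.timedelta(hours=24),
    }
    return jwt.encode(payload, SECRET, algorithm="HS256")

def confirm_token(token):
    # Returns the verified email, or None if the token is invalid or expired.
    try:
        return jwt.decode(token, SECRET, algorithms=["HS256"])["email"]
    except jwt.PyJWTError:
        return None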

Comments
The article Validate your users' email with Masonite Framework and JWT was posted in the Python category of Human Coders News
          Python Developer
NY-New York, Python Developer Grab the opportunity to achieve your full potential! Eclaro is looking for a Python Developer for our client in New York, NY. Eclaro’s client is one of the world's largest financial institutions, committed to providing the tools and services that bridge the gap between customers and their goals. If you’re up to the challenge, then take a chance at this rewarding opportunity! Respo
          Senior Data Analyst - William E. Wecker Associates, Inc. - Jackson, WY
Experience in data analysis and strong computer skills (we use SAS, Stata, R and S-Plus, Python, Perl, Mathematica, and other scientific packages, and standard...
From William E. Wecker Associates, Inc. - Sat, 23 Jun 2018 06:13:20 GMT - View all Jackson, WY jobs
          Mike C. Fletcher: TTFQuery 2.0.0b1 Up on PyPI

TTFQuery has a new release up. This release has a bunch of small breaking changes in it; specifically, the command-line demonstration tools now work differently. It is also now Python 3 ready (i.e. one more package should now be out of the way to get OpenGLContext running under Python 3) and finally has its own (basic) test suite instead of relying on OpenGLContext to exercise it. It's also got a bit more documentation. You should be able to pull it with:

pip3 install 'ttfquery==2.0.0b1'

or, if you want to run the test suite (expect it to fail under Win32 or Mac, as I don't have those):

pip install tox
bzr branch lp:ttfquery
cd ttfquery
tox
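
Once installed, a quick sanity check could look something like the following, assuming the long-standing describe/glyphquery API carries over unchanged to 2.0 (the font path is a placeholder):

from ttfquery import describe, glyphquery

font = describe.openFont("/path/to/somefont.ttf")   # placeholder path
print(describe.shortName(font))                     # short name / family of the font
name = glyphquery.glyphName(font, "q")              # glyph name for the character 'q'
print(name, glyphquery.width(font, name))           # advance width of that glyph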

Enjoy.


          Not Invented Here: CPython vs PyPy Memory Usage

If you have lots of "small" objects in a Python program (objects which have few instance attributes), you may find that the object overhead starts to become considerable. The common wisdom says that to reduce this in CPython you need to re-define the classes to use __slots__, eliminating the attribute dictionary. But this comes with the downsides of limiting flexibility and eliminating the use of class defaults. Would it surprise you to learn that PyPy can significantly, and without any effort by the programmer, reduce that overhead automatically?
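
For reference, the two kinds of classes being compared below are presumably along these lines (a minimal sketch; the benchmark's actual attribute names may differ):

class Point3D(object):
    # A regular class: each instance carries a __dict__ for its attributes.
    def __init__(self, x, y, z):
        self.x = x
        self.y = y
        self.z = z

class Point3DSlot(object):
    # Same data, but __slots__ removes the per-instance __dict__ in CPython,
    # at the cost of flexibility (no ad-hoc attributes, no class-level defaults).
    __slots__ = ('x', 'y', 'z')
    def __init__(self, x, y, z):
        self.x = x
        self.y = y
        self.z = z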

Let's take a look.

Contrary to advice, instead of starting at the very beginning, we'll jump right to the end. The following graph shows the peak memory usage of the example program we'll be talking about in this post across seven different Python implementations: PyPy2 v6.0, PyPy3 v6.0, CPython 2.7.15, 3.4.9, 3.5.6, 3.6.6, and 3.7.0 [1].

For regular objects ("Point3D"), PyPy needs less than 700MB to create 10,000,000 of them, whereas CPython 2.7 needs almost 3.5 GB, and CPython 3.x needs between 1.5 and 2.1 GB [6]. Moving to __slots__ ("Point3DSlot") brings the CPython overhead closer to (but still higher than) that of PyPy. In particular, note that the PyPy memory usage is essentially the same whether or not slots are used.

[Bar chart: peak memory usage in MB for Point3D, Point3DSlot, and Point3DSlot Uncached Integers across PyPy2, PyPy3, and CPython 2.7, 3.4, 3.5, 3.6, and 3.7.]

The third group of data is the same as the second group, except instead of using small integers that should be in the CPython internal integer object cache [7], I used larger numbers that shouldn't be cached. This is just an interesting data point showing the allocati