Build a Java REST API with Java EE and OIDC

Java EE allows you to build Java REST APIs quickly and easily with JAX-RS and JPA. Java EE is an umbrella standards specification that describes a number of Java technologies, including EJB, JPA, JAX-RS, and many others. It was originally designed to allow portability between Java application servers, and flourished in the early 2000s. Back then, application servers were all the rage and provided by many well-known companies such as IBM, BEA, and Sun. JBoss was a startup that disrupted the status quo and showed it was possible to develop a Java EE application server as an open source project, and give it away for free. JBoss was bought by Red Hat in 2006.

In the early 2000s, Java developers used servlets and EJBs to develop their server applications. Hibernate and Spring came along in 2002 and 2004, respectively. Both technologies had a huge impact on Java developers everywhere, showing them it was possible to write distributed, robust applications without EJBs. Hibernate’s POJO model was eventually adopted as the JPA standard and heavily influenced EJB as well.

Fast forward to 2018, and Java EE certainly doesn’t look like it used to! Now, it’s mostly POJOs and annotations and far simpler to use.

Why Build a Java REST API with Java EE and Not Spring Boot?

Spring Boot is one of my favorite technologies in the Java ecosystem. It’s drastically reduced the configuration necessary in a Spring application and made it possible to whip up REST APIs in just a few lines of code. However, I’ve had a lot of API security questions lately from developers that aren’t using Spring Boot. Some of them aren’t even using Spring!

For this reason, I thought it’d be fun to build a Java REST API (using Java EE) that’s the same as a Spring Boot REST API I developed in the past. Namely, the “good-beers” API from my Bootiful Angular and Bootiful React posts.

Use Java EE to Build Your Java REST API

To begin, I asked my network on Twitter if any quickstarts existed for Java EE like start.spring.io. I received a few suggestions and started doing some research. David Blevins recommended I look at tomee-jaxrs-starter-project, so I started there. I also looked into the TomEE Maven Archetype, as recommended by Roberto Cortez.

I liked the jaxrs-starter project because it showed how to create a REST API with JAX-RS. The TomEE Maven archetype was helpful too, especially since it showed how to use JPA, H2, and JSF. I combined the two to create my own minimal starter that you can use to implement secure Java EE APIs on TomEE. You don’t have to use TomEE for these examples, but I haven’t tested them on other implementations.

If you get these examples working on other app servers, please let me know and I’ll update this blog post.

In these examples, I'll be using Java 8 and Java EE 7.0 with TomEE 7.1.0. TomEE 7.x is the EE 7-compatible version; a TomEE 8.x branch exists for EE 8 compatibility work, but there are no releases yet. You'll also need Apache Maven installed.

To begin, clone our Java EE REST API repository to your hard drive, and run it:

git clone https://github.com/oktadeveloper/okta-java-ee-rest-api-example.git javaee-rest-api
cd javaee-rest-api
mvn package tomee:run

Navigate to http://localhost:8080 and add a new beer.

Add beer

Click Add and you should see a success message.

Add beer success

Click View beers present to see the full list of beers.

Beers present

You can also view the list of good beers in the system at http://localhost:8080/good-beers. Below is the output when using HTTPie.

$ http :8080/good-beers
HTTP/1.1 200
Content-Type: application/json
Date: Wed, 29 Aug 2018 21:58:23 GMT
Server: Apache TomEE
Transfer-Encoding: chunked
[
    {
        "id": 101,
        "name": "Kentucky Brunch Brand Stout"
    },
    {
        "id": 102,
        "name": "Marshmallow Handjee"
    },
    {
        "id": 103,
        "name": "Barrel-Aged Abraxas"
    },
    {
        "id": 104,
        "name": "Heady Topper"
    },
    {
        "id": 108,
        "name": "White Rascal"
    }
]

Build a REST API with Java EE

I showed you what this application can do, but I haven’t talked about how it’s built. It has a few XML configuration files, but I’m going to skip over most of those. Here’s what the directory structure looks like:

$ tree .
.
├── LICENSE
├── README.md
├── pom.xml
└── src
    ├── main
    │   ├── java
    │   │   └── com
    │   │       └── okta
    │   │           └── developer
    │   │               ├── Beer.java
    │   │               ├── BeerBean.java
    │   │               ├── BeerResource.java
    │   │               ├── BeerService.java
    │   │               └── StartupBean.java
    │   ├── resources
    │   │   └── META-INF
    │   │       └── persistence.xml
    │   └── webapp
    │       ├── WEB-INF
    │       │   ├── beans.xml
    │       │   └── faces-config.xml
    │       ├── beer.xhtml
    │       ├── index.jsp
    │       └── result.xhtml
    └── test
        └── resources
            └── arquillian.xml

12 directories, 16 files

The most important XML file is pom.xml, which defines dependencies and allows you to run the TomEE Maven Plugin. It's pretty short and sweet, with only one dependency and one plugin.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.okta.developer</groupId>
    <artifactId>java-ee-rest-api</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>war</packaging>
    <name>Java EE Webapp with JAX-RS API</name>
    <url>http://developer.okta.com</url>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <maven.compiler.target>1.8</maven.compiler.target>
        <maven.compiler.source>1.8</maven.compiler.source>
        <failOnMissingWebXml>false</failOnMissingWebXml>
        <javaee-api.version>7.0</javaee-api.version>
        <tomee.version>7.1.0</tomee.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>javax</groupId>
            <artifactId>javaee-api</artifactId>
            <version>${javaee-api.version}</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.tomee.maven</groupId>
                <artifactId>tomee-maven-plugin</artifactId>
                <version>${tomee.version}</version>
                <configuration>
                    <context>ROOT</context>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>

The main entity is Beer.java.

package com.okta.developer;

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;

@Entity
public class Beer {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private int id;
    private String name;

    public Beer() {}

    public Beer(String name) {
        this.name = name;
    }

    public int getId() {
        return id;
    }

    public void setId(int id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String beerName) {
        this.name = beerName;
    }

    @Override
    public String toString() {
        return "Beer{" +
                "id=" + id +
                ", name='" + name + '\'' +
                '}';
    }
}

The database (a.k.a., datasource) is configured in src/main/resources/META-INF/persistence.xml.

<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0" xmlns="http://java.sun.com/xml/ns/persistence"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
    <persistence-unit name="beer-pu" transaction-type="JTA">
        <jta-data-source>beerDatabase</jta-data-source>
        <class>com.okta.developer.Beer</class>
        <properties>
            <property name="openjpa.jdbc.SynchronizeMappings" value="buildSchema(ForeignKeys=true)"/>
        </properties>
    </persistence-unit>
</persistence>

The BeerService.java class handles reading and saving this entity to the database using JPA’s EntityManager.

package com.okta.developer;

import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.Query;
import javax.persistence.criteria.CriteriaQuery;
import java.util.List;

@Stateless
public class BeerService {

    @PersistenceContext(unitName = "beer-pu")
    private EntityManager entityManager;

    public void addBeer(Beer beer) {
        entityManager.persist(beer);
    }

    public List<Beer> getAllBeers() {
        CriteriaQuery<Beer> cq = entityManager.getCriteriaBuilder().createQuery(Beer.class);
        cq.select(cq.from(Beer.class));
        return entityManager.createQuery(cq).getResultList();
    }

    public void clear() {
        Query removeAll = entityManager.createQuery("delete from Beer");
        removeAll.executeUpdate();
    }
}

There’s a StartupBean.java that handles populating the database on startup, and clearing it on shutdown.

package com.okta.developer;

import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import javax.ejb.Singleton;
import javax.ejb.Startup;
import javax.inject.Inject;
import java.util.stream.Stream;

@Singleton
@Startup
public class StartupBean {
    private final BeerService beerService;

    @Inject
    public StartupBean(BeerService beerService) {
        this.beerService = beerService;
    }

    @PostConstruct
    private void startup() {
        // Top beers from https://www.beeradvocate.com/lists/top/
        Stream.of("Kentucky Brunch Brand Stout", "Marshmallow Handjee", 
                "Barrel-Aged Abraxas", "Heady Topper",
                "Budweiser", "Coors Light", "PBR").forEach(name ->
                beerService.addBeer(new Beer(name))
        );
        beerService.getAllBeers().forEach(System.out::println);
    }

    @PreDestroy
    private void shutdown() {
        beerService.clear();
    }
}

These three classes make up the foundation of the app, plus there’s a BeerResource.java class that uses JAX-RS to expose the /good-beers endpoint.

package com.okta.developer;

import javax.ejb.Lock;
import javax.ejb.Singleton;
import javax.inject.Inject;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import java.util.List;
import java.util.stream.Collectors;

import static javax.ejb.LockType.READ;
import static javax.ws.rs.core.MediaType.APPLICATION_JSON;

@Lock(READ)
@Singleton
@Path("/good-beers")
public class BeerResource {
    private final BeerService beerService;

    @Inject
    public BeerResource(BeerService beerService) {
        this.beerService = beerService;
    }

    @GET
    @Produces({APPLICATION_JSON})
    public List<Beer> getGoodBeers() {
        return beerService.getAllBeers().stream()
                .filter(this::isGreat)
                .collect(Collectors.toList());
    }

    private boolean isGreat(Beer beer) {
        return !beer.getName().equals("Budweiser") &&
                !beer.getName().equals("Coors Light") &&
                !beer.getName().equals("PBR");
    }
}
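The filtering in getGoodBeers() is plain Java streams, so it's easy to see in isolation. Below is a self-contained sketch of the same logic; the GoodBeerFilter class and its main method are illustrative stand-ins, not part of the project.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Standalone sketch of BeerResource's filter: everything except
// Budweiser, Coors Light, and PBR counts as a "good" beer.
public class GoodBeerFilter {
    private static final List<String> NOT_GREAT =
            Arrays.asList("Budweiser", "Coors Light", "PBR");

    public static List<String> goodBeers(List<String> names) {
        return names.stream()
                .filter(name -> !NOT_GREAT.contains(name))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> all = Arrays.asList("Kentucky Brunch Brand Stout",
                "Budweiser", "Heady Topper", "PBR");
        System.out.println(goodBeers(all));
        // prints [Kentucky Brunch Brand Stout, Heady Topper]
    }
}
```

This is why the JSON output earlier skips some ids: the macro lagers seeded by StartupBean never make it past the filter.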

Lastly, there's a BeerBean.java class that's used as a managed bean for JSF.

package com.okta.developer;

import javax.enterprise.context.RequestScoped;
import javax.inject.Inject;
import javax.inject.Named;
import java.util.List;

@Named
@RequestScoped
public class BeerBean {

    @Inject
    private BeerService beerService;
    private List<Beer> beersAvailable;
    private String name;

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public List<Beer> getBeersAvailable() {
        return beersAvailable;
    }

    public void setBeersAvailable(List<Beer> beersAvailable) {
        this.beersAvailable = beersAvailable;
    }

    public String fetchBeers() {
        beersAvailable = beerService.getAllBeers();
        return "success";
    }

    public String add() {
        Beer beer = new Beer();
        beer.setName(name);
        beerService.addBeer(beer);
        return "success";
    }
}

You now have a REST API built with Java EE! However, it’s not secure. In the following sections, I’ll show you how to secure it using Okta’s JWT Verifier for Java, Spring Security, and Pac4j.

Add OIDC Security with Okta to Your Java REST API

You will need to create an OIDC application in Okta to verify that the security configurations you're about to implement work. To make this effortless, you can use Okta's API for OIDC. At Okta, our goal is to make identity management easier, more secure, and more scalable than what you're used to. Okta is a cloud service that allows developers to create, edit, and securely store user accounts and user account data, and connect them with one or multiple applications.

Are you sold? Register for a forever-free developer account today! When you’re finished, complete the steps below to create an OIDC app.

  1. Log in to your developer account on developer.okta.com.
  2. Navigate to Applications and click on Add Application.
  3. Select Web and click Next.
  4. Give the application a name (e.g., Java EE Secure API) and add the following as Login redirect URIs:
    • http://localhost:3000/implicit/callback
    • http://localhost:8080/login/oauth2/code/okta
    • http://localhost:8080/callback?client_name=OidcClient
  5. Click Done, then edit the project and enable “Implicit (Hybrid)” as a grant type (allow ID and access tokens) and click Save.

Protect Your Java REST API with JWT Verifier

To validate JWTs from Okta, you’ll need to add Okta JWT Verifier for Java to your pom.xml.

<properties>
    ...
    <okta-jwt.version>0.3.0</okta-jwt.version>
</properties>

<dependencies>
    ...
    <dependency>
        <groupId>com.okta.jwt</groupId>
        <artifactId>okta-jwt-verifier</artifactId>
        <version>${okta-jwt.version}</version>
    </dependency>
</dependencies>

Then create a JwtFilter.java (in the src/main/java/com/okta/developer directory). This filter looks for an authorization header with an access token in it. If one exists, the filter validates it and prints out the user's sub, a.k.a. their email address. If it doesn't exist, or is invalid, an access denied status is returned.

Make sure to replace {yourOktaDomain} and {yourClientId} with the settings from the app you created.

package com.okta.developer;

import com.nimbusds.oauth2.sdk.ParseException;
import com.okta.jwt.JoseException;
import com.okta.jwt.Jwt;
import com.okta.jwt.JwtHelper;
import com.okta.jwt.JwtVerifier;

import javax.servlet.*;
import javax.servlet.annotation.WebFilter;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;

@WebFilter(filterName = "jwtFilter", urlPatterns = "/*")
public class JwtFilter implements Filter {
    private JwtVerifier jwtVerifier;

    @Override
    public void init(FilterConfig filterConfig) {
        try {
            jwtVerifier = new JwtHelper()
                    .setIssuerUrl("https://{yourOktaDomain}/oauth2/default")
                    .setClientId("{yourClientId}")
                    .build();
        } catch (IOException | ParseException e) {
            System.err.print("Configuring JWT Verifier failed!");
            e.printStackTrace();
        }
    }

    @Override
    public void doFilter(ServletRequest servletRequest, ServletResponse servletResponse,
                         FilterChain chain) throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) servletRequest;
        HttpServletResponse response = (HttpServletResponse) servletResponse;
        System.out.println("In JwtFilter, path: " + request.getRequestURI());

        // Get access token from authorization header
        String authHeader = request.getHeader("authorization");
        if (authHeader == null) {
            response.sendError(HttpServletResponse.SC_UNAUTHORIZED, "Access denied.");
            return;
        } else {
            String accessToken = authHeader.substring(authHeader.indexOf("Bearer ") + 7);
            try {
                Jwt jwt = jwtVerifier.decodeAccessToken(accessToken);
                System.out.println("Hello, " + jwt.getClaims().get("sub"));
            } catch (JoseException e) {
                e.printStackTrace();
                response.sendError(HttpServletResponse.SC_UNAUTHORIZED, "Access denied.");
                return;
            }
        }

        chain.doFilter(request, response);
    }

    @Override
    public void destroy() {
    }
}
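One thing to watch in doFilter(): the substring logic assumes the header actually contains "Bearer ". If the scheme is anything else, indexOf() returns -1 and the extracted token is garbage. A slightly more defensive extraction might look like the sketch below (my own helper, not part of the original project):

```java
// Sketch: defensively extract a bearer token from an Authorization header value.
// Returns null when the header is missing, uses a different scheme, or is empty.
public class BearerToken {
    public static String extract(String authHeader) {
        if (authHeader == null) {
            return null;
        }
        String prefix = "Bearer ";
        if (!authHeader.startsWith(prefix)) {
            return null;
        }
        String token = authHeader.substring(prefix.length()).trim();
        return token.isEmpty() ? null : token;
    }

    public static void main(String[] args) {
        System.out.println(extract("Bearer abc.def.ghi")); // abc.def.ghi
        System.out.println(extract("Basic dXNlcg=="));     // null
    }
}
```

A null return would then map to the same SC_UNAUTHORIZED response the filter already sends.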

To ensure this filter is working, restart your app and run:

mvn package tomee:run

If you navigate to http://localhost:8080/good-beers in your browser, you’ll see an access denied error.

TomEE 401 access denied

To prove it works with a valid JWT, you can clone my Bootiful React project, and run its UI:

git clone -b okta https://github.com/oktadeveloper/spring-boot-react-example.git bootiful-react
cd bootiful-react/client
npm install

Edit this project’s client/src/App.tsx file and change the issuer and clientId to match your application.

const config = {
  issuer: 'https://{yourOktaDomain}/oauth2/default',
  redirectUri: window.location.origin + '/implicit/callback',
  clientId: '{yourClientId}'
};

Then start it:

npm start

You should then be able to log in at http://localhost:3000 with the credentials you used to create your account. However, you won't be able to load any beers from the API because of a CORS error (visible in your browser's developer console).

Failed to load http://localhost:8080/good-beers: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:3000' is therefore not allowed access.

TIP: If you see a 401 and no CORS error, it likely means your client IDs don’t match.

To fix this CORS error, add a CorsFilter.java alongside your JwtFilter.java class. The filter below allows OPTIONS requests and sends back access-control headers that allow the http://localhost:3000 origin, GET methods, and any headers. I recommend making these settings more specific in production.

package com.okta.developer;

import javax.servlet.*;
import javax.servlet.annotation.WebFilter;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;

@WebFilter(filterName = "corsFilter")
public class CorsFilter implements Filter {

    @Override
    public void doFilter(ServletRequest servletRequest, ServletResponse servletResponse, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) servletRequest;
        HttpServletResponse response = (HttpServletResponse) servletResponse;
        System.out.println("In CorsFilter, method: " + request.getMethod());

        // Authorize (allow) all domains to consume the content
        response.addHeader("Access-Control-Allow-Origin", "http://localhost:3000");
        response.addHeader("Access-Control-Allow-Methods", "GET");
        response.addHeader("Access-Control-Allow-Headers", "*");

        // For HTTP OPTIONS verb/method reply with ACCEPTED status code -- per CORS handshake
        if (request.getMethod().equals("OPTIONS")) {
            response.setStatus(HttpServletResponse.SC_ACCEPTED);
            return;
        }

        // pass the request along the filter chain
        chain.doFilter(request, response);
    }

    @Override
    public void init(FilterConfig config) {
    }

    @Override
    public void destroy() {
    }
}

Both of the filters you've added use @WebFilter to register themselves. This is a convenient annotation, but it doesn't provide any filter ordering capabilities. To work around this missing feature, modify JwtFilter so it doesn't have a urlPattern in its @WebFilter.

@WebFilter(filterName = "jwtFilter")

Then create a src/main/webapp/WEB-INF/web.xml file and populate it with the following XML. These filter mappings ensure the CorsFilter is processed first.

<?xml version="1.0" encoding="UTF-8"?>
<web-app version="3.1"
         xmlns="http://xmlns.jcp.org/xml/ns/javaee"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/web-app_3_1.xsd">

    <filter-mapping>
        <filter-name>corsFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>

    <filter-mapping>
        <filter-name>jwtFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>
</web-app>

Restart your Java API and now everything should work!

Beer List in React UI

In your console, you should see messages similar to mine:

In CorsFilter, method: OPTIONS
In CorsFilter, method: GET
In JwtFilter, path: /good-beers
Hello, demo@okta.com

Using a filter with Okta’s JWT Verifier is an easy way to implement a resource server (in OAuth 2.0 nomenclature). However, it doesn’t provide you with any information about the user. The JwtVerifier interface does have a decodeIdToken(String idToken, String nonce) method, but you’d have to pass the ID token in from your client to use it.

In the next two sections, I’ll show you how you can use Spring Security and Pac4j to implement similar security. As a bonus, I’ll show you how to prompt the user to login (when they try to access the API directly) and get the user’s information.

Secure Your Java REST API with Spring Security

Spring Security is one of my favorite frameworks in Javaland. Most of the examples on this blog use Spring Boot when showing how to use Spring Security. I’m going to use the latest version – 5.1.0.RC2 – so this tutorial stays up to date for a few months.

Revert the changes you made to add JWT Verifier, or simply delete web.xml, to continue.

Modify your pom.xml to have the necessary dependencies for Spring Security. You’ll also need to add Spring’s snapshot repositories to get the release candidate.

<properties>
    ...
    <spring-security.version>5.1.0.RC2</spring-security.version>
    <spring.version>5.1.0.RC3</spring.version>
    <jackson.version>2.9.6</jackson.version>
</properties>

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-framework-bom</artifactId>
            <version>${spring.version}</version>
            <scope>import</scope>
            <type>pom</type>
        </dependency>
        <dependency>
            <groupId>org.springframework.security</groupId>
            <artifactId>spring-security-bom</artifactId>
            <version>${spring-security.version}</version>
            <scope>import</scope>
            <type>pom</type>
        </dependency>
    </dependencies>
</dependencyManagement>

<dependencies>
    ...
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-webmvc</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.security</groupId>
        <artifactId>spring-security-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.security</groupId>
        <artifactId>spring-security-config</artifactId>
    </dependency>
</dependencies>

Build a Basic CRUD App in Android with Kotlin

Kotlin was recently given official Android support status by Google, but it remains difficult to understand for many developers. The best way to start is by creating a complete app yourself, which you'll do in this tutorial. You'll use Spring Boot for the API that powers your Android (+ Kotlin) mobile app. Spring Boot is a great way to create a robust REST API with a minimal amount of code.

I’m going to assume you have some Java experience and have at least played around with creating an Android app. If you don’t have any Android experience you should be able to follow along but you might have to Google a few things here and there.

Here is the complete code if you'd rather go straight to the end.

Before we start, let’s talk a bit about Kotlin.

Kotlin vs Java

Kotlin looks strange to newcomers. It resembles other languages you may have seen but some things look off, often because it is so concise!

Don’t panic - because it is so extensible there are many ways to write the same code, and many shortcuts that aren’t available in other languages. For example, often you’ll see curly brackets used as function parameters:

dialogBuilder.setPositiveButton("Delete", { dialog, whichButton ->
    deleteMovie(movie)
})

This is actually creating an anonymous function (a lambda) and passing it in. The function takes two parameters, whose types are inferred here. Take a look at the equivalent (pre-Java 8) Java code:

dialogBuilder.setPositiveButton("Delete",
    new DialogInterface.OnClickListener() {
        public void onClick(DialogInterface dialog, int which) {
            deleteMovie(movie);
        }
    }
);

(Of course, now Java 8 has lambdas too).
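Since OnClickListener has a single abstract method, the Java 8 lambda collapses the same way Kotlin's does. Here's a self-contained sketch of that SAM conversion; the listener interface and setPositiveButton are minimal stand-ins for the Android types, not the real API.

```java
// Minimal stand-in for Android's DialogInterface.OnClickListener,
// just to demonstrate the single-abstract-method (SAM) conversion.
interface OnClickListener {
    void onClick(String dialog, int which);
}

public class LambdaDemo {
    public static String lastClicked;

    // Stand-in for dialogBuilder.setPositiveButton: immediately "presses" the button.
    static void setPositiveButton(String label, OnClickListener listener) {
        listener.onClick("dialog", 0);
    }

    public static void main(String[] args) {
        // Java 8 lambda, equivalent to the anonymous-class version above
        setPositiveButton("Delete", (dialog, which) -> lastClicked = "Delete");
        System.out.println(lastClicked); // Delete
    }
}
```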

Here is another example of some code we will use in a bit:

class MovieViewHolder(val view: View) : RecyclerView.ViewHolder(view)

In order to understand this you have to know several things:

  • Declaring a class with parentheses (i.e. (view: View)) means you are declaring the class's primary constructor (and yes, there are secondary constructors as well).
  • The colon : works like Java's implements or extends, depending on whether what follows is an interface or a class.
  • Anything declared val or var in the primary constructor is automatically a property (member variable).

For clarity, this is the equivalent Java:

public static class MovieViewHolder extends RecyclerView.ViewHolder {
    public final View view;
    public MovieViewHolder(View v) {
        super(v);
        view = v;
   }
}

As a last example, look at the following bean:

package demo

data class Movie( val id: Int, val name: String )

That is the complete file. It declares a class with a constructor and two read-only properties (member variables), which are assigned in the constructor. The data modifier then generates equals(), hashCode(), toString(), and copy() for you (see here if you want to see it in its full Java glory).
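To make the comparison concrete, here's a simplified sketch of roughly what that one Kotlin line expands to in Java (the real generated code also includes copy() and componentN() methods, and a null-safe equals on name):

```java
// Simplified Java equivalent of: data class Movie(val id: Int, val name: String)
public class Movie {
    private final int id;
    private final String name;

    public Movie(int id, String name) {
        this.id = id;
        this.name = name;
    }

    public int getId() { return id; }
    public String getName() { return name; }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Movie)) return false;
        Movie other = (Movie) o;
        return id == other.id && name.equals(other.name);
    }

    @Override
    public int hashCode() {
        return 31 * id + name.hashCode();
    }

    @Override
    public String toString() {
        return "Movie(id=" + id + ", name=" + name + ")";
    }
}
```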

Now that you’ve got some background, let’s get started!

Create the Spring Boot API for Your Android + Kotlin Project

The official Spring Boot tutorials suggest you use the Initializr Website to create a starting skeleton but I find it easier to build projects from scratch.

To start, initialize an empty directory with Gradle (make sure you’ve installed Gradle and that it’s available on the command line).

C:\Users\Karl\Kotlin-Spring>gradle init

BUILD SUCCESSFUL in 3s
2 actionable tasks: 2 executed
C:\Users\Karl\Kotlin-Spring>

You should have two directories and six files.

.
├── build.gradle
├── gradle
│   └── wrapper
│       ├── gradle-wrapper.jar
│       └── gradle-wrapper.properties
├── gradlew
├── gradlew.bat
└── settings.gradle

2 directories, 6 files

Now change build.gradle to the following:

buildscript {
    ext.kotlin_version = '1.2.61' // Required for Kotlin integration
    ext.spring_boot_version = '2.0.2.RELEASE'
    repositories {
        jcenter()
    }
    dependencies {
        // Required for Kotlin integration
        classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
        // See https://kotlinlang.org/docs/reference/compiler-plugins.html#kotlin-spring-compiler-plugin         
        classpath "org.jetbrains.kotlin:kotlin-allopen:$kotlin_version"
        classpath("org.jetbrains.kotlin:kotlin-noarg:$kotlin_version")
        classpath "org.springframework.boot:spring-boot-gradle-plugin:$spring_boot_version"
    }
}

// Required for Kotlin integration
apply plugin: 'kotlin'
// See https://kotlinlang.org/docs/reference/compiler-plugins.html#kotlin-spring-compiler-plugin
apply plugin: "kotlin-spring" 
apply plugin: 'kotlin-jpa'
apply plugin: 'org.springframework.boot'
apply plugin: 'io.spring.dependency-management'

jar {
    baseName = 'kotlin-demo'
    version = '0.1.0-SNAPSHOT'
}

repositories {
    jcenter()
}

dependencies {
    // Required for Kotlin integration
    compile "org.jetbrains.kotlin:kotlin-stdlib:$kotlin_version"
    compile "org.jetbrains.kotlin:kotlin-reflect" // For reflection
    compile 'org.springframework.boot:spring-boot-starter-data-rest'
    compile 'org.springframework.boot:spring-boot-starter-data-jpa'
    compile 'com.h2database:h2'
}

Here the Kotlin and Spring Boot plugins are imported, external repositories are declared, and dependency libraries are added.

If you haven't used Spring Boot before, you should know that it (or rather the Spring Framework) uses dependency injection at runtime. This means the entire application is wired up automatically based on the libraries you import. For example, at the end of our build.gradle you'll see the Data REST and Data JPA libraries. Spring Boot will automatically configure your application as a REST server when it sees these two libraries. Furthermore, since you included the H2 database library, Spring will use the H2 database engine to persist your REST data.

All you need for a complete REST application is to define a class with the @SpringBootApplication annotation. You don’t even need to specify its path - Spring will search for it!

Put the following into src/main/kotlin/demo/Application.kt:

package demo

import org.springframework.boot.SpringApplication
import org.springframework.boot.autoconfigure.SpringBootApplication

@SpringBootApplication
class Application

fun main(args: Array<String>) {
    SpringApplication.run(Application::class.java, *args)
}

Now if you run gradlew bootRun (./gradlew bootRun on *nix) everything should build (and download) and you should see somewhere in the enormous log Started Application. Now run curl in another window to see what is happening.

C:\Users\Karl>curl localhost:8080
{
  "_links" : {
    "profile" : {
      "href" : "http://localhost:8080/profile"
    }
  }
}

Amazingly, you’ve created a fully compliant REST server with Kotlin, all by editing just two files!

Add Objects with Kotlin

To create objects you just need the entity class and a repository.

Next to Application.kt put the following into Model.kt

package demo

import javax.persistence.*

@Entity
data class Movie(@Id @GeneratedValue(strategy = GenerationType.IDENTITY)
                 val Id: Long,
                 val name: String)

Here you’ve used the data class idiom to create getters and setters for all the properties, as well as JPA annotations to specify how to generate the ids for your entity.

Note: The Id field must start with a capital I. If it doesn’t, the server won’t return the id field when doing queries. This will give you trouble down the line when hooking up to the client app.
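The `data` modifier buys you more than accessors: `equals`/`hashCode`, `toString`, and `copy` are all generated for you. Here's a minimal standalone sketch (plain Kotlin, no JPA, with an illustrative `Film` class that is not part of the app) showing what that means in practice:

```kotlin
// Plain data class, independent of the JPA entity above.
data class Film(val id: Long, val name: String)

fun main() {
    val a = Film(1, "Skyfall")
    val b = Film(1, "Skyfall")

    // Structural equality and a readable toString() are generated.
    check(a == b)
    check(a.toString() == "Film(id=1, name=Skyfall)")

    // copy() creates a modified instance without mutating the original.
    val renamed = a.copy(name = "Spectre")
    check(renamed.id == 1L && renamed.name == "Spectre")
    check(a.name == "Skyfall")
}
```

In Java, each of those members would be boilerplate you'd have to write (or generate) yourself.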

Now put this into Repository.kt:

package demo

import org.springframework.data.repository.CrudRepository

interface ItemRepository : CrudRepository<Movie, Long>

And you’ve done it! Incredibly, you can now perform any CRUD operation against this server and it will work, persisting all changes to the database.

C:\Users\Karl>curl -X POST -H "Content-Type:application/json" -d " {\"name\":\"The 40 Year Old Virgin\"} " localhost:8080/movies
{
  "name" : "The 40 Year Old Virgin",
  "_links" : {
    "self" : {
      "href" : "http://localhost:8080/movies/1"
    },
    "item" : {
      "href" : "http://localhost:8080/movies/1"
    }
  }
}
C:\Users\Karl>curl localhost:8080/movies/1
{
  "name" : "The 40 Year Old Virgin",
  "_links" : {
    "self" : {
      "href" : "http://localhost:8080/movies/1"
    },
    "item" : {
      "href" : "http://localhost:8080/movies/1"
    }
  }
}

Load Initial Data in your Kotlin App

To finish up, let’s load some data. Again, as with Spring Boot everything can be done simply. Just put the following into src/main/resources/data.sql and it will be run on boot.

INSERT INTO movie (name) VALUES
  ('Skyfall'),
  ('Casino Royale'),
  ('Spectre');

To confirm it works, restart the server and run curl localhost:8080/movies.

And you are done with the back-end. Time to build out the client.

Build Your Android App with Kotlin

This will require a couple of steps: First you’ll create an empty Kotlin app with Android Studio. You’ll then create a list view (with add, edit and delete buttons) using RecyclerView, populating it with hard-coded data. Finally, you’ll use Retrofit to wire the view to the REST back-end you’ve just created.

Create a project in Android Studio, making sure you’re using at least Android Studio 3. Use the default values for each window, except make sure you include Kotlin support. Name the project whatever you want - I called mine “Kotlin Crud”. At the end, select an Empty Activity.

When you press Play on the top icon bar you should see Hello World (you can either plug in your phone or run an emulator - check online for how to set this up).

Hello World in Android

If you’ve made an Android app before using Java you’ll notice the only difference is the main activity: it’s called MainActivity.kt, not MainActivity.java, and the code looks a bit different.

package demo

import android.support.v7.app.AppCompatActivity
import android.os.Bundle

class MainActivity : AppCompatActivity() {

   override fun onCreate(savedInstanceState: Bundle?) {
       super.onCreate(savedInstanceState)
       setContentView(R.layout.activity_main)
   }
}

Here are the differences:

  1. The class is not specified as public (in Kotlin this is the default)
  2. Types are specified with a colon : - the class is of type AppCompatActivity (or rather it implements AppCompatActivity, as you would say in Java) and the savedInstanceState is of type Bundle
  3. Methods are just called fun instead of function
  4. override is not an annotation
  5. The question mark means a parameter is optional (which is not possible in Java)

The last point is one of the most talked about when discussing the importance of Kotlin vs Java: it’s one of the various ways the language ensures null safety.
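To see that null safety in action, here's a minimal sketch (the function name is illustrative) showing nullable types, the safe-call operator `?.`, and the Elvis operator `?:`:

```kotlin
// '?' marks a type as nullable; passing null to a plain String
// parameter would be a compile-time error.
fun greeting(name: String?): String {
    // Safe call (?.) returns null if the receiver is null;
    // Elvis (?:) supplies a fallback value in that case.
    val trimmed = name?.trim() ?: "world"
    return "Hello, $trimmed!"
}

fun main() {
    check(greeting("Karl ") == "Hello, Karl!")
    check(greeting(null) == "Hello, world!")
}
```

The compiler forces you to handle the null case before dereferencing, which is how Kotlin turns Java's NullPointerException into a compile-time concern.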

Import Additional Android Libraries

You need to add extra libraries to your application’s build.gradle file: one for the recycler view (which you’ll use in a second), one for the card view, and another for the floating action button. Place these next to the others in the dependencies section.

implementation 'com.android.support:design:27.1.1'
implementation 'com.android.support:cardview-v7:27.1.1'
implementation 'com.android.support:recyclerview-v7:27.1.1'

Android Studio should ask you to Sync Now. Click that and see that everything builds without error.

Note: Make sure the version is the same as the other support libraries (e.g. appcompat-v7:27.1.1). Also, because you’ll be using built-in icons (which you should avoid doing in the future) you need to put the following into the defaultConfig section of your build.gradle as well.

vectorDrawables.useSupportLibrary = true

Add Icons in Kotlin

You’ll need some icons for buttons - one for add and another for refresh. Go to the Material Icons site and select the one you like. I’m choosing the add button about halfway down. When you click on it, a grey and blue download section should appear at the bottom left. Click the grey Selected Icons control to open the download options. There should now be a drop-down where you can select Android as the type.

Change drop down to Android

Change the color to white and download the PNG option. Extract the contents of the ZIP file to app/src/main (you should see the ZIP file has a res folder in it).

Now you can use the new icons in your layouts. They’re called things like baseline_add_white_36.

Finally do the same thing for the loop icon, also white.

Create the View XML

You need an XML view for each list item. Place the following into src/main/res/layout/list_item.xml.

<?xml version="1.0" encoding="utf-8"?>
<android.support.v7.widget.CardView
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:card_view="http://schemas.android.com/apk/res-auto"

    android:id="@+id/card_view"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:layout_marginBottom="3dp"
    android:layout_marginLeft="5dp"
    android:layout_marginRight="5dp"
    android:layout_marginTop="5dp"
    android:padding="3dp"
    card_view:cardElevation="2dp"
    card_view:cardMaxElevation="2dp">

    <RelativeLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:padding="5dp">

        <TextView
            android:id="@+id/name"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:padding="5dp"
            android:text="lastname"
            android:textSize="16dp" />

        <TextView
            android:id="@+id/btnDelete"
            android:layout_width="wrap_content"
            android:layout_height="35dp"
            android:layout_alignParentRight="true"
            android:drawableLeft="@android:drawable/ic_delete"
            android:padding="5dp" />

        <TextView
            android:id="@+id/btnEdit"
            android:layout_width="wrap_content"
            android:layout_height="35dp"
            android:layout_marginRight="2dp"
            android:layout_toLeftOf="@+id/btnDelete"
            android:drawableLeft="@android:drawable/ic_menu_edit"
            android:padding="5dp" />
    </RelativeLayout>

</android.support.v7.widget.CardView>

Here you’re using a Card View, which is the popular way of creating lists in Android. Almost all of the XML is layout settings to ensure proper alignment. Note the android:id values, which you use to connect these to your Kotlin files. Also, I’ve used some built-in Android icons for the edit and delete buttons.

Note: this is not the recommended approach, since those icons can change between Android Studio versions - rather download the icons as you did previously!

Now for the main activity XML. Here is what src/main/res/layout/activity_main.xml should look like.

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    xmlns:app="http://schemas.android.com/apk/res-auto">

    <android.support.v7.widget.RecyclerView
        android:id="@+id/rv_item_list"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <android.support.design.widget.FloatingActionButton
        android:id="@+id/fab"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:src="@drawable/baseline_add_white_36"
        android:layout_gravity="bottom|end"
        app:elevation="6dp"
        android:layout_alignParentBottom="true"
        android:layout_alignParentRight="true"
        android:layout_alignParentEnd="true"
        android:layout_margin="20dp"/>

</RelativeLayout>

It’s pretty straightforward. You’ve now got a recycler view and a floating action button inside a relative layout, and have assigned baseline_add_white_36 as the source for the button. Note that the id of the recycler view is rv_item_list (you’ll be using this soon).

Add Refresh to the Action Bar

To fill things out, let’s put a refresh button on the action bar. This requires a new piece of XML in res/menu/buttons.xml:

<menu xmlns:android="http://schemas.android.com/apk/res/android"
      xmlns:app="http://schemas.android.com/apk/res-auto">

    <item
        android:id="@+id/refresh"
        android:icon="@drawable/baseline_loop_white_48"
        android:title="@string/refresh"
        app:showAsAction="ifRoom"/>

</menu>

Note it has an id called refresh. Also, I’ve used the loop icon from the Material Icons site (the white variant) - you’ll have to download this as before. And since the title comes from a string resource, you’ll need to change res/values/strings.xml:

<resources>
    <string name="app_name">Kotlin Crud</string>
    <string name="refresh">Refresh</string>
</resources>

Display Lists in Kotlin

Now to display a list of items using your views. The canonical way of doing this is the relatively new RecyclerView, which supplanted the original ListView. The basic idea of a RecyclerView is to create only enough views to fill the screen - if the screen can fit five items, then only five are created. As you scroll through the list these views are re-used (recycled), with their contents replaced by the appropriate (new) values.

How do you get started with this? The first thing you need is a bean. Let’s call it Movie.kt.

package demo

data class Movie( val id: Int, val name: String )

Note: for all the following classes make sure the package matches that of MainActivity.kt.

Was that not easy? Next, you need an Adapter. This is a class with three methods: one that returns how many items in total are being displayed (getItemCount()), one that creates an Android View control for a particular item (onCreateViewHolder()), and one that populates an existing view with an instance of your data (onBindViewHolder()).

Put this into MovieAdapter.kt.

class MovieAdapter : RecyclerView.Adapter<MovieAdapter.MovieViewHolder>() {

    var movies: ArrayList<Movie> = ArrayList()

    init { refreshMovies() }

    class MovieViewHolder(val view: View) : RecyclerView.ViewHolder(view)

    override fun onCreateViewHolder(parent: ViewGroup,
                                    viewType: Int): MovieAdapter.MovieViewHolder {

        val view = LayoutInflater.from(parent.context)
                .inflate(R.layout.list_item, parent, false)

        return MovieViewHolder(view)
    }

    override fun onBindViewHolder(holder: MovieViewHolder, position: Int) {
        holder.view.name.text = movies[position].name
    }

    override fun getItemCount() = movies.size

    fun refreshMovies() {
        movies.clear()

        movies.add(Movie(0, "Guardians of the Galaxy"))
        movies.add(Movie(1, "Avengers: Infinity War"))
        movies.add(Movie(2,"Thor: Ragnorok"))

        notifyDataSetChanged()
    }
}

When you paste this into Android Studio it will highlight certain things as red. You need to ALT-ENTER (Option + Enter on Mac) several times to pull in the imports you need. Eventually this is the list of imports you should have:

import android.support.v7.widget.RecyclerView
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import kotlinx.android.synthetic.main.list_item.view.*

A lot is going on in MovieAdapter.kt. Besides implementing the three methods required by RecyclerView.Adapter, you created a property called movies (a list) and initialise it in the init {} block. You also declared an inner class called MovieViewHolder. That is what gets instantiated for each view that needs to be displayed (in the example discussed, five views). As you can see, onCreateViewHolder actually returns an object of this type. The class is quite simple - it takes a View into its constructor (which becomes a property) and returns a Holder-type object. This object is what you then use when you fill in data in onBindViewHolder - in our case, setting the text of the display.

This does seem complicated at first. A good way to look at it is to ask two questions: how does this connect to your main code (i.e. MainActivity.kt), and how does it connect to the views you’ve defined in XML?

For the first part, this is what main activity should now look like:

class MainActivity : AppCompatActivity() {

    lateinit var adapter:MovieAdapter

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        adapter = MovieAdapter()    
    
        rv_item_list.layoutManager = LinearLayoutManager(this)
        rv_item_list.adapter = adapter
    }
}

So here you’ve defined the adapter as a lateinit variable - lateinit tells Kotlin that you want to initialize this at some stage after creation. This is not the default in Kotlin classes - normally you have to initialize properties immediately.
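The difference is easy to see in isolation. A quick sketch (the `Holder` class and its members are made up for illustration):

```kotlin
// Properties normally need an initializer at declaration; lateinit
// defers that to some point before first use (here, a setup() method).
class Holder {
    lateinit var label: String

    fun setup() { label = "ready" }

    // ::label.isInitialized lets you test the state before reading it;
    // reading an uninitialized lateinit property throws instead.
    fun state() = if (::label.isInitialized) label else "uninitialized"
}

fun main() {
    val h = Holder()
    check(h.state() == "uninitialized")
    h.setup()
    check(h.state() == "ready")
}
```

This fits Android's lifecycle well: views and adapters can only be created in onCreate(), after the activity object itself already exists.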

In onCreate you assign an instance of your adapter to this property (note you don’t need new in Kotlin) and assign two things to rv_item_list - a LayoutManager (which is used for positioning) and an Adapter (which you’ve just created).

We should talk about rv_item_list. This is just the id of a control inside of activity_main.xml, specifically the recyclerview. Normally you would need to use findViewById (a pain for Android developers) but with Kotlin you can just specify its name. When Android Studio complains about imports and you ALT-ENTER (or your platform equivalent) it will automatically import kotlinx.android.synthetic.main.activity_main.*, bringing in all the ids into the namespace.

Lastly, add the following two functions to MainActivity:

override fun onCreateOptionsMenu(menu: Menu): Boolean {
    val inflater = menuInflater
    inflater.inflate(R.menu.buttons, menu)
    return true
}

override fun onOptionsItemSelected(item: MenuItem) = when (item.itemId) {
    R.id.refresh -> {
        adapter.refreshMovies()
        Toast.makeText(this.baseContext, "Refreshed", Toast.LENGTH_LONG).show()
        true
    }
    else -> {
        super.onOptionsItemSelected(item)
    }
}

That will inflate the menu XML you defined, as well as tie the button to your adapter’s refresh function (and give a convenience toast to say it worked).

That should be it! Run the code and you should see the following.

Completed Android app with movie list

Wiring Your Android + Kotlin App Up

Next you need to replace the hard-coded values with data coming from your API server, as well as wiring the different buttons to their respective API calls. For that you will be using Square’s Retrofit library.

Start by adding the following to your build.gradle dependencies:

implementation 'com.squareup.retrofit2:retrofit:2.3.0'
implementation 'com.squareup.retrofit2:converter-gson:2.3.0'
implementation 'com.squareup.retrofit2:adapter-rxjava2:2.3.0'
implementation 'io.reactivex.rxjava2:rxandroid:2.0.1'
implementation 'com.squareup.okhttp3:logging-interceptor:3.9.1'

Now take a look at what happens when you call your server for a list of movies:

C:\Users\Karl>curl http://localhost:8080/movies
{
  "_embedded" : {
    "movies" : [ {
      "name" : "Skyfall",
      "id" : 1,
      "_links" : {
        "self" : {
          "href" : "http://localhost:8080/movies/1"
        },
        "movie" : {
          "href" : "http://localhost:8080/movies/1"
        }
      }
    }

I’ve only shown one entry since the response is quite long (Spring follows something called HATEOAS, which adds links to JSON responses). As you can see, the response is wrapped in an _embedded object, and your movies come as a list under movies. You need to represent this in your Kotlin model so Retrofit knows what to expect. Change Movie.kt to this:

import com.google.gson.annotations.SerializedName

data class Movie( val id: Int, val name: String )
data class MovieList (
    @SerializedName("movies" )
    val movies: List<Movie>
)
data class MovieEmbedded (
    @SerializedName("_embedded" )
    val list: MovieList
)

Now you need to create a new class to setup Retrofit. Let’s call it MovieApiClient.kt:

import io.reactivex.Completable
import io.reactivex.Observable
import retrofit2.Retrofit
import retrofit2.adapter.rxjava2.RxJava2CallAdapterFactory
import retrofit2.converter.gson.GsonConverterFactory
import retrofit2.http.*

interface MovieApiClient {

    @GET("movies") fun getMovies(): Observable<MovieEmbedded>
    @POST("movies") fun addMovie(@Body movie: Movie): Completable
    @DELETE("movies/{id}") fun deleteMovie(@Path("id") id: Int) : Completable
    @PUT("movies/{id}") fun updateMovie(@Path("id")id: Int, @Body movie: Movie) : Completable

    companion object {

        fun create(): MovieApiClient {

            val retrofit = Retrofit.Builder()
                    .addCallAdapterFactory(RxJava2CallAdapterFactory.create())
                    .addConverterFactory(GsonConverterFactory.create())
                    .baseUrl("http://10.0.2.2:8080/")
                    .build()

            return retrofit.create(MovieApiClient::class.java)
        }
    }
}

Here you define all the endpoints using annotations as well as their expected return types (Completable, part of RxJava, just means nothing is returned). You also declare a companion object (which is like a static class) which instantiates a Retrofit builder with the details of our API. Note the base url uses the IP 10.0.2.2 which allows emulators to connect to localhost.
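The companion-object-as-factory pattern is worth seeing on its own, away from Retrofit. A hedged sketch with made-up names (`ApiConfig` is not part of the app):

```kotlin
// A companion object holds members that belong to the class itself,
// much like statics in Java - handy for factory methods.
class ApiConfig private constructor(val baseUrl: String) {
    companion object {
        // The factory is the only way in, since the constructor is private.
        fun create(host: String = "10.0.2.2", port: Int = 8080): ApiConfig =
            ApiConfig("http://$host:$port/")
    }
}

fun main() {
    // Callers invoke the factory through the class name, no instance needed.
    check(ApiConfig.create().baseUrl == "http://10.0.2.2:8080/")
    check(ApiConfig.create(port = 9090).baseUrl == "http://10.0.2.2:9090/")
}
```

MovieApiClient.create() works the same way: the interface never gets constructed by you directly; the companion object asks Retrofit to build an implementation.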

Now in MovieAdapter, change the class header to include a context property (so you can show toasts with your API results), and add a lazy client property initialized with the create() method you defined previously.

class MovieAdapter(val context: Context) :  RecyclerView.Adapter<MovieAdapter.MovieViewHolder>() {

    val client by lazy { MovieApiClient.create() }
    var movies: ArrayList<Movie> = ArrayList()

Lazy takes in a function (note the curly brackets) and says “when someone first tries to use this property, run this function and assign it”.
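You can watch that happen with a tiny example (illustrative names, not part of the app):

```kotlin
// 'by lazy' runs its lambda once, on first access, then caches the result.
var initCount = 0

class Client {
    val connection: String by lazy {
        initCount++      // side effect so we can observe when the lambda runs
        "connected"      // last expression becomes the cached value
    }
}

fun main() {
    val c = Client()
    check(initCount == 0)               // nothing has run yet
    check(c.connection == "connected")  // first access triggers the lambda
    check(c.connection == "connected")  // cached; lambda does not run again
    check(initCount == 1)
}
```

For MovieAdapter this means the Retrofit client isn't built until the first API call actually needs it.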

To initialize the context, change the adapter initialize statement to include the main activity context:

adapter = MovieAdapter(this.baseContext)

Now change refreshMovies() in the adapter to the following:

fun refreshMovies() {
    client.getMovies()
        .subscribeOn(Schedulers.io())
        .observeOn(AndroidSchedulers.mainThread())
        .subscribe({ result ->
                movies.clear()
                movies.addAll(result.list.movies)
                notifyDataSetChanged()
        },{ error ->
                Toast.makeText(context, "Refresh error: ${error.message}", Toast.LENGTH_LONG).show()
                Log.e("ERRORS", error.message)
        })
}

So you’re using the client’s getMovies() function, which is declared at the top of MovieApiClient.kt. To understand everything going on here is an entire discussion on its own - basically it’s Reactive Programming. In short: subscribeOn(Schedulers.io()) runs the network call on a background IO thread, observeOn(AndroidSchedulers.mainThread()) delivers the result on the UI thread, and subscribe() takes two lambdas - one for success (repopulating the list and notifying the adapter) and one for errors (showing a toast and logging the message).
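Stripped of RxJava and threading, the subscribe pattern boils down to handing success and error callbacks to something that does the work. A toy sketch (this is a stand-in, not RxJava's actual API):

```kotlin
// A toy stand-in for the RxJava subscribe(onNext, onError) pattern.
class Call<T>(private val work: () -> T) {
    fun subscribe(onSuccess: (T) -> Unit, onError: (Throwable) -> Unit) {
        try {
            onSuccess(work())   // run the work and deliver the result
        } catch (e: Throwable) {
            onError(e)          // or deliver the failure instead
        }
    }
}

fun main() {
    var result = ""

    // Success path: the first lambda receives the value.
    Call { listOf("Skyfall", "Spectre") }
        .subscribe({ result = it.joinToString() },
                   { result = "error: ${it.message}" })
    check(result == "Skyfall, Spectre")

    // Error path: the second lambda receives the exception.
    Call<List<String>> { error("boom") }
        .subscribe({ result = it.joinToString() },
                   { result = "error: ${it.message}" })
    check(result == "error: boom")
}
```

The real getMovies() call follows the same shape, with the schedulers deciding which threads the work and the callbacks run on.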

          Math Tutor - Northern Wyoming Community College District - Sheridan, WY      Cache   Translate Page      
Recording contacts and services provided into a program database. Work with College Success/TRIO program participants in an individual setting....
From Northern Wyoming Community College District - Fri, 07 Sep 2018 00:18:34 GMT - View all Sheridan, WY jobs
          Paraprofessional - Library Media Tech - Secondary - 8 hrs/day at SJHS - Sheridan County School Disctrict 2 - Sheridan, WY      Cache   Translate Page      
Assist the library media specialist with database and spreadsheet programs dealing with budget, periodicals, etc....
From Sheridan County School Disctrict 2 - Wed, 22 Aug 2018 03:38:12 GMT - View all Sheridan, WY jobs
          GIS Technician - Wood - Sheridan, WY      Cache   Translate Page      
Update key operational and tabular database. Wood is currently recruiting for a experienced GIS Technician....
From Wood - Thu, 16 Aug 2018 21:46:21 GMT - View all Sheridan, WY jobs
          MDS Coordinator RN - Sheridan Manor - Sheridan, WY      Cache   Translate Page      
Completes electronic submission of required documentation to the State database and other entities per company policy....
From SavaSeniorCare - Sat, 11 Aug 2018 00:21:31 GMT - View all Sheridan, WY jobs
          Social Services Worker - State of Wyoming - Sheridan, WY      Cache   Translate Page      
Computer skills, including word processing and the WYCAPS database. Under close supervision, provides investigative, protective and social service intervention... $19.93 - $24.91 an hour
From State of Wyoming - Fri, 17 Aug 2018 20:50:40 GMT - View all Sheridan, WY jobs
          Accounting Clerk - State of Wyoming - Powell, WY      Cache   Translate Page      
Processes and reviews routine paperwork and /or enters data into a department log or database. Involved in routine tasks associated with A/P, A/R, including,... $12.60 - $15.75 an hour
From State of Wyoming - Sat, 18 Aug 2018 08:50:34 GMT - View all Powell, WY jobs
          Account Executive - HVAC Service Sales - LONG Building Technologies - Cody, WY      Cache   Translate Page      
Maintaining and mining CRM database of good fit locations. Throughout our 50 year history, we have worked hard to make LONG an enjoyable and rewarding place to...
From LONG Building Technologies - Tue, 19 Jun 2018 06:24:48 GMT - View all Cody, WY jobs
          REGISTRATION COORDINATOR - West Park Hospital - Cody, WY      Cache   Translate Page      
If the physician ordering is not in our database, privileges must be verified per “Non-Staff Practitioners Ordering Tests and Procedures” WPH Policies and...
From West Park Hospital - Mon, 09 Jul 2018 21:59:24 GMT - View all Cody, WY jobs
          Preservation Contractor - Silver State Asset Protection - Buffalo, WY      Cache   Translate Page      
All of our contractors are given free accounts to our online database and our mobile app. We are looking for Independent Contractors to complete property...
From Indeed - Tue, 28 Aug 2018 18:47:49 GMT - View all Buffalo, WY jobs
          Escrow & Title Officer - First American - Buffalo, WY      Cache   Translate Page      
Maintain accurate records in the escrow accounting system, customer database and FAST for processing of escrow and procurement of title. Join our team!...
From First American Corporation - Wed, 18 Jul 2018 18:43:42 GMT - View all Buffalo, WY jobs
          Asset Integrity Operator - Devon Energy - Gillette, WY      Cache   Translate Page      
Review surface equipment failure database and analyze data with intent to improve equipment reliability. Qualified applicants are considered without regard to...
From Devon Energy Corporation - Thu, 30 Aug 2018 21:16:27 GMT - View all Gillette, WY jobs
          Social Services Worker - State of Wyoming - Gillette, WY      Cache   Translate Page      
Computer skills, including word processing and the WYCAPS database. Under close supervision, provides investigative, protective and social service intervention... $19.93 - $24.91 an hour
From State of Wyoming - Wed, 29 Aug 2018 02:59:41 GMT - View all Gillette, WY jobs
          Commercial Maintenance Technician - Emcor - Gillette, WY      Cache   Translate Page      
Ability to use and operate handheld device and business systems or document management Database software, including but not limited to Microsoft office, Lotus...
From Emcor - Mon, 13 Aug 2018 23:36:01 GMT - View all Gillette, WY jobs
          America's Promise Grant Enrollment Services Counselor - Northern Wyoming Community College District - Gillette, WY      Cache   Translate Page      
Understanding of integrated database systems. The America’s Promise (AP) Grant Enrollment Services Counselor is a two year, grant funded position that is... $35,420 - $53,130 a year
From Northern Wyoming Community College District - Sat, 11 Aug 2018 00:18:49 GMT - View all Gillette, WY jobs
          Preservation Contractor - Silver State Asset Protection - Gillette, WY      Cache   Translate Page      
All of our contractors are given free accounts to our online database and our mobile app. We are looking for Independent Contractors to complete property...
From Indeed - Tue, 28 Aug 2018 18:47:13 GMT - View all Gillette, WY jobs
          Administrative Assistant - SBW & Associates, PC - Worland, WY      Cache   Translate Page      
Use computers for various applications, such as database management, word processing and processing of tax returns....
From Indeed - Tue, 04 Sep 2018 18:24:29 GMT - View all Worland, WY jobs
          Patient Financial Services Representative - Banner Health - Worland, WY      Cache   Translate Page      
Strong knowledge in the use of common office software, word processing, spreadsheet, and database software are required....
From Banner Health - Thu, 21 Jun 2018 07:16:08 GMT - View all Worland, WY jobs
          Social Services Worker - State of Wyoming - Newcastle, WY      Cache   Translate Page      
Computer skills, including word processing and the WYCAPS database. Under close supervision, provides investigative, protective and social service intervention... $19.93 - $24.91 an hour
From State of Wyoming - Wed, 29 Aug 2018 02:59:42 GMT - View all Newcastle, WY jobs
          Mine IT Specialist (Mid/Senior) - Point of Rocks, WY - PacifiCorp - Point of Rocks, WY      Cache   Translate Page      
General Purpose Support mine IT requirements including desktop and server support, database and security administration, end-user application troubleshooting...
From Pacificorp - Wed, 25 Jul 2018 14:30:09 GMT - View all Point of Rocks, WY jobs
          Software Engineer - Influitive Corporation - Toronto, ON      Cache   Translate Page      
We have a service-oriented Ruby on Rails stack, backed by PostgreSQL and NoSQL databases. Influitive is a rapidly growing Software-as-a-Service (SaaS) company...
From Influitive Corporation - Tue, 28 Aug 2018 17:53:42 GMT - View all Toronto, ON jobs
          HARRIS HILL: Database Project Manager      Cache   Translate Page      
HARRIS HILL: An international charity are looking for a Project Manager to oversee the procurement, development and implementation of a new Database for the Programmes Team for a 12 month contract. The project manager will oversee projects to ensure the desired result London (Greater)
          Compensation Customer Service Administrator / Agente au service à la clientèle - rémunération - Financial Horizons Group - Montréal, QC      Cache   Translate Page      
Enter MGA commissions in required databases. Entrer les commissions MGA dans les bases de données requises. As an MGA, Financial Horizons Group is appointed by...
From Financial Horizons Group - Fri, 31 Aug 2018 07:33:21 GMT - View all Montréal, QC jobs
          Inventory Coordinator      Cache   Translate Page      
OH-Strasburg, SGF Global is looking for a Inventory Coordinator in Strasburg, OH. Duties & Requirements: Data entry in database systems for inventory management. Ordering product and communicating with suppliers. Analyzing product requirements and distribution plans to multiple locations. Must have good communication and organizational skills. Heavy use of Microsoft Outlook, Excel, and Access programs along wit
          Sales & Catering Coordinator (3 month term) - DoubleTree by Hilton West Edmonton - Edmonton, AB      Cache   Translate Page      
Assist the Director of Sales &amp; Marketing in strategic planning of budgets and programs including advertising, public/media relations, database/direct marketing,...
From Indeed - Wed, 12 Sep 2018 21:35:38 GMT - View all Edmonton, AB jobs
          Bilingual Administrative Assistant      Cache   Translate Page      
FL-Miami, RESPONSIBILITIES: Kforce has a client that is seeking a Bilingual Administrative Assistant in Miami, Florida (FL). Responsibilities include: Maintain project database to track project status and ensure all information is current Monitor regularly for completeness and accuracy and update as necessary Organizing electronic and hard project documents Track commitment dates and produce and maintain tr
          Oracle RAC Database Administrator - McLane Advanced Technologies - Fort Lee, VA      Cache   Translate Page      
The Oracle DBA will support day to day database operations of the Property Book Unit Supply Enhanced (PBUSE) system. Specifically, managing the back-end...
From McLane Advanced Technologies - Mon, 13 Aug 2018 06:31:19 GMT - View all Fort Lee, VA jobs
          Administrative Assistant - DST Consulting Engineers - Greater Sudbury, ON      Cache   Translate Page      
Establishing and maintaining filing systems and databases, managing schedules, booking meetings and travel, maintaining and ordering office supplies, shipping... $30,000 - $40,000 a year
From DST Consulting Engineers - Wed, 12 Sep 2018 21:39:26 GMT - View all Greater Sudbury, ON jobs
          SMS Marketing Software Market By Top Companies Like Target Everyone, TextMagic, SendPulse, Teckst, CallHub, Teradata, SimplyCast, Appointment Reminder and Forecast 2023      Cache   Translate Page      
SMS Marketing Software Market By Top Companies Like Target Everyone, TextMagic, SendPulse, Teckst, CallHub, Teradata, SimplyCast, Appointment Reminder and Forecast 2023 Research N Reports has announced the addition of a market intelligence report, titled “Global SMS Marketing Software Sales Market Report 2018,” to its database. The report serves as a professional study encompassing all the important aspects of the global market.

          Sr. Software Development Engineer - Amazon.com - Seattle, WA      Cache   Translate Page      
Experience with database systems internals, query optimization, and storage systems. Amazon Web Services (AWS) is the pioneer and recognized leader in Cloud...
From Amazon.com - Wed, 06 Jun 2018 01:19:35 GMT - View all Seattle, WA jobs
          Help Desk      Cache   Translate Page      
KY-Elizabethtown, Job Description Responsibilities: Answering incoming calls daily in a polite and professional manner. Enter "ALL" inquiries into the HEAT database (Voicemail, Assigned Calls, Email). Provide assistance to other Helpdesk Technicians as needed. Provide support and troubleshooting for customers with Internet access, email, cable TV, and telephone problems, by analyzing, diagnosing and resolving probl
          Build me a excel sheet, for reports      Cache   Translate Page      
A database where I can input all the details of my customers and print a report out every time I need to send them a report (Budget: ₹600 - ₹1500 INR, Jobs: Database Programming, Excel, Microsoft Access, PHP, SQL)
          Update: eMerchant Gateway POS (Business)      Cache   Translate Page      

eMerchant Gateway POS 2.1.20


Device: iOS Universal
Category: Business
Price: Free, Version: 2.1.12 -> 2.1.20 (iTunes)

Description:

eMerchant Inc lets you transform any Apple device into a complete checkout tool with diverse payment options, security features, and inventory management tools. Download the app today to start selling products on the go, capture payments in a retail store, or process payments securely anywhere in between.

This application allows you to safely and securely process cash, credit card, gift card and check transactions from your device and manage inventory directly from a built-in database. Paired with compatible card swipers and Bluetooth receipt printers, it accepts secure payments for any type of business. Please note a merchant account is required to use this application.

Application Features:
• Card Swiper, Barcode Scanner, and Receipt Printer Compatibility. Securely swipe credit cards with the PaySaber Jack, PaySaber Clip, PaySaber Jr., PaySaber Go and Infinite Peripherals card readers. Wirelessly print receipts and scan barcodes with the WSP-R240i Bluetooth printer or hook up to a Star TSP143 LAN printer.
• Cash Transactions Cash is one of many acceptable payment methods within this application, and cash transactions are recorded for transaction history reporting.
• Customer Management and Payments Create and manage customer accounts including preferred payment method for quick, easy and secure transactions for return customers.
• Split Orders This application supports and records orders even when split payment types are used to ensure accurate transaction history reports.
• Product Management and Inventory Control Load your product database into the application to manage inventory and generate reports.
• Order and Transaction History, Batch Management View a detailed report of orders and transactions processed on your merchant account.
• Permissions Permission options allow you to set different standards of security for users or employees utilizing your point of sale system.
• Quick Sales/Refund Against Previous Transactions Using Tokenization Quickly and securely process a quick sale or refund using the option of free tokenization.
• Supports contactless payments such as Apple Pay, Samsung Pay, and Android Pay.
• Supports EMV payments through devices such as Ingenico ICMP and Castles MP200.


CUSTOMER SERVICE SUPPORT:
• Call +1 (866) 979-0260 for Free Customer Support
• Please visit https://www.emerchant.com for More Information
• Email support@emerchant.com for email support.

What's New

New Feature:

-- Added the ability to request the order quantity of a product
-- Added the ability to set the timeout duration for the MP200

Thank you for using eMerchant. We update our app regularly to make sure it works best for you.

eMerchant Gateway POS


          Service Desk Agent - Home Trust Company - Toronto, ON      Cache   Translate Page      
Become familiar with the Systems, Network, Database, Desktop Engineers, programmers, developers, and each team in the IT Department....
From Home Trust Company - Wed, 29 Aug 2018 20:33:28 GMT - View all Toronto, ON jobs
          New Issue: Business and Human Rights Journal      Cache   Translate Page      
The latest issue of the Business and Human Rights Journal (Vol. 3, no. 2, July 2018) is out. Contents include:
  • Articles
    • Tori Loven Kirkebø & Malcolm Langford, The Commitment Curve: Global Regulation of Business and Human Rights
    • Valentina Azarova, Business and Human Rights in Occupied Territory: The UN Database of Business Active in Israel’s Settlements
    • Alejo José G Sison, Virtue Ethics and Natural Law Responses to Human Rights Quandaries in Business
    • Stephen Kim Park, Social Bonds for Sustainable Development: A Human Rights Perspective on Impact Investing
  • Developments in the Field
    • Daniel Iglesias Márquez & Maria Prandi, How the Business Debate Influenced (or not) the Conflict Between Catalonia and Spain
    • Arvind Ganesan, Business and Human Rights during the Trump Era
    • Dan Bross, Fabrice Houdart, & Salil Tripathi, None of their Business? How the United Nations is Calling on Global Companies to Lead the Way on Human Rights of LGBTI people
    • Doug Cassel, The Third Session of the UN Intergovernmental Working Group on a Business and Human Rights Treaty
    • Sanyu Awori, Felogene Anumo, Denisse Cordova Montes, & Layla Hughes, A Feminist Approach to the Binding Instrument on Transnational Corporations and other Business Enterprises
    • Michel Yoboué & Jonathan Kaufman, Inside the Dirty Fuels Campaign: Lessons for Business and Human Rights

          Clerical Assistant - University of Saskatchewan - Saskatoon, SK      Cache   Translate Page      
Database software, proficiency with the Student Information System (Banner), relationship management system (RMS – RECRUIT), SiRIUS, Cisco Phone, PAWS... $21.36 - $28.84 an hour
From University of Saskatchewan - Wed, 12 Sep 2018 18:18:53 GMT - View all Saskatoon, SK jobs
          Sales & Catering Coordinator (3 month term) - DoubleTree by Hilton West Edmonton - Edmonton, AB      Cache   Translate Page      
Assist the Director of Sales & Marketing in strategic planning of budgets and programs including advertising, public/media relations, database/direct marketing,...
From Indeed - Wed, 12 Sep 2018 21:35:38 GMT - View all Edmonton, AB jobs
          Major DNA service developments from MyHeritage      Cache   Translate Page      
New DNA announcements from MyHeritage (www.myheritage.com):



Announcement 1:

MyHeritage supports 23andMe V5 and Living DNA uploads

If you’ve tested your DNA already, we have good news for you, please read on. If you haven’t taken a DNA test yet, we invite you to check out the MyHeritage DNA kit which is now offered at a very affordable price.

Since 2016, MyHeritage has allowed users who have tested their DNA already to upload their DNA data from Ancestry, 23andMe and Family Tree DNA, providing DNA matches and ethnicity estimates on MyHeritage for free.

However, previously MyHeritage did not support the upload of tests based on the chip called GSA (Global Screening Array), that is used by 23andMe (V5), and by Living DNA. Recent improvements to our DNA algorithms allow us to support DNA data processed on GSA chips, and so we’re happy to update you that MyHeritage now supports 23andMe V5 and Living DNA data uploads, in addition to data uploads from all major DNA testing services, including Ancestry, 23andMe (up to V5) and Family Tree DNA (Family Finder).

Upload your DNA data to MyHeritage now — it’s fast and simple. If you upload now, you will get full access to DNA Matching, Ethnicity Estimates, our industry-leading chromosome browser, and more, for FREE.

If you manage additional DNA kits for some of your relatives, and you have their permission, upload their DNA data too, and MyHeritage will let you associate the data with the respective individuals on your family tree.

As of December 1st 2018, our DNA upload policy will change: DNA Matching will remain free for uploaded DNA data, but unlocking additional DNA features will require an extra payment for DNA files uploaded after this date. All DNA data that was uploaded to MyHeritage in the past, and all DNA data that is uploaded now and prior to December 1, 2018 will continue to enjoy full access to all DNA features for free. These uploads will be grandfathered in and will remain free.

So it’s a great idea to upload DNA files for any kits you have as soon as possible. You’ll get the following benefits:

DNA Matches
DNA Matches are other users on MyHeritage from all around the world who are likely to be your relatives based on shared DNA. MyHeritage has a very strong user base in Europe so you are likely to get more DNA Matches from Europe than on any other DNA service. This is very useful if you have ancestors from Europe.

Ethnicity Estimate
A percent breakdown of your ethnic background from among 42 ethnicity regions. You’ll learn which places your ancestors came from.

Chromosome Browser
Helps you understand how you’re related to your DNA Matches by identifying DNA segments that you share with them. MyHeritage’s Chromosome Browser is considered by many experts to be the best in the industry.
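As a rough illustration of what a chromosome browser's segment matching does (this is not MyHeritage's actual algorithm; the function and the interval values below are hypothetical toy data), the core idea is intersecting two people's DNA segment lists on the same chromosome:

```python
# Toy sketch: intersect two sorted lists of (start, end) segments on one
# chromosome. Real matching tools work on phased genotype data and report
# segment lengths in centimorgans; positions here are arbitrary numbers.

def shared_segments(a, b):
    """Return the overlapping regions of two sorted (start, end) segment lists."""
    overlaps = []
    i = j = 0
    while i < len(a) and j < len(b):
        start = max(a[i][0], b[j][0])
        end = min(a[i][1], b[j][1])
        if start < end:
            overlaps.append((start, end))
        # Advance whichever segment ends first.
        if a[i][1] < b[j][1]:
            i += 1
        else:
            j += 1
    return overlaps

print(shared_segments([(0, 50), (80, 120)], [(40, 90)]))  # → [(40, 50), (80, 90)]
```

Longer or more numerous shared segments generally indicate a closer relationship, which is what a chromosome browser visualizes.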

After uploading, your DNA data will be kept private and secure, and our DNA service terms are the friendliest in the industry. You remain the owner of your DNA data — not us — and you can delete your DNA data at any time.

So don't delay, and upload your DNA data to MyHeritage now, while all the DNA features are free (and they will remain free for you). If you have tested with 23andMe (any version including V5) or Living DNA, you're in luck, and you can now upload this data to MyHeritage too. You can also upload DNA data from Ancestry and Family Tree DNA’s Family Finder test. Instructions for exporting your data and uploading it to MyHeritage are provided on our upload page.

P.S. We are currently processing the large backlog of 23andMe V5 kits that have been uploaded to us in the past, and their results will be rolled out to the users gradually within the next few days.

Full blog post: https://blog.myheritage.com/2018/09/new-myheritage-supports-23andme-v5-and-living-dna-uploads


Announcement 2:

MyHeritage Partners with British Retailer WHSmith to Distribute DNA Kits

Tel Aviv, Israel & London, United Kingdom, September 7, 2018 — MyHeritage, Europe’s leading service for DNA testing and family history, announced today the launch of a retail partnership with WHSmith. This marks the first partnership of its kind for MyHeritage in the UK, and the first time that MyHeritage DNA tests will be available for purchase in retail stores in Europe.

Under the new partnership WHSmith distributes a unique product named MyHeritage Family History Discovery Kit, which bundles MyHeritage’s popular at-home DNA test with 3 months of access to MyHeritage’s suite of premium online genealogy services. This allows consumers to receive detailed ethnicity reports and connect with their relatives around the world through the power of DNA testing, and to utilize MyHeritage’s 9-billion-strong collection of historical records and family tree tools to embark on a journey to uncover their family history.

The distribution of the kits via local retail stores caters to the surging demand for at-home DNA testing throughout Europe, and in the UK in particular. The affordable price of the MyHeritage Family History Discovery Kit available through WHSmith, £89, makes it an ideal gift for the Christmas season ahead.

The MyHeritage DNA test is notable for its ease of use. It involves a simple 2-minute cheek swab. In addition to the DNA test, the Family History Discovery Kit comes with 3 months of access to MyHeritage’s Complete plan, which includes all family tree features and historical records on MyHeritage, seamlessly integrated with the DNA test results.

“Interest in DNA testing and family history research in the UK market has skyrocketed lately,” said Akiva Glasenberg, MyHeritage’s Business Development Manager. “We have created a unique bundled product to satisfy this need and are pleased to offer it to UK consumers through selected WHSmith High Street stores. Customers can look forward to discovering their ethnic origins and family history and making use of MyHeritage’s vast DNA database and historical record collections to make new connections with their relatives in the UK and overseas.”

The MyHeritage Family History Discovery Kits are on sale in 200 WHSmith High Street stores, as well as online via www.whsmith.co.uk.

(With thanks to Daniel Horowitz)

Chris

For my genealogy guide books, visit http://britishgenes.blogspot.co.uk/p/my-books.html, whilst details of my research service are at www.ScotlandsGreatestStory.co.uk. Further content is also published daily on The GENES Blog Facebook page at www.facebook.com/BritishGENES.
          Episode 246 - South Central US outage discussion      Cache   Translate Page      

Last week the South Central US datacenter experienced a significant outage which resulted in many Azure services and customers being impacted. Kendall, Evan and Sujit break down the outage and try to understand how Microsoft and its customers can be better prepared for such unplanned events.

Media file: https://azpodcast.blob.core.windows.net/episodes/Episode246.mp3

Preliminary RCA: https://azure.microsoft.com/en-us/status/history/

 

Other updates:

Azure Stack is now integrated with the Azure Government cloud, enabling connections to Azure Government identity, subscription, registration, billing, backup/DR, and Azure Marketplace. Azure Stack unlocks a wide range of hybrid cloud use cases for government customers, such as tactical edge and regulatory scenarios.


Now in preview, you can migrate PostgreSQL databases to Azure Database for PostgreSQL with minimal downtime by using the Azure Database Migration Service (DMS). Use the Azure CLI to provision an instance of the DMS service to perform migrations from PostgreSQL on-premises or on virtual machines to Azure Database for PostgreSQL.


Azure SQL Data Warehouse Gen2 is now available in government cloud
Azure SQL Data Warehouse is a fast, flexible and secure analytics platform. The Compute Optimized Gen2 tier of Azure SQL Data Warehouse is now available in the US Government cloud. We recently made the service available in the US Government Virginia and US Government Arizona regions. The Compute Optimized Gen2 tier, using adaptive caching and instant data movement, brings at least 5x better performance for all our customers compared to the previous generation. To find out more, go to Azure.com/sqldw.

Azure DevOps


The single service that was Visual Studio Team Services (VSTS) is now becoming a new set of Azure DevOps services. Throughout our documentation and websites, and in the product, you'll start to notice new icons and names for Azure DevOps and each of the services within it:
 · Azure Pipelines to continuously build, test, and deploy to any platform and cloud.
 · Azure Boards for powerful work management.
 · Azure Artifacts for Maven, npm, and NuGet package feeds.
 · Azure Repos for unlimited cloud-hosted private Git repos.
 · Azure Test Plans for planned and exploratory testing.
With the launch of Azure Pipelines, we've introduced a new app to the GitHub Marketplace, refreshed a number of the experiences to help you get started, and begun to offer unlimited CI/CD minutes and 10 parallel jobs for open-source projects.
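For readers new to Azure Pipelines, a pipeline is defined by an azure-pipelines.yml file at the repository root. The following is an illustrative sketch only; the trigger branch, image name, and script contents are assumptions, so check the current Azure Pipelines documentation for supported pool images:

```yaml
# Illustrative azure-pipelines.yml sketch; branch and vmImage are assumptions.
trigger:
- master

pool:
  vmImage: 'ubuntu-16.04'

steps:
- script: echo "Restoring and building..."
  displayName: Build
- script: echo "Running tests..."
  displayName: Test
```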

 


          Senior Sales Engineer, Northwest - neo4j.com - Seattle, WA      Cache   Translate Page      
Neo4j is the world’s leading graph database and a proven data management solution for highly connected data. Its native graph architecture is well-suited to...
From neo4j.com - Tue, 26 Jun 2018 02:08:38 GMT - View all Seattle, WA jobs
          Reply To: Saving a starting point for future sites      Cache   Translate Page      

Set up one more, but stop at the point where you start getting specific. Then export the database. You can then import that for the next one.
There are plugins for One Click Demo import.
I think there are also some for cloning your setup. (like a site template)


          Solution Architect - Data & Analytics - Neudesic LLC - Seattle, WA      Cache   Translate Page      
Azure(SQL Database, DocumentDB, SQL Data Warehouse, Table Storage, Redis Cache, Database Migration Wizard, HDInsight, Data Factory, Stream Analytics, Data Lake...
From Neudesic LLC - Mon, 02 Jul 2018 10:04:49 GMT - View all Seattle, WA jobs
          Blockchain per le assicurazioni: innovazione e rivoluzione      Cache   Translate Page      

But let's try to explain what blockchain means: distributed databases with no central authority, in which information and data are collected and then made available to those who need to use or acquire them. The debate over their use is decidedly heated; many observers critical of the "chain of blocks" believe that this technology […]

The article Blockchain per le assicurazioni: innovazione e rivoluzione appeared first on https://www.6sicuro.it

Follow us on Google+


          Escrow & Title Officer - First American - Buffalo, WY      Cache   Translate Page      
Maintain accurate records in the escrow accounting system, customer database and FAST for processing of escrow and procurement of title....
From First American Corporation - Wed, 18 Jul 2018 18:43:42 GMT - View all Buffalo, WY jobs
          Database Engineer - SQL, MongoDB      Cache   Translate Page      
TX-Dallas, RESPONSIBILITIES: Kforce has a client in search of a Database Engineer in Dallas, Texas (TX). Summary: The Database Engineer must be experienced with multiple databases and have prior experience as a DBA. They are expected to manage the databases of SQL Server and MongoDB. Experience with the Aurora database is a plus. Responsibilities: Design and develop solutions in MongoDB, MS SQL Server or Aurora Architect and b
          Trump OKs sanctions for foreigners who meddle in elections      Cache   Translate Page      

WASHINGTON — President Donald Trump signed an executive order Wednesday authorizing sanctions against foreigners who meddle in U.S. elections, acting amid criticism that he has not taken election security seriously enough.

"We felt it was important to demonstrate the president has taken command of this issue, that it's something he cares deeply about — that the integrity of our elections and our constitutional process are a high priority to him," said national security adviser John Bolton.

In the order, the president declared a national emergency, an action required under sanctions authority, to deal with the threat of foreign meddling in U.S. elections.

The order calls for sanctioning any individual, company or country that interferes with campaign infrastructure, such as voter registration databases, voting machines and equipment used for tabulating or transmitting results. It also authorizes sanctions for engaging in covert, fraudulent or deceptive activities, such as distributing disinformation or propaganda, to influence or undermine confidence in U.S. elections.

It requires the national intelligence director to make regular assessments about foreign interference and asks the Homeland Security and Justice departments to submit reports on meddling in campaign-related infrastructure. It also lays out how the Treasury and State departments will recommend what sanctions to impose.

With the midterm elections now two months away, National Intelligence Director Dan Coats said the U.S. is not currently seeing the intensity of Russian intervention that was experienced in 2016, but he didn't rule it out. He said the U.S. is also worried about the cyber activities of China, North Korea and Iran.

Coats said Trump's order directs intelligence agencies to conduct an assessment within 45 days after an election to report any meddling to the attorney general and Department of Homeland Security. The attorney general and Department of Homeland Security then have another 45 days to assess whether sanctions should be imposed.

"This clearly is a process put in place to try to assure that we are doing every possible thing we can, first of all, to prevent any interference with our elections, to report on anything we see between now and the election, but then to do a full assessment after the election to assure the American people just exactly what may have happened or may not have happened," Coats said.

Sen. Marco Rubio, R-Fla., and Sen. Chris Van Hollen, D-Md., are pushing a bill that would prohibit foreign governments from purchasing election ads, using social media to spread false information or disrupting election infrastructure. They said Trump's order recognizes the threat, but doesn't go far enough.

The order gives the executive branch the discretion to impose sanctions for election meddling, but the bill would spell out sanctions on key economic sectors of a country that interferes. Those backing the legislation say that under the bill, a nation would know exactly what it would face if caught.

Virginia Sen. Mark Warner, ranking Democrat on the Senate intelligence committee, said the order leaves the president with broad discretion to decide whether to impose tough sanctions. "Unfortunately, President Trump demonstrated in Helsinki and elsewhere that he simply cannot be counted upon to stand up to (Russian President Vladimir) Putin when it matters," said Warner, who is sponsoring the bill.

At a July 16 news conference with Putin in Helsinki, Trump was asked if he would denounce what happened in 2016 and warn Putin never to do it again. Trump did not directly answer the question. Instead, he delivered a rambling response, including demands for investigation of Hillary Clinton's email server and his description of Putin's "extremely strong and powerful" denial of meddling.

That drew outrage from both Republicans and Democrats.

Trump has pushed back, saying that no other American president has been as tough on Russia. He has cited U.S. sanctions and the expulsion of alleged Russian spies from the U.S.

Mike Rogers, former director of the National Security Agency, said he thought Trump missed an opportunity in Helsinki to publicly scold Russia for meddling. Rogers said when he used to talk to Trump about the issue, Trump would often respond to him, saying "Mike, you know, I'm in a different place."

Rogers said he would tell Trump: "Mr. President, I understand that, but I'm paid by the citizens of the nation to tell you what we think. Sir, this is not about politics, it's not about parties. It's about a foreign state that is attempting to subvert the very tenets of our structure."

In his first public comments since he retired in June, Rogers said: "That should concern us as citizens. That should concern us as leaders. And if we don't do something, they (the Russians) are not going to stop."

Rogers, who spoke Tuesday night at the Hayden Center at George Mason University in Virginia, also said earlier media reports claiming Trump had asked him to publicly deny any collusion between Moscow and Trump's campaign were inaccurate.

James Clapper, the former national intelligence director who appeared with Rogers and other former intelligence officials, said he personally believes that the Russian interference did influence the outcome of the 2016 election, but didn't elaborate.

"The Russians are still at it. They are committed to undermining our system," Clapper said. "One of the things that really disturbs me is — that for whatever reason, I don't know what it is — the president's failure to dime out Putin and dime out the Russians for what they are doing."

Author(s): 

Articles

Blog Posts

78d8740c9ff045c79667f257c46b646d.jpg

President Donald Trump talks about Hurricane Florence following a briefing in the Oval Office of the White House in Washington, Tuesday, Sept. 11, 2018. (AP Photo/Susan Walsh)
Source: 
AP

          First clinical symptom as a prognostic factor in systemic sclerosis: results of a retrospective nationwide cohort study      Cache   Translate Page      
Rubio-Rivas M., Corbella X., Pestaña-Fernández M., Tolosa-Vilella C., Guillen-del Castillo A., Colunga-Argüelles D., Trapiella-Martínez L., Iniesta-Arandia N., Castillo-Palma M.J., Sáez-Comet L., Egurbide-Arberas M.V., Ortego-Centeno N., Freire M., Vargas-Hitos J.A., Ríos-Blanco J.J., Todolí-Parra J.A., Rodríguez-Carballeira M., Marín-Ballvé A., Segovia-Alonso P., Pla-Salas X., Madroñero-Vuelta A.B., Ruiz-Muñoz M., Fonollosa-Pla V., Simeón-Aznar C.P., Callejas Moraga E., Calvo E., Carbonell C., Castillo M.J., Chamorro A.J., Colunga D., Corbella X., Egurbide M.V., Espinosa G., Fonollosa V., Freire M., García Hernández F.J., González León R., Guillén del Castillo A., Iniesta N., Lorenzo R., Madroñero A.B., Marí B., Marín A., Ortego-Centeno N., Pérez Conesa M., Pestaña M., Pla X., Ríos Blanco J.J., Rodríguez Carballeira M., Rubio Rivas M., Ruiz Muñoz M., Sáez Comet L., Segovia P., Simeón C.P., Soto A., Tarí E., Todolí J.A., Tolosa C., Trapiella L., Vargas Hitos J.A., Verdejo G.
Clinical Rheumatology 2018 37:4 (999-1009)
Embase MEDLINE

The objective of the study is to determine the importance of the mode of onset as prognostic factor in systemic sclerosis (SSc). Data were collected from the Spanish Scleroderma Registry (RESCLE), a nationwide retrospective multicenter database created in 2006. As first symptom, we included Raynaud’s phenomenon (RP), cutaneous sclerosis, arthralgia/arthritis, puffy hands, interstitial lung disease (ILD), pulmonary arterial hypertension (PAH), and digestive hypomotility. A total of 1625 patients were recruited. One thousand three hundred forty-two patients (83%) presented with RP as first symptom and 283 patients (17%) did not. Survival from first symptom in those patients with RP mode of onset was higher at any time than those with onset as non-Raynaud’s phenomenon: 97 vs. 90% at 5 years, 93 vs. 82% at 10 years, 83 vs. 62% at 20 years, and 71 vs. 50% at 30 years (p < 0.001). In multivariate analysis, factors related to mortality were older age at onset, male gender, dcSSc subset, ILD, PAH, scleroderma renal crisis (SRC), heart involvement, and the mode of onset with non-Raynaud’s phenomenon, especially in the form of puffy hands or pulmonary involvement. The mode of onset should be considered an independent prognostic factor in systemic sclerosis and, in particular, patients who initially present with non-Raynaud’s phenomenon may be considered of poor prognosis.


          Database Administrator - iQmetrix - Winnipeg, MB      Cache   Translate Page      
Flexibility and the ability to adapt to an evolving environment will go a long way at iQmetrix. IQmetrix has rated among the Top 50 Best Small & Medium...
From iQmetrix - Wed, 25 Jul 2018 22:31:15 GMT - View all Winnipeg, MB jobs
          Database Administrator - iQmetrix - Regina, SK      Cache   Translate Page      
Last year’s iQmetrix Odyssey trip. Top 10 Reasons to Join iQmetrix (view the full list). Our YouTube channel including numerous videos on iQmetrix and what we...
From iQmetrix - Wed, 25 Jul 2018 16:31:10 GMT - View all Regina, SK jobs
          Know about Global Background Music Market Research Report with Outlook, Strategies, Challenges, Geography Trends & Growth, Applications and Forecast 2025      Cache   Translate Page      
(EMAILWIRE.COM, September 13, 2018) “Global Background Music Consumption Market Report 2018-2023” has been newly added to the Researchformarkets.com database. This report covers leading company profiles with information such as business overview, regional analysis, consumption, revenue and specification. Background...
          Forum Post: Remote client db / appserver connection      Cache   Translate Page      
I have a point-of-sale application (GUI for .NET) that I install for my customers via GoToAssist. All of my multi-user installs (database, application server, client networking licenses) have been configured to run on private LANs. My connection string to the “server” from each client is: AppService ShopAS -DirectConnect -S 18683 -H -sessionModel Session-free. Everything works great. My most recent customer has a shop where I set up his database and AppServer along with the application code. The ubroker.properties entry for ShopAS is:

[UBroker.AS.ShopAS]
appserviceNameList=ShopAS
autoStart=1
brkrDebuggerKeyAlias=default_server
brkrLogAppend=0
brokerLogFile=C:\GSS\Shop\server\log\gsBroker.log
controllingNameServer=NS1
environment=ShopAS
keyAlias=default_server
mqBrokerLogFile=@{WorkPath}\ShopAS.mqbroker.log
mqServerLogFile=@{WorkPath}\ShopAS.mqserver.log
operatingMode=State-free
portNumber=18683
PROPATH=.,.\shared,.\server
registerNameServer=1
srvrLogAppend=0
srvrLogFile=c:\gss\Shop\server\log\gsServer.log
srvrStartupParam=-pf c:\gss\Shop\server\pf\pos-as.pf
srvrStartupProc=services\startserver.p
srvrStartupProcParam=-T c:\GSS\Shop\wrk -rereadnolock
uuid=2158bfd799119933:-216f99cd:14861ef27da:-7e41
workDir=c:\GSS\Shop\bin

My customer insists on a remote client connection. The “server” is in Nevada and the “client”, which I haven’t set up yet, is in California. All the data that I send from/to my database is in ProDataSets. I’ve never dealt with remote clients. Do I set up a VPN? Does the “server” need a static IP address that I add to my connection string? I really appreciate any help you can give.
          [Free] 2018(Aug) Ensurepass VMware 2V0-622 Dumps with VCE and PDF 101-110      Cache   Translate Page      
Ensurepass.com : Ensure you pass the IT Exams 2018 Aug VMware Official New Released 2V0-622100% Free Download! 100% Pass Guaranteed! VMware Certified Professional 6.5 – Data Center Virtualization Question No: 101 An administrator must change the statistics level for short-term performance monitoring and wants to collect metrics for all counters, excluding minimum and maximum rollup values. What would be the statistics level? Level 3 Level 1 Level 2 Level 4 Answer: A Question No: 102 Which two statements are true regarding Auto Deploy in version 6.5? (Choose two.) Auto Deploy can be configured and managed via PowerCLI. Auto Deploy can be configured and managed via the vSphere Client. Auto Deploy can be configured and managed via the vSphere Web Client. Auto Deploy can be configured and managed via vCLI commands. Answer: A,C Question No: 103 What new feature was introduced to the Content Library in vSphere 6.5? Mount an ISO directly from the Content Library. Deploy a virtual machine from an OVF Package in a Content Library. Upload a File from a URL to a Library Item. Upload a File from a Local System to a Library Item. Answer: A Explanation: Explanation WMware vSphere 6.5 enhances the Content Library with three new options: mount an ISO from the content library, update existing templates and apply guest OS customization specifications during a VM deployment. These new features help plug holes in and expand other capabilities introduced in the original version. The first of these capabilities – mounting an ISO from the Content Library is useful because it eliminates the need to store ISO images on a data store but keeps the images readily accessible. Reference http://searchvmware.techtarget.com/answer/How-does-VMware-vSphere-65- improve-its-Content-Library Question No: 104 The administrator must back up a vCenter HA deployment. Which component must be backed up? 
Passive node Witness node External database Active node Answer: D Explanation: When taking backups, avoid backing up the Passive and Witness nodes. Just take a backup of the Active node. When restoring an Active node, you must remove all the cluster configuration, restore and re-create the HA cluster. Question No: 105 "OneAppServer" is a VM template stored in a content library named "LibraryOne", but the vSphere administrator is not able to use this template for deployment. Why is the administrator unable to deploy OneAppServer? OneAppServer must be updated before it can be deployed. LibraryOne is a subscribed library and OneAppServer is not downloaded yet. LibraryOne is published and optimized for syncing over HTTP. OneAppServer was imported from a local file on the system. Answer: C Explanation: The administrator is unable to deploy OneAppServer because LibraryOne is published and optimized for syncing over HTTP. If syncing is enabled, you cannot use this template for deployment. Question No: 106 Which two configurable options are available in Boot Options for a virtual machine? (Choose two.) Tools Upgrades Encryption Firmware Force BIOS setup Answer: C,D Explanation: Reference https://pubs.vmware.com/vsphere-4-esx-vcenter/index.jsp?topic=/com.vmware.vsphere.vmadmin.doc_41/vsp_vm_guide/configuring_virtual_machines/t_configure_the_boot_options.html Question No: 107 When enabling Storage DRS on a datastore cluster, which three components are enabled as a result? (Choose... Read More
          [Free] 2018(Aug) Ensurepass VMware 2V0-622 Dumps with VCE and PDF 81-90
Question No: 81 Which two statements are true for Predictive DRS? (Choose two.) It balances resource utilization for virtual machines with unpredictable utilization patterns. It integrates DRS with vRealize Operations Manager to balance workloads for virtual machines before resource utilization spikes occur. It balances resource utilization based on a threshold's algorithm that runs each night. It determines the best placement and balance of virtual machines based on a threshold's algorithm that runs each night in vCenter Server 6.5 Database. Answer: B,C Explanation: Predictive DRS uses a combination of DRS and vRealize Operations Manager to predict future demand and determine when and where hot spots will occur. When future hot spots are found, Predictive DRS moves the workloads long before any contention can occur. Each night it runs its Dynamic Threshold calculations, which use sophisticated analytics to create a band of what is "normal" for each metric/object combination. Question No: 82 Which statement regarding datastore clusters meets VMware's recommended best practices? Clusters should contain only datastores presented from the same storage array. Clusters should contain only datastores with equal hardware acceleration capability. Clusters should contain only datastores with the same capacity. Clusters should contain only datastores using the same connection method (iSCSI, FC, etc.). Answer: B Question No: 83 A vSphere Administrator must ensure that a user is able to view and manage the system configuration in vSphere Web Client. In the vCenter Single Sign-On domain, which group should the user be a part of? SystemConfiguration.BashShellAdministrators ComponentManager.
Administrators Administrators SystemConfiguration.Administrators Answer: D Question No: 84 A vSphere administrator manages a cluster which includes critical and non-critical virtual machines. The cluster requires different permissions for contractors and non-contractors. How can the administrator exclude the contractor group from some of the critical VMs? Apply permission for both contractors and non-contractors on the cluster level. Apply permission for both contractors and non-contractors on the cluster level. Remove permission on the critical VMs for contractors. Remove permission for contractors on the cluster level. Apply permission on the critical VMs for non-contractors. Apply permission for both contractors and non-contractors on the VMs. Remove permission on the critical VMs for contractors. Answer: B Explanation: To exclude the contractor group, simply apply permission for both contractors and non-contractors on the cluster level. After that, remove permission on the critical VMs for contractors. Question No: 85 What is a potential downside of having more than four paths per datastore? limits the storage protocols increases storage latency for VM limits the path selection policies limits the number of LUNs per host Answer: D Explanation: While using more than the required number of paths restricts the number of LUNs you can present to a host, it may also affect host resources such as CPU cycles, and also every path from a host requires an initiator record and there may be a limit to... Read More
          [Free] 2018(Aug) Ensurepass VMware 2V0-622 Dumps with VCE and PDF 71-80
Question No: 71 Which statement applies to the vSphere Replication appliance? Only one vSphere Replication appliance can be deployed per vCenter Server instance. VMware Tools in the vSphere Replication appliance can be upgraded. A single vSphere Replication appliance can manage a maximum of 4000 replications. vSphere Replication is available only with the vSphere Essentials Plus license. Answer: A Question No: 72 When performing a vCenter Server 5.5 for Windows with Microsoft SQL Server Express database migration to vCenter Server Appliance 6.5, which will be the target database? Microsoft SQL Server Express 2012R2 Microsoft SQL Server Standard 2012R2 PostgreSQL Oracle DB 11g Answer: C Explanation: The vCenter Server 6.0 embedded Microsoft SQL Server Express database is replaced with an embedded PostgreSQL database during the upgrade to vCenter Server 6.5. The maximum inventory size that applied for Microsoft SQL Server Express still applies for PostgreSQL. Question No: 73 Which feature facilitates the sharing of templates via vCenter Server? Content Library OVF folders vApp Answer: A Question No: 74 Which three storage protocols are supported by Virtual Volumes? (Choose three.) FCIP FCoE iSCSI NFS v3 NFS v4 Answer: B,C,D Explanation: Virtual Volumes supports NFS version 3 and 4.1, iSCSI, Fibre Channel, and FCoE. Question No: 75 Which two statements are true for a vCenter Server user account that is used for vSphere Data Protection (VDP)? (Choose two.) The user account must be assigned with Administrator role. The password for the user account cannot contain spaces. The user account should be created in the Single Sign-On domain. The user account cannot inherit required permissions from a group role.
Answer: A,B Explanation: For the vCenter server account used for vSphere data protection, the user account must have administrator role and the password for the user account should not contain spaces. Question No: 76 vCenter Server Appliance Instance can be backed up using which Client Interface? vSphere Client Interface vSphere Web Client Interface VMware Host Client Interface Virtual Appliance Management Interface Answer: D Question No: 77 A vSphere content library administrator is attempting to unpublish the content library, but the option is grayed out as shown in the Exhibit. Which statement identifies the reason for not being able to unpublish? The content library is optimized for syncing over HTTP. A synchronization operation is in progress with this content library. There are active subscriptions on this content library. Underlying storage for this content library is not accessible. Answer: D Explanation: Since the underlying storage is not accessible for this content library, the option will be greyed out because the storage is unavailable. Question No: 78 Which Host Profile Subprofile configuration is used to configure firewall settings for ESXi hosts? Advanced Configuration Settings General System Settings Security Networking Answer: C Explanation: In the Web Client interface, you’ll find the firewall configuration under the Security and Services folder of a host profile.... Read More
          [Free] 2018(Aug) Ensurepass VMware 2V0-622 Dumps with VCE and PDF 41-50
Question No: 41 Which CLI command shows the physical uplink status for a vmnic? esxcli network ip connection list esxcli network ip neighbor list esxcli network nic get esxcli network nic list Answer: D Question No: 42 What is the minimum number of hosts that must contribute capacity to a non-ROBO single site VMware vSAN cluster? 1 64 3 2 Answer: C Explanation: A minimum of three hosts must contribute capacity to a non-ROBO single site. Question No: 43 The administrator wants to power on VM-K2, which has a 2GHz CPU reservation. VM-M1, VM-M2, and VM-K1 are all powered on. VM-K2 is not powered on. The exhibit shows the parent and child resource reservations. If Resource Pool RP-KID is configured with an expandable reservation, which statement is true? VM-K2 will be unable to power on because there are insufficient resources. VM-K2 will be able to power on since resource pool RP-KID has 2GHz available. VM-K2 will be unable to power on because only 2GHz are reserved for RP-KID. VM-K2 will receive resource priority and will be able to power on this scenario. Answer: A Question No: 44 Which two statements correctly describe VM-Host affinity rules? (Choose two.) When there is more than one VM-Host affinity rule in a vSphere DRS cluster, the rules are applied equally. After creating a VM-Host affinity rule, its ability to function in relation to other rules is predetermined. When there is more than one VM-Host affinity rule in a vSphere DRS cluster, the rules will be ranked. After creating a VM-Host affinity rule, its ability to function in relation to other rules is not checked. Answer: A,D Explanation: If you create more than one VM-Host affinity rule, the rules are not ranked, but are applied equally. Be aware that this has implications for how the rules interact.
For example, a virtual machine that belongs to two DRS groups, each of which belongs to a different required rule, can run only on hosts that belong to both of the host DRS groups represented in the rules. Question No: 45 Upgrading vCenter Server with Microsoft SQL database fails with the following error message: The DB User entered does not have the required permissions needed to install and configure vCenter Server with the selected DB. Please correct the following error(s): %s What could cause this error? incorrect ports open on SQL Server incorrect database on the SQL server incorrect compatibility mode on the SQL server incorrect permission on SQL Server database Answer: C Explanation: Reference https://pubs.vmware.com/vsphere-50/index.jsp?topic=/com.vmware.vsphere.install.doc_50/GUID-5AA32F87-270C-4171-8896-41A607F8F997.html Question No: 46 Which two statements about vCenter HA are correct? (Choose two.) ESXi 5.5 or later is required. vCenter HA network latency between nodes must be less than 50 ms. NFS datastore is supported. It must be deployed on a 3 ESXi host cluster with DRS enabled. Answer: A,C Question No: 47... Read More
          APDCL Online Application Process 2018 : Problems & Solutions [Discussion]
A lot of our regular readers have complained that they are facing various problems with the online application process for the recruitment of AAO (Assistant Accounts Officer), Office cum Field Assistant, Light Vehicle Driver and Sahayak posts under APDCL, AEGCL & APGCL.



Here is a set of basic instructions issued by APDCL regarding this. Please go through them carefully.

Candidates are required to apply online through the website www.apdcl.org and click on ‘APPLY ONLINE’ in the ‘Career’ section on the Home page of the APDCL website. This will redirect to the Online Application Portal for recruitment.

Candidates are advised to apply through a desktop browser only, preferably Chrome, Firefox or Opera, and not through a smartphone or any other type of mobile browser. Candidates are also advised to use network connectivity of good speed (preferably more than 1 Mbps) while filling up the online application form.


Candidates having experience in power sector, click the button
‘Apply (experienced candidate in power sector)’
 


Candidates without any experience in power sector, click the button
‘Apply (candidate without any experience)’

Step 1: New user Registration
 

a) For Registration, click the ‘Register Now’ option. Here the candidate has to enter Mandatory fields like: Post Applied for, Preferred Company, User ID (Mobile No) & Password.
b) Please choose the mandatory fields correctly. Candidates will not be allowed to change these values once the registration is completed.
c) Please provide valid Mobile Number (User Id) at the time of registration. Verification OTP will be sent to the Mobile Number provided at the time of registration. Please note that the OTP sent will be valid only for 30 (thirty) minutes.
d) All communications will be made primarily through the registered mobile number.
e) Password should be of minimum of 8 (eight) characters.
f) After successful registration, Candidates are advised to note down their user ID (Registered Mobile Number) and password which will be required to login to the system for completing the online recruitment application.
g) Before applying online, the candidates should keep ready a soft copy of a scanned image of their latest passport size photograph (size minimum of 20 KB and maximum of 100 KB, with resolution minimum of 200 x 200 pixels and maximum of 250 x 250 pixels) and scanned signature (size minimum of 10 KB and maximum of 100 KB, with resolution minimum of 200 x 40 pixels and maximum of 250 x 60 pixels), both in .jpg/.jpeg/.png format only, for uploading while applying online.
h) In case the candidate forgets the Password, he/she can click on “Forgot Password?” option to reset the password after providing requisite details.
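The size and format limits in point (g) are easy to get wrong before uploading. Below is a minimal sketch of such pre-checks in C#; the class and method names are hypothetical (this is not APDCL's actual validation code), and 1 KB is assumed to be 1024 bytes:

```csharp
using System;
using System.IO;
using System.Linq;

// Hypothetical pre-upload checks mirroring the limits quoted in point (g).
public static class UploadRules
{
    static readonly string[] AllowedExtensions = { ".jpg", ".jpeg", ".png" };

    // Photograph: 20 KB to 100 KB.
    public static bool IsValidPhoto(string fileName, long sizeBytes) =>
        HasAllowedExtension(fileName) && sizeBytes >= 20 * 1024 && sizeBytes <= 100 * 1024;

    // Signature: 10 KB to 100 KB.
    public static bool IsValidSignature(string fileName, long sizeBytes) =>
        HasAllowedExtension(fileName) && sizeBytes >= 10 * 1024 && sizeBytes <= 100 * 1024;

    static bool HasAllowedExtension(string fileName) =>
        AllowedExtensions.Contains(Path.GetExtension(fileName).ToLowerInvariant());
}
```

Checking the file locally this way avoids a failed upload at the portal; the resolution limits would additionally require decoding the image, which is omitted here.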

Step 2: Online Application:

 

a) After successful registration, candidates are advised to login by entering their user ID (mobile No.) and password in the login page.
b) Candidates are advised to fill up all the Mandatory (*) fields.
c) The candidate should enter his/her full name as per valid ID proof. The documents that are considered to be a valid ID proof are: Mark sheet/Pass Certificate from Govt. recognized school/college, PAN Card, Passport, Bank Passbook, Driving License, Voter ID, Aadhar Card & College ID Card.
d) After entering all the details click ‘Save and Next’.
e) Candidates must select the “Declaration” Check box to “Finish” the online application.
f) Once the candidate clicks on the “Finish” button in the Application Form Preview, the candidates cannot make any changes in the Application Form.


Step 3: Payment
 

a) Upon final submission of the Application Form, the candidate will be able to pay the Application Fee.
b) PWD candidates are exempted from paying the fee. They may directly download the Application Form.
c) Upon successful payment of the Application Fee only, the Registration Number will be generated and the candidate will be able to view/print the Final Application Form and Payment Receipt.
d) However, in case of “Failure” transaction,


1. If a real-time failure occurs at the bank's end, the respective bank will refund the amount paid by the candidate to the account from which it has been debited.
2. If failure occurs due to broken transaction at recruiter’s end, the amount paid by the candidate will be updated in the recruiter’s database within 3 (three) working days from the date of payment and the payment receipt will be made available in the candidate’s portal.

 
To avoid a last minute rush, candidates are advised in their own interest to submit the ONLINE application much before the closing date, since there may be a possibility of inability/failure to log on to the APDCL website on account of heavy load on the internet or website during the last days. APDCL does not accept any responsibility for candidates not being able to submit their application within the last day on account of the aforesaid reasons or any other reason.

For any queries related to online application, candidates may send their queries to recruitment@apdcl.org


In case you are still facing any issues, we are starting a new thread through this post for discussion. Anyone may share their problems as well as solutions, if any.

Last date for this online application process is 25th September 2018. 

Original Advertisement: Click Here.

          The Transformation of Gerald Baumgartner
Video: The Transformation of Gerald Baumgartner
Watch This Video!
Studio: Celebrity Video Distribution
Gerald Baumgartner (Randall Malin) is a fastidious, delusional man wrapped in an idealized image of a businessman: ultra-organized, competent, logical...or so he thinks. Pleased with himself, Gerald trips through his life of routine and nth-degree organization until one day he discovers an entirely new vision of himself, inspired by the beautiful, bohemian Christiana LaTierre (Melissa Fischer).

Driven by this "new vision" Gerald sets into play a spectacular life transformation, which involves among other things engineering a seamless separation from his - "current situation" - his wife (Carolyn Koskan) - using sound principles of change management.

Armed with project plans, databases of suitors for his "replacement", and bizarre decision-making skills only Gerald could conceive, he executes his plan.

The results are disastrous...and hilarious.

Stars: Carolyn Koskan, Melissa Fischer, Randall Malin

          Reply to: Preparing the infrastructure for writing integration tests for the ServiceLayer
The equivalent of this article for EF Core

To prepare a real database for integration testing with EF Core, you can do the following:
services.AddEntityFrameworkSqlServer()
                        .AddDbContext<ProjectNameDbContext>(builder =>
                            builder.UseSqlServer(
                                $@"Data Source=(LocalDB)\MSSQLLocalDb;Initial Catalog=IntegrationTesting;Integrated Security=True;MultipleActiveResultSets=true;AttachDbFileName={FileName}"));


private static string FileName => Path.Combine(
    Path.GetDirectoryName(
        typeof(TestingHelper).GetTypeInfo().Assembly.Location),
    "IntegrationTesting.mdf");
Finally, to create the database before running the tests, do the following:
_serviceProvider.RunScopedService<ProjectNameDbContext>(context =>
{
    context.Database.EnsureDeleted();
    context.Database.EnsureCreated();
});
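The `RunScopedService` helper used above is not a built-in EF Core API; it is typically a small extension method that creates a DI scope, resolves the service, runs the callback, and disposes the scope. A simplified, self-contained stand-in using only the BCL `IServiceProvider` is sketched below; the real helper would instead call `IServiceScopeFactory.CreateScope()` from Microsoft.Extensions.DependencyInjection:

```csharp
using System;

// Simplified sketch of a RunScopedService-style helper. In the real version the
// service is resolved from a fresh DI scope and the scope is disposed at the end;
// here, disposing the resolved service itself stands in for scope disposal.
public static class ServiceProviderExtensions
{
    public static void RunScopedService<T>(this IServiceProvider provider, Action<T> callback)
    {
        var service = (T)provider.GetService(typeof(T));
        try
        {
            callback(service);
        }
        finally
        {
            (service as IDisposable)?.Dispose(); // stands in for scope.Dispose()
        }
    }
}
```

Running the `EnsureDeleted`/`EnsureCreated` pair inside such a scope guarantees the `DbContext` is disposed before the tests start, so the attached LocalDB file is not left locked.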


          Inventory, Accounting, Payroll, POS - ERP
Need an inventory and accounting system with full source code with all necessary functionalities. Optionally including POS and Payroll and other ERP requirements. For whole sale and retail businesses. Should have been developed in .NET (VB or C# or ASP.NET) preferably with SQL Server database... (Budget: $250 - $750 USD, Jobs: .NET, C# Programming, Microsoft SQL Server, Software Architecture, SQL)
          Senior Data Warehouse Consultant - Teradata - Lahore
Administration, maintenance, and control of the operating system, network, and tools. Administration, maintenance and control of database optimization, capacity...
From Teradata - Tue, 11 Sep 2018 19:09:17 GMT - View all Lahore jobs
          Database Administrator - Fibernetics - Cambridge, ON
You have experience troubleshooting PC hardware. At Fibernetics, our purpose is to deliver happiness and connections every day by being awesome....
From Indeed - Thu, 31 May 2018 16:06:22 GMT - View all Cambridge, ON jobs
          IDG Contributor Network: The 8 essential skills your IT team must possess

Cha-cha-cha-changes. IT teams have changed and grown tremendously in the last years. From required skillset to role and responsibilities within an organization, the pressure on CIOs and IT departments to keep up has been relentless. But in this ever-changing digital age, it can be challenging to hone in on the most important skills to have. Better understand what it takes to build an all-star technical team and let these 8 essential skills be your guide.

Does your IT team have these 8 essential skills?

1. Data storage and data integrity insight

An organization is only as strong as its information, and in this day and age that means data. To maintain your competitive advantage, you need to have skilled workers who understand storage configuration, disaster recovery solutions, data governance, resilience and replication of data across the enterprise. Offering training that includes all of this is a great way to keep your organization’s vast array of documents, mailboxes, email archives, SQL databases and other critical data safe and secure.



          Software Engineer (application) - Rubikloud Technologies - Toronto, ON
Over the past three years, we have been able to connect with over 150,000 retail point-of-sale locations in 10 countries and create a database of over $100...
From Rubikloud Technologies - Tue, 10 Jul 2018 23:31:29 GMT - View all Toronto, ON jobs
          Data Platform Engineer (Big Data) - Rubikloud Technologies - Toronto, ON
Over the past three years, we have been able to connect with over 150,000 retail point-of-sale locations in 10 countries and create a database of over $100...
From Rubikloud Technologies - Tue, 10 Jul 2018 23:31:29 GMT - View all Toronto, ON jobs
          Agrigenomics Market Competitive Landscape, Trends, Market Concentration Rate, Business Strategies 2023
(EMAILWIRE.COM, September 13, 2018) “Global Agrigenomics Consumption Market Report 2018-2023” has been newly added to the Researchformarkets.com database. This report covers leading company profiles with information such as business overview, regional analysis, consumption, revenue and specification. Over...
          SDSU Grant Program VISTA Member - College Ag & Bio - AmeriCorps - South Dakota
The tech savvy VISTA Member (the VISTA) will create reference guides, searchable databases, templates and a variety of other support tools for SDSU Extension...
From AmeriCorps - Fri, 10 Aug 2018 06:14:29 GMT - View all South Dakota jobs
          Software Engineer that will be core member to a Fintech SME Lender by Uprise Credit
Software Engineers will be the core of our online financing solution. We’re looking for someone passionate about building new applications from the ground up to develop an innovative financing solution for new economy entrepreneurs in Asia that are underserved by banks. Here at Uprise, you’ll work closely with your peers in a flat structure to share responsibilities. The technology team should move fast, celebrate great ideas, inspire testing and learning, and stretch for new solutions. This pivotal role is responsible for building the online SME lending platform from top to bottom, collaborating directly and frequently with the internal team, outsourced developers, and third party strategic partners to co-create solutions for merchant customers. The ultimate users of the product are currently targeted at online merchants who use certain types of online payment gateways. We expect to expand the user-base to offline merchants as well through partnership with other e-payment solution providers.
Requirements
- 3+ years solid software development experience
- Expert knowledge in one or more languages such as JavaScript, Python, Ruby, Java, C#, Golang
- Knowledge in one or more popular frameworks such as Rails, React, Play, NodeJS
- Familiarity with databases, e.g. MySQL / MongoDB / PostgreSQL / Redis
- Knowledge and experience in building online / offline payment gateway or financing solutions
- Demonstrated interest and passion in bridging the financing gap faced by SMEs and entrepreneurs in Asia
- Proficient in English and Chinese (either Mandarin or Cantonese)
- Must be willing to occasionally travel to Hong Kong, Singapore, Taiwan and other Southeast Asian countries
          Sage Application Consultant - Databit - Dhoby Ghaut
Good knowledge in Seagate Crystal Reports Designer and MS-SQL database. Implementation of Sage 300 or related ERP solution....
From Databit - Wed, 01 Aug 2018 10:59:50 GMT - View all Dhoby Ghaut jobs
          Application Consultant - THE WORLD MANAGEMENT PTE LTD - Singapore
Good knowledge in Seagate Crystal Reports and MS SQL database. Roles & Responsibilities....
From MyCareersFuture.SG - Thu, 30 Aug 2018 06:14:00 GMT - View all Singapore jobs
          New Echo Measurement Utility Software Detects Acoustic/Line Echoes and Evaluates Intermittent Echoes
The Echo Measurement Utility Software is designed for analyzing echo path delay and echo return loss of voice calls. The software can measure round-trip delay (RTD), voice quality, and noise when combined with VQuad™, T1 E1 Analyzers, MAPS™ Emulators, or Voiceband Analyzer. The EMU automatically detects incoming degraded voice files and sends the measurement results to the database. It provides a graphical display of the source signal, received signal, error signal, and adaptive filter...


          Database Administrator at Appzone Limited
AppZone is Africa's leading provider of Integrated Banking and Payment software platforms and incidentally creator of BankOne; the world's leading cloud infrastructure for Banking and Payment processing targeted at Small and Medium financial Institutions.
Job Description
- Microsoft SQL Server administration and maintenance functions of production and QA databases in a 24/7 mission-critical environment
- Participate in database server architecture and design as well as server environment recommendations for the overall infrastructure solution, including storage capacity planning
- Responsible for database alerts, maintenance plans and scripts to acquire, store, and transform data for the company's data warehouse and its various users
- Manage database security and control access permissions and privileges
- Maintain the health, performance and integrity of the database systems
- Responsible for availability, reliability and integrity of business data stored in production databases
- Communicate regularly with the development team, business and operations team to ensure the security and confidentiality of data
- Resolve production database issues and implement improvements to prevent a recurrence of the issue
- Backup and restore databases as necessary
- Design and develop architecture for data warehouse including source data ETL (extract, transform and load) strategy, data movement and aggregation as well as data quality strategy
Required Qualification & Skills
- Minimum of BSc/HND in computer science, computer engineering or any other related field
- Minimum of 4 years relevant work experience
- Strong experience in SQL Server maintenance and advanced SQL Server concepts like indexes, stored procedures etc., with an emphasis on database efficiency
- You must be performance driven with a proven track record
- Ability to work in a fast-paced environment
- Good communication skills
- Excellent interpersonal and analytic skills
          Software Developer at Appzone Limited
AppZone is Africa's leading provider of Integrated Banking and Payment software platforms and incidentally creator of BankOne; the world's leading cloud infrastructure for Banking and Payment processing targeted at Small and Medium financial Institutions.
Details: Expectations
- Develop, implement, and support software products and solutions that integrate with in-house and third party systems
- Provide support to business analysts in the conversion of individual client business requirements into software functionality
- Provide assistance to relevant functional teams by identifying requirements and improvements to architectural design of new/existing applications
- Provide training to client end users with relevant tools and technical documentation
- Lead a technical work stream as a component of a larger project
- Provide subject-matter expertise, customer advocacy, and analysis through all phases of the development lifecycle
- Communicate effectively with internal/external parties
Required Skills
The skills and competencies required to accomplish your career move are:
- Bachelor's degree or corresponding combination of education and work experience in software development
- 3+ years of programming experience with Web, Windows .NET framework and C#
- 3+ years of experience with WPF applications, ASP.NET, MVC, C#, .NET, Web API, JSON, REST, and SQL Server
- Good communication skills
- Excellent interpersonal & analytic skills
- An aptitude for analytical problem-solving
- Ease and ability to learn fast and solve complex problems
- Proficiency in object-oriented design and development using software development best practices
- Experience with Microsoft SQL database design, T-SQL and stored procedure programming
- Experience working on Agile teams using Agile methodologies such as SCRUM
- Ability to provide technical input for designs, functional specifications, and other project requirements
- Ability to design and build high quality unit tests
          Arcsight Delivery Quality Assurance Resource Engineer, Network Security at Ecscorp Resources
Ecscorp Resources is a solution engineering firm, established in the year 2001, with a cumulative experience of over 100 years. Our business is driven by passion and the spirit of friendliness; we harness the power of creativity and technology to drive innovation and deliver cutting-edge solutions to increase productivity. Our passion, experience, expertise and shared knowledge have forged us into a formidable catalyst for desirable, sustainable change and incessant growth. We strive to provide achievable solutions that efficiently and measurably support goal-focused business priorities and objectives. Duration: 3 months. Detailed Description: The ArcSight division is a leading global provider of Compliance and Security Management solutions that protect enterprises, education and governmental agencies. ArcSight helps customers comply with corporate and regulatory policy, safeguard their assets and processes and control risk. The ArcSight platform collects and correlates user activity and event data across the enterprise so that businesses can rapidly identify, prioritize and respond to compliance violations, policy breaches, cybersecurity attacks, and insider threats. The successful candidate for this position will work on the ArcSight R&D team. This is a hands-on position that will require the candidate to work with data collected from various network devices in combination with the various ArcSight product lines in order to deliver content that will help address the needs of all of ArcSight's customers. The ideal candidate will have a good understanding of enterprise security coupled with hands-on networking and security skills as well as an ability to write and understand scripting languages such as Perl and Python.
Research, analyze and understand log sources, particularly from various devices in an enterprise network. Appropriately categorize the security messages generated by various sources into the multi-dimensional ArcSight normalization schema. Write and modify scripts to parse out messages and interface with the ArcSight categorization database. Work on content and vulnerability update releases. Write scripts and automation to optimize the various processes involved. Understand content for ArcSight ESM, including correlation rules, dashboards, reports, visualizations, etc. Understand requirements to write content that addresses use cases based on customer requests and feedback. Assist in delivering comprehensive, correct and useful ArcSight Connector and ESM content to ArcSight customers on schedule. Requirements: Excellent knowledge of IT operations, administration and security. Hands-on experience with a variety of networking and security devices, such as firewalls, routers, IDS/IPS, etc. Ability to examine operational and security logs generated by networking and security devices and identify their meaning and severity. Understanding of different logging mechanisms, standards and formats. Very strong practical Linux-based and Windows-based system administration skills. Strong scripting skills in languages such as Shell, Perl and Python, plus regular expressions. Hands-on experience with databases such as MySQL. Knowledge of Security Information Management solutions such as ArcSight ESM. Experience with a version control system (Perforce, GitHub). Advanced experience with Microsoft Excel. Excellent written and verbal communication skills. Must possess the ability and desire to learn new technologies quickly while remaining detail-oriented. Strong analytical and problem-solving skills; able to multi-task. Pluses: Network device or security certification (CISSP, CEH, etc.). Experience with application servers such as Apache Tomcat. Work experience in a security operations center (SOC).
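The core of the parsing work described in this posting is pulling structured fields out of device log lines and mapping them onto a category taxonomy. A minimal illustrative sketch, assuming a firewall-style syslog line; the field names and category strings here are hypothetical, not ArcSight's actual normalization schema:

```python
import re
from typing import Optional

# Regex with named groups to extract fields from an iptables-style log line.
PATTERN = re.compile(
    r"(?P<action>DROP|ACCEPT) IN=(?P<iface>\S+) SRC=(?P<src>\S+) "
    r"DST=(?P<dst>\S+) PROTO=(?P<proto>\S+) DPT=(?P<dport>\d+)"
)

# Illustrative mapping of device actions onto coarse categories.
CATEGORY = {"DROP": "/Firewall/Deny", "ACCEPT": "/Firewall/Permit"}

def parse_event(line: str) -> Optional[dict]:
    """Return the extracted fields plus a category, or None if the line doesn't match."""
    match = PATTERN.search(line)
    if match is None:
        return None
    event = match.groupdict()
    event["category"] = CATEGORY.get(event["action"], "/Unknown")
    return event

sample = "Oct 3 12:01:44 fw01 kernel: DROP IN=eth0 SRC=203.0.113.7 DST=10.0.0.5 PROTO=TCP DPT=22"
event = parse_event(sample)
```

In a real categorization pipeline the mapping table would be far larger and driven by data files rather than a hard-coded dict, but the regex-extract-then-categorize shape is the same.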
          Senior Treasury Officer - FITR1 at African Development Bank - AfDB
African Development Bank Group (AfDB) - Established in 1964, the African Development Bank is the premier pan-African development institution, promoting economic growth and social progress across the continent. There are 80 member states, including 54 in Africa (Regional Member Countries). The Bank's development agenda is delivering the financial and technical support for transformative projects that will significantly reduce poverty through inclusive and sustainable economic growth. In order to sharply focus the objectives of the Ten Year Strategy (2013 - 2022) and ensure greater developmental impact, five major areas (High 5s), all of which will accelerate our delivery for Africa, have been identified for scaling up, namely: energy, agro-business, industrialization, integration and improving the quality of life for the people of Africa. Reference: ADB/18/177 Location: Côte d'Ivoire Grade: PL5 Position N°: 50066204 The Complex: The Vice Presidency for Finance oversees the financial management of the Bank Group. This encompasses the Bank Group's treasury activities including borrowings from the capital markets and investment activities; controllership functions including financial reporting and loan administration; strategic resource mobilization and the strengthening of the non-statutory financial resources and instruments; and the overall asset/liability management (ALM) for the Bank Group. The Hiring Department/Division: The Treasury department is responsible for raising funds from the capital markets, managing and investing the Bank Group's liquidity and shareholders' funds, processing and settling all financial transactions and managing the institution's banking relationships.
The role of the Capital Markets and Financial Operations Division is to: (1) Raise cost-effective resources from the capital markets, (2) Contribute to the Capital Increase process and administer the subscriptions of shareholders to the capital of the African Development Bank and (3) Contribute to the African Development Fund (ADF) replenishment negotiations and administer the subscriptions of donors to the ADF and Multilateral Debt Relief Initiative (MDRI). The Position: Under the supervision of the Division Manager, Capital Markets & Subscriptions, the Senior Treasury Officer will: Ensure that the Bank's strategy is implemented and that funding and hedging activities are effective and in line with the Division's processes: Carefully monitor assigned capital markets to source funding opportunities, ensure that pricing is assessed and commensurate with the Bank's objectives, execute transactions in line with the Bank's ALM strategy and policies, follow up on relevant documentation and due diligence, and report on trades. Monitor and report the secondary market performance of the Bank's bond issues. Contribute to working groups' discussions. Lead or participate in roadshows. Ensure that the borrowing process is smooth: Prepare a well-written and structured annual borrowing program for board approval in line with the Bank's requirements, ensure that consents are obtained for markets where funding opportunities exist, ensure that all transactions are immediately reported in the division's internal dashboard (borrowing model). Track and report the secondary market performance, flows and turnover of the Bank's bonds. Ensure that the Bank's borrowing documents (Global Debt Issuance Facility, General Information Statement), listing and SEC filing, among others, are completed and coordinate the 10b-5 due diligence exercise and any other relevant documentation process.
Ensure an effective and quality investor relations program: Organize dealer days, prepare the annual roadshow program identifying investors to target and identify events that should be attended. Build and update a relevant and comprehensive database of investors. Ensure that the Bank's websites, including the green bond page and Japanese website are continuously updated. Manage subscriptions to the capital of the Bank and replenishments of ADF and MDRI: Administer the subscription process requesting, monitoring and reporting on subscriptions, payments and encashments. Prepare the relevant voting powers documents. Contribute to capital increases, replenishment or voting discussions. Administer all financial aspects of subscriptions. Manage the download/publishing of currency exchange rates vs the UA (the Bank's reporting currency), and prepare and execute the administrative hedge of the Bank. Prepare documents for technical discussions with credit and ESG rating agencies. Prepare the annual Financial Presentation of the Bank to be presented to stakeholders. Duties and Responsibilities The Senior Treasury Officer will carry out or contribute to completing the following functions: Funding and hedging activities: Identify and source attractive funding opportunities by monitoring primary market activity, utilizing cross-currency (swap) calculators, maintaining a good working relationship with dealers, being aware of market trends and developments, etc. Ensure that the pricing of any new borrowing transaction is in line with the objectives and strategy of the division and mitigate any cost of carry by liaising with the Bank's ALM and investment divisions. Ensure the seamless execution of funding transactions by being alert to market conditions, economic and issuance calendars, keeping informed of investors' preferences, ensuring that sufficient credit limits exist for settling and swapping transactions, etc. 
Monitor the secondary performance of the Bank's bond issuances by following market-making action and flow on electronic platforms, being aware of weekly pricing indications received from dealers and information on secondary market flows, and by maintaining a good dialogue with dealers in the Bank's bond issues. Monitor economic and capital markets rules and regulations in assigned markets to ensure the smooth implementation of the borrowing program. Lead or participate in investor roadshows and prepare and/or update a comprehensive, accurate, high-quality and well-designed investor presentation. Borrowing process: Ensure that the Bank is able to borrow in assigned markets by keeping abreast of all regulatory developments, ensuring bond issuance documentation is current and up-to-date, maintaining a good and frequent dialogue with the relevant major investment banks present, and undertaking regular investor work in order to ensure that the Bank is an approved and sought-after investment proposition. Facilitate the borrowing process by preparing an open mandate when warranted and regularly updating the funding grid. Facilitate analysis, tracking and reporting of: (1) borrowing transactions, by updating the borrowing database and internal league tables; (2) benchmarks, through the preparation of the daily market monitor by tracking secondary market performance using dealers' trading data, as well as Bloomberg, Tradeweb and other relevant platforms; (3) bond flows and turnover per dealer, to assess the liquidity of bonds and the engagement of dealers, and facilitate discussions with investors. Report on transactions to senior management by preparing the weekly market monitor of the division, ALCO and Board reports. Prepare the annual borrowing program of the Bank for approval by the Board of Directors to allow the Bank to raise resources from the markets and meet its funding objectives.
Borrowing documentation: Ensure that the Bank's borrowing documents (Global Debt Issuance Facility, General Information Statement), listing and SEC filing, among others, are up-to-date and coordinate the 10b-5 due diligence exercise and any other relevant documentation process. Review and validate all documents necessary to the execution of each borrowing transaction, including bond and swap term sheets, pricing supplements, dealer accession letters, swap confirmations, etc. Markets relations: Support an effective investor relations program: Identify and advise on investors to target for roadshows or conference calls to ensure the success of bond issues: Collect investor feedback on AfDB trades, tracking their participation in peers' transactions and collating information received through back-to-office reports and discussions with dealers. Maintain an efficient database of information. Contribute to the deepening of the Bank's investor base by preparing a short quarterly newsletter to investors, providing updates on the funding program and key developments. Review the Investor Presentation prepared/updated by the funding team to ensure that data is correct and that information conveyed is in line with investors' interests. Update the Bank's relevant webpages (Regular, Japanese and Green) to ensure that they respond to the needs of investors and raise the profile of the Bank with investors. Enhance the visibility and positioning of the Bank by leading the preparation of the Bank's financial presentation. Subscriptions to AfDB capital, ADF replenishments and MDRI compensation: Assist and advise senior management in shareholder discussions related to capital increases, ADF replenishments, votes and compensation schemes by providing guidance and preparing the relevant technical papers (financing framework, electoral votes, share transfer rules) and contributing to relevant discussions.
Contribute to establishing an effective management of the subscriptions process by (1) ensuring that all relevant subscriptions resolutions are implemented and the procedure manual is followed and up-to-date; (2) requesting, monitoring, acknowledging and recording subscriptions, payments and encashments. Manage arrears in line with guidelines. Manage relationships with shareholders by reporting on subscriptions through the preparation of the relevant voting powers, and promptly responding to queries from shareholders and internal clients. Provide relevant information to credit rating agencies and briefs to management as and when required. Manage the entry of shareholders to AfDB and ADF. Ensure that the subscriptions system is continuously improved and its data updated on a continuous basis. Rating reviews: Effectively contribute to the credit rating review discussions by preparing presentation material and data for the Bank's funding and subscriptions activities, ensuring that rating agencies' criteria are met. Engage with Environment, Social and Governance (ESG) rating agencies to ensure that the Bank gets the best rating possible on its rating activities, and initiate steps to reinforce the Bank's ESG rating. Participation in working groups: Attend any assigned working group and actively participate by reviewing and providing documents presented for discussion and clearance. Exchange rates management: Ensure that the Bank Group administrative budget is hedged against currency fluctuations by preparing and implementing the Bank's administrative hedge. Ensure that exchange rates are made available on the Bank's systems on a daily basis by supervising the currency downloads while providing guidance on the improvement of the system.
Selection Criteria (including desirable skills, knowledge and experience): Hold at least a Master's degree in Finance, Business Administration, Economics, Statistics or a related quantitative discipline. Professional certification, such as Chartered Financial Analyst (CFA), ICMA or FRM, is an advantage. Have a minimum of five (5) years of professional experience in the international capital markets, with special emphasis on bond issuance. Have practical experience of managing the global bond and/or public bond issuance process, and of using derivatives for hedging purposes. Very strong working knowledge of derivative products. A sound working knowledge of the operation of the organisation as a business and the role of its organisational unit. Ability to develop practical and timely solutions. Possess and apply knowledge and expertise in appropriate depth. A good understanding of clients, markets and needs. Innovation, creativity and strong problem-solving ability. Client orientation. Strong teamwork & relationship management. Ability to work effectively and accurately in a frequently stressful environment. Command of standard computer software applications such as Word, Excel and PowerPoint. Knowledge of Bloomberg and Summit/Numerix would be an advantage. Ability to communicate effectively (written and oral) in English and/or French, preferably with a working knowledge of the other language.
          Nigeria Security Manager at Fenix International
Fenix is a next-generation, end-to-end renewable energy company that does everything from design, manufacturing and sales to financing and customer service. Fenix's flagship product, ReadyPay Power, is an expandable, lease-to-own home solar system financed through ultra-affordable instalments over Mobile Money. Fenix uses real-time transaction data to create a next-generation credit score to finance power upgrades or other life-changing loans. To date, Fenix has sold over 200,000 ReadyPay Power systems and is growing its product portfolio and geographic coverage to bring power and a wider world of financing to millions of customers by 2020. Department: Safety & Security Type: Full Time Duties: Develop and implement the Corporate Security & Safety Plan, including all components below, and keep the plan up-to-date as Fenix grows in Nigeria and the operating context changes: Emergency Evacuation and Response Plan & Procedures; Corporate Communication Policies; key contact database with national, regional, and local authorities; incident reporting & tracking system; individual tracking and transport/fleet management protocol. Coordinate with local security agencies to gain a better understanding of best practices. Develop and manage a staffing plan to implement the Corporate Security & Safety Plan; hire a team or consultants as planned. Plan and manage the budget for implementation of the Corporate Security & Safety Plan across Nigeria. Manage the physical protection of Fenix facilities and assets. Define, update, and perform security and safety induction trainings for all new employees and on a regular basis for all staff. Keep track of events which may pose a security hazard to Fenix staff and property. Conduct security evaluations. Conduct incident investigations (e.g. fraud, theft, etc.).
Supervise security staff members and drivers. Manage fleet control, maintenance, and planning. Assist all international travelers upon request. Profile: Have experience in a central intelligence office or have served as an operator inside the intelligence process, with an operational background in crisis zones. Have a demonstrated ability to develop and coordinate security and safety services within a large project or business, including experience with 2-3 major governmental, non-governmental, intergovernmental, or private organizations (e.g. national security forces, UN, NATO, or major companies or NGOs). Have strong working knowledge of risk management frameworks and experience implementing them. Be able to manage security teams, contractors, and relevant vendors within mandatory compliance and service standards; this includes the ability to coordinate and control armed security personnel, with proven experience monitoring rules of engagement. Bring (or be able to develop) strong professional networks within government and public and private security services. Have exceptional interpersonal and communication skills, leadership skills, integrity, and the ability to build and lead highly professional and effective teams. Be able to manage ground information collection, analysis, and dissemination.
          Mobile/Web Developer at Hamilton Lloyd and Associates
Hamilton Lloyd and Associates - Our client is an Information Technology consulting company. Due to internal expansion, they have decided to hire a qualified candidate to fill the position below. Job Summary: The Mobile/Web Developer's role is to design, code, test, and analyze software programs and applications. This includes researching, designing, documenting, and modifying software specifications throughout the production lifecycle. The software developer will also analyze and amend software errors in a timely and accurate fashion and provide status reports as required. Job Responsibilities: Design and development of mission-critical mobile and web applications. Deliver across the entire app life cycle - concept, design, build, deploy, test, release to app stores. Build prototypes at the tech scoping stage of projects. Liaise with the product development team to plan new features. Work with other developers and stakeholders to create and maintain a robust framework to support the mobile and web apps. Work with the front-end designers to build the interface with a focus on usability features. Create compelling mobile device and/or browser-specific user interfaces and experiences. Optimize performance of the mobile and web apps. Technical Skills Required: Proven development experience in desktop and mobile web development. Excellent knowledge of information architecture, human-computer interaction and mobile usability design principles. Excellent programming skills in frontend technologies and frameworks like HTML, JavaScript, CSS3, jQuery, PhoneGap/Cordova and frontend MVC frameworks. Expertise in full-stack JavaScript frameworks, including Angular, Vue, etc. Multiple years of server-side programming experience in C# and .NET. Experience with database technologies like MSSQL Server, Oracle, MongoDB, etc. Experience with native and hybrid Android and iOS development frameworks (e.g. Ionic 2, the Xamarin framework, etc.).
Familiarity with designing, creating and consuming RESTful and SOAP web services. Familiarity with version control systems, e.g. Git, SVN. Familiarity with build systems (e.g. Gradle, Maven) and Continuous Integration tools (e.g. Jenkins, Artifactory, Nexus). Hands-on experience with cloud technologies (including Azure, Oracle Cloud, IBM Bluemix, AWS, etc.). Ability to switch among multiple projects, multiple languages, and multiple IDEs in short periods of time. Exposure to Java is a plus. Soft Skills Required: Excellent communication and interpersonal skills. Able to work well individually as well as in a highly collaborative team. Demonstrated interest in learning new technologies. Familiarity with Agile methodologies (especially Scrum). Enterprise Software Development Lifecycle. Direct work in applications that serve a very large number of users along with handling highly secure information. Person's Specification - Education: Minimum of a Bachelor's Degree in any of the Physical Sciences with a minimum of Second Class Lower Division (2.2); a Master's Degree would be an added advantage. Experience: Minimum of 6 years' experience in software development. Special Requirements: May be required to travel occasionally; overtime and weekend work may be required.
          Hardware Support Officer at Peugeot Automobile Nigeria (PAN)
Peugeot Automobile Nigeria (PAN) Limited, which has remained a milestone in Nigeria's automobile industry, was conceived in 1969 by the then Federal Military Government under the leadership of General Yakubu Gowon. We manufacture and distribute reliable vehicles for the satisfaction of our customers and other stakeholders. Ref Id: HSO092018 Job Division/Department/Unit: Industrial Division Job Level/Grade: Officer Reports To: Head, IT Key Job Responsibilities: Applications support; infrastructure and hardware support; network administration and support; database design and management. Education & Experience: B.Sc/HND in Computer Science or Computer Engineering from a recognized institution. Minimum of 2 years' experience in IT infrastructure development and management.
          Completions Manager at Prime Atlantic Cegelec Nigeria (PACE)
Prime Atlantic Limited is a wholly owned Nigerian company dedicated to effecting development in the Nigerian oil and gas industry. It was established in 2005 and successfully partnered with Cegelec, France to establish a joint venture company, Prime Atlantic Cegelec Nigeria (PACE), in 2005. Reference Number: PAE002 Location: Lagos Contract Duration: 6 Months Job Responsibilities: Manage the ICAPS database, server integrity and hardware installation. Interface with CTR team specialists and leaders on the pre-commissioning activities. Manage, monitor, analyze and report performance of the pre-commissioning and commissioning works and potential issues. Is the technical reference for all matters related to his trade. Qualifications & Experience: B.Sc., B.Tech or HND in Electrochemical or Instrumentation or any other technical field. Minimum of 12 years' experience. Skills: OPERCOM / ICAPS, ESD systems, Network Power Management Systems, Network Protections Management Systems, Commissioning Engineering, Commissioning Execution. Salary: Attractive and Negotiable.
          Commissioning Manager at Prime Atlantic Cegelec Nigeria (PACE)
Prime Atlantic Limited is a wholly owned Nigerian company dedicated to effecting development in the Nigerian oil and gas industry. It was established in 2005 and successfully partnered with Cegelec, France to establish a joint venture company, Prime Atlantic Cegelec Nigeria (PACE), in 2005. Reference Number: PAE001 Location: Lagos Contract Duration: 6 Months Job Responsibilities: Manage resources such as manpower, vendors, equipment and tools. Approve test reports and update the ICAPS database accordingly. Manage and coordinate all disciplines and systems on the FPSO. Ensure that commissioning mark-up is done on relevant drawings and documents. Manage all technical problems encountered. Manage the handover process to field operations as per project procedure. Mobilize and manage resources identified during commissioning preparation. Qualifications & Experience: B.Sc., B.Tech or HND in Electrochemical or Instrumentation or any other technical field. Minimum of 15 years' experience. Skills: OPERCOM / ICAPS / Commissioning Execution and start-up. Knowledge of the FPSO as applied to his discipline; Oil & Gas treatment and utilities processes, specifically for FPSOs. Salary: Attractive and Negotiable.
          Process Specialist at Prime Atlantic Cegelec Nigeria (PACE)
Prime Atlantic Limited is a wholly owned Nigerian company dedicated to effecting development in the Nigerian oil and gas industry. It was established in 2005 and successfully partnered with Cegelec, France to establish a joint venture company, Prime Atlantic Cegelec Nigeria (PACE), in 2005. Reference Number: PAE006 Location: Lagos Contract Duration: 6 Months Job Responsibilities: Prepare system limit drawings with regard to the discipline. Supervise the vendor activities related to process commissioning tasks. Populate or supervise as required the ICAPS database with regard to his discipline. Participate as necessary, and within the framework of the Process discipline, in the Factory Acceptance Tests of the relevant major equipment packages. Manage and coordinate with other disciplines the Fire & Gas systems, ESD and deluge tests. Manage the handover process to field operations as per project procedure. Qualifications & Experience: B.Sc., B.Tech or HND in Chemical or Process Engineering. Minimum of 8 years' experience. Skills: OPERCOM / ICAPS, ESD systems, detection and deluge systems, lifting equipment, advanced control systems programming, Commissioning Engineering, Commissioning Execution. Knowledge of the FPSO as applied to his discipline. Salary: Attractive and Negotiable.
          Cruising Systems Analyst - BC Ministry of Forests, Lands, Natural Resource Operations and Rural Development - Victoria, BC
Tourism & Immigration. Preference may be given to candidates with experience with cruise compilation programs and Ministry databases such as ECAS, FTA, RESULTS,... $56,479 - $64,338 a year
From Canadian Forests - Wed, 29 Aug 2018 03:13:24 GMT - View all Victoria, BC jobs
          Upcoming Public Preview Availability of PostgreSQL Online Migration in Azure Database Migration Service
We are pleased to announce the upcoming public preview availability of online migration in Azure Database Migration Service (DMS). Azure DMS represents a single service that you can use for migrating data from different database engines to Azure with built-in resiliency and robustness. Using online migrations, businesses can migrate their databases to Azure while the...
          add function to database
Hello, I need to add a function to the database: if last_date is less than today, set the ishidden value to "2" and show the project under closed projects; also add media in edit profile. My budget is $2. Thanks. (Budget: $2 - $8 USD, Jobs: Database Programming, HTML, MySQL, PHP, SQL)
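The requested change boils down to a single conditional UPDATE: hide every project whose last_date has passed. A minimal sketch of that logic, using Python's sqlite3 purely so the demo is self-contained (the post asks for MySQL/PHP); the column names last_date and ishidden come from the post itself, while the table name `projects` is an assumption:

```python
import sqlite3
from datetime import date

# In-memory demo table. `projects` is an assumed table name; the columns
# last_date/ishidden are taken from the job post.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE projects (id INTEGER PRIMARY KEY, last_date TEXT, ishidden TEXT)")
db.executemany(
    "INSERT INTO projects (last_date, ishidden) VALUES (?, ?)",
    [("2000-01-01", "0"),   # expired  -> should be hidden
     ("2999-01-01", "0")],  # still open -> untouched
)

def hide_expired(conn, today: str) -> int:
    """Set ishidden = '2' for every project whose last_date is before today."""
    cur = conn.execute(
        "UPDATE projects SET ishidden = '2' WHERE last_date < ?", (today,)
    )
    conn.commit()
    return cur.rowcount  # number of rows flagged as hidden

changed = hide_expired(db, date.today().isoformat())
```

ISO-formatted date strings compare correctly as text, which is why the `<` works here; in MySQL the same statement would run against a DATE column directly (e.g. `WHERE last_date < CURDATE()`).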
          Owning Tesla shares about the riskiest it has ever been: options data
Owning electric carmaker Tesla Inc's shares is close to the riskiest it has ever been, data by options database and analytics firm OptionMetrics showed on Wednesday.

          Account Service Specialist
SC-Columbia, - Under general supervision and in accordance with established policies, procedures, state regulations, and production based quotas, researches and processes suspended new business enrollment applications and/or policy conversions using multiple core administration databases and systems - Ensures that applications meet established standards and state specific Department of Insurance regulations fo
          Startups Spend Millions: Why DDoS Is So Deadly, Part 1
*Original author: 罗永浩的迷弟. This article is part of the FreeBuf original article reward program; reproduction without permission is prohibited.

0x00 Years of Ops, Still No Match for an Attack

I have been doing ops since 2005, 13 years now, running operations for forums, e-commerce, gaming, finance and live-streaming businesses. The work is a mixed bag: web servers, databases, Netfilter, Docker, Xen, KVM, OpenVZ, Ceph, iSCSI, DNS, load balancing, and so on. In all those years, nothing has felt as thorny and hopeless as DDoS attacks. Every night I went to bed afraid of a phone call saying our servers were being DDoSed. As for why it is so terrifying: anyone who has done ops, or any company that has been attacked, knows exactly what that experience is like.

0x01 Waking Into a Nightmare, Living in Dread

My first encounter with a DDoS attack was in the winter of 2006, at my first job doing server ops for an online advertising network. The daily routine was basically checking the servers' disk I/O load, database load and error logs. Very ordinary. Then one winter day, the company's Dell PowerEdge 1850, colocated in the China Telecom data center in Shaoxing, Zhejiang, suddenly became unreachable: neither the web service nor SSH would respond. The boss panicked, customer support panicked; everyone was like ants on a hot pan. For an ad network, a dead server means webmasters suddenly lose their income and advertisers suddenly lose their traffic, which means both will walk away.

We contacted Shaoxing Telecom's network operations team and learned the server was under a high-volume DDoS attack of roughly 2 Gbps. To protect the other customers' servers in the rack, Telecom had blocked our server's public IP address. To keep the business running, we ended up paying Shaoxing Telecom for a DDoS protection service. Service was eventually restored, but network latency also increased, because the attack kept going. After that incident, everyone in the company went pale at the mere mention of DDoS.

0x02 Burning Cash Just to Stay Alive

In 2015, live video streaming took off explosively and everyone wanted to be an influencer. Half a year after I joined this company, it raised more than 20 million RMB. Not much by live-streaming industry standards, but for the company it was the first step toward success. As the company grew rapidly, the platform's revenue and daily active users climbed every day, and there were more and more pretty streamers. Just as morale peaked and people were even daydreaming about an IPO, the company took a heavy blow.

September 15, 2016 was the Mid-Autumn Festival, an evening that should have been spent at home eating mooncakes and admiring the moon with family. Instead, a long-premeditated DDoS attack made everyone spend an agonizing festival night at the office. At 6 p.m. I had just finished dinner and was about to take my family to the park when the company called: come in immediately, there is an emergency. When I arrived, my ops colleagues said the platform's login servers and streamer-tipping servers were under a large-scale DDoS attack. Because the attack was so large, the CDN provider had pointed the domains back to origin, the attack traffic flooded the origin servers, and the IDC had simply blocked the attacked IP addresses. We asked the IDC about the scale of the attack and learned that inbound DDoS traffic exceeded 200 Gbps. (The IDC said the data center's total uplink was 200 Gbps; the attack saturated the entire egress, so to avoid affecting other customers they had no choice but to block the attacked IPs.)

Since the attack exceeded the IDC's access bandwidth, the IDC had no ability to defend against it, so we turned to a cloud provider. The cloud provider's quote for a high-defense (anti-DDoS) IP was extremely steep: billed daily, 300 Gbps of protection cost 25,000 RMB per day, or 370,000 RMB per month, with additional charges if attacks exceeded 300 Gbps. But the business was paralyzed, so to restore it quickly the company signed up for the per-day DDoS protection service. After pre-depositing 100,000 RMB in defense fees, protection went live at 8 p.m. that night. Once enabled, the cloud provider's scrubbing service intercepted the attack traffic and the platform temporarily returned to normal; after a night of monitoring and coordination, the attack was finally fended off. After that we suffered even larger DDoS attacks, with the attackers launching several hours of attacks every day, which made paying per day very uneconomical. In the end the company bought the 370,000 RMB/month guaranteed protection plan. The lesson of this case: with enough money, you really can do whatever you want.

0x03 Much of the World's Sorrow Comes Simply From Having No Money

I once saw a post on the Maoyan forum that struck me deeply, titled "My wife had no money for treatment, and she died." When you are truly at the bottom, you do not even have the right to choose to keep living, let alone any chance to create miracles. Isn't it the same for startups? For many startups a DDoS attack is like a sudden illness: it arrives without warning and catches them completely unprepared. But how many startups can comfortably afford DDoS protection that runs to hundreds of thousands of RMB a month? How many founders dream of building something, creating miracles and changing an industry, yet when a DDoS attack hits, do not even get the chance to keep living? Isn't that tragic?
Because protection costs so much, and because people do not understand DDoS attacks, victims often panic and grasp at any cure. That is genuinely dangerous: like a seriously ill patient who finds the top-tier hospital too expensive and turns instead to shady private clinics that claim a guaranteed cure, only to get sicker through wrong treatment and lost time, and in the end lose both health and money.

0x04 DDoS Attacks in Detail

Now to the main topic: the DDoS attack types I have encountered and some mitigation techniques, plus some experience in avoiding impostors, scammers and junk "high-defense" providers, and how to tell genuine anti-DDoS services from inflated claims.

SYN flood attacks and defenses

A perennial classic among DDoS attack types: from its earliest days it has abused the TCP three-way handshake with spoofed source IPs, achieving a lot with very little and remaining hard to trace. When a flood of spoofed-source SYN packets reaches a server, the system accumulates large numbers of connections in the SYN_RECV state until the SYN backlog is exhausted, at which point the server can no longer handle new TCP requests and effectively goes down. Server resources are exhausted, users cannot establish connections, and the attacker has achieved his goal.

So how do you defend against a SYN flood? (In truth this is mitigation: raising the system's processing capacity, which only helps against small attacks.)

Method 1: software firewall and system parameter tuning (applicable when the SYN flood traffic is smaller than the server's access bandwidth and the server has performance to spare)

[Windows: modify the registry to improve SYN packet handling]

Open the registry key [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters]

1. Enable SYN attack protection mode (significantly improves Windows' SYN handling): SynAttackProtect=2 [dword]
2. Enlarge the TCP half-open connection queue: TcpMaxHalfOpen=10000 [dword]
3. Enable a dynamic backlog queue length: EnableDynamicBacklog=1 [dword]

These three registry changes can fend off some small and relatively simple SYN floods.

[Linux: tune sysctl kernel parameters to improve SYN packet handling]

1. Enable SYN cookies, so that when the SYN wait queue overflows, cookies take over: net.ipv4.tcp_syncookies = 1
2. Increase the SYN backlog queue length […]
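In practice the Linux tuning above is usually written to /etc/sysctl.conf (or a file under /etc/sysctl.d/) and loaded with `sysctl -p`. A minimal sketch that renders such a fragment from the settings discussed here; note that the backlog value of 65535 is my assumed example, since the article's own figure is cut off:

```python
# Render a sysctl.conf fragment for the SYN-flood mitigations described above.
# NOTE: the tcp_max_syn_backlog value (65535) is an assumed example; tune it to
# the host's memory and traffic profile.

SYN_FLOOD_PARAMS = {
    "net.ipv4.tcp_syncookies": 1,           # fall back to SYN cookies when the queue overflows
    "net.ipv4.tcp_max_syn_backlog": 65535,  # assumed example: enlarge the half-open queue
}

def render_sysctl(params: dict) -> str:
    """Return a sysctl.conf-style fragment, one 'key = value' per line."""
    return "\n".join(f"{key} = {value}" for key, value in params.items()) + "\n"

if __name__ == "__main__":
    # Write the output to /etc/sysctl.d/, then apply it with `sysctl -p <file>`.
    print(render_sysctl(SYN_FLOOD_PARAMS))
```

Generating the fragment rather than editing by hand keeps the tuning reviewable and repeatable across a fleet, which matters when you are hardening many servers at once.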
          Online Insurance Verification System in Place

Oklahoma’s new electronic system to verify vehicle insurance at time of registration is now online and fully operational.
 
“By validating insurance electronically we will know right away if a driver has a valid policy in place,” said Paula Ross, Oklahoma Tax Commission spokesperson.  This process will assist law enforcement in keeping uninsured motorists off Oklahoma roads.

The system, which has been in a testing mode since October 2008, is fully operational and in use by the Department of Public Safety, tag agents and the Oklahoma Tax Commission.

Registrants will still need to present an insurance verification card when renewing vehicle registrations.  Information presented on the card will be entered into an online database which is then matched up to verify the insurance policy is valid and current.

“We encourage taxpayers to review their vehicle insurance policy to make sure that the

Accountant - Canadian Mental Health Association, BC Division - Vancouver, BC
Maintain payroll database, perform payroll processing and year-end payroll preparations. Develop full accounting cycle for fee-for-service projects – includes...
From CharityVillage.com - Wed, 12 Sep 2018 22:39:54 GMT - View all Vancouver, BC jobs
Database Engineer
CA-Sunnyvale, Requirements: 5 years of Database Engineer experience Oracle Database (SQL queries, Architecture) noSQL - Cassandra, Couchbase, mongoDB (architecture/operations) RDBMS: Oracle Able to do database technology evaluation/POCs Experience on Automation and scripting (shell scripts/python) Automate database related monitoring/Operations and repetitive tasks. Build Tools Write APIs if required
Document Review Paralegal
VA-Richmond, Prime Legal is working with a prestigious firm in downtown Richmond to find a Document Review Paralegal to join their firm on a long-term, temporary basis. Qualified candidates will: be tech savvy and able to manage multiple databases have e-discovery experience be flexible, professional and hard-working! JD preferred Please apply ASAP to discuss this great opportunity! Francie Hiles Legal Recruit
Sr Web Developer - Aegion Corporation - Houston, TX
Creating database schemas that represent and support business processes. The candidate must have strong communications skills with the ability to create plans...
From Aegion Corporation - Wed, 05 Sep 2018 15:38:11 GMT - View all Houston, TX jobs
SQLPro Studio 1.0.181 – Powerful database manager.
SQLPro Studio is the premium database management tool for Postgres, MySQL, Microsoft Management Studio and Oracle databases. Features Intellisense/SQL auto-completion Syntax highlighting with customizable themes (including dark) Tab-based interface for an optimal user experience Context aware database tree navigation, including quick access to tables, views, columns, indices, and much more SQL beautifier/formatter Database-wide searching NTLMv2 […]
Inside Technical Sales Representative - W.C. Branham - River Falls, WI
Previous experience with ERP/Manufacturing system, CAD, MS office products, email and database systems such as HubSpot a plus....
From W.C. Branham - Sun, 10 Jun 2018 08:17:02 GMT - View all River Falls, WI jobs
RE: AX 2012 POS database size is increasing day by day

In case you are not taking backups, check the SQL database settings. If the recovery model is set to Simple, the transaction log is truncated automatically at checkpoints, so it should not keep growing. Under the Full (or Bulk-logged) recovery model, log records are retained until you take a transaction log backup, so to get rid of the accumulated log you need to perform a backup first: a full backup to start the chain, then regular log backups.

See also the next article: docs.microsoft.com/.../recovery-models-sql-server


RE: Mandatory property set as yes not working for a form control which is not linked to data source

Hi Palak,

In case it is an unbound control (as I understand from the title), you have to write your own validation logic. The mandatory check is only performed for database-bound fields, when the record is saved.


Clerical Assistant - University of Saskatchewan - Saskatoon, SK
Database software, proficiency with the Student Information System (Banner), relationship management system (RMS – RECRUIT), SiRIUS, Cisco Phone, PAWS... $21.36 - $28.84 an hour
From University of Saskatchewan - Wed, 12 Sep 2018 18:18:53 GMT - View all Saskatoon, SK jobs
Local User Accounts Reports Alerts
Shows local user accounts with all account details, such as Windows local account name, whether the local account is disabled or enabled, account password expiration status and more. This free tool also allows you to export local user accounts, find all local Administrator accounts and list all local user accounts stored in the SAM database. It's a way of centrally listing all users, instead of using the "net user" command or running PowerShell to get local users (a WMI query for Win32_UserAccount) on multiple domain computers at a time. The report lists all Windows user accounts and allows you to create alerts when new Windows users are created, local Administrator accounts are enabled and more.
Database Administrator - iQmetrix - Winnipeg, MB
Flexibility and the ability to adapt to an evolving environment will go a long way at iQmetrix. IQmetrix has rated among the Top 50 Best Small &amp; Medium...
From iQmetrix - Wed, 25 Jul 2018 22:31:15 GMT - View all Winnipeg, MB jobs
Database Administrator - iQmetrix - Regina, SK
Last year’s iQmetrix Odyssey trip. Top 10 Reasons to Join iQmetrix (view the full list). Our YouTube channel including numerous videos on iQmetrix and what we...
From iQmetrix - Wed, 25 Jul 2018 16:31:10 GMT - View all Regina, SK jobs
Database Developer Sr (MS SQL Server/ Oracle) - PS4336
VA-Norfolk, Description Your Talent. Our Vision. At Anthem, Inc., it's a powerful combination, and the foundation upon which we're creating greater access to care for our members, greater value for our customers, and greater health for our communities. Join us and together we will drive the future of health care. This is an exceptional opportunity to do innovative work that means more to you and those we serv
Patient Accounting Representative - (Wailuku, Hawaii, United States)
Under indirect supervision, processes insurance claims, reports and billing for compensation of patients and members for medical disability benefits; processes applications from medical/life insurance, supplemental benefits and assigned accounts; obtains background information; makes arrangements to obtain monies owing; performs other collection responsibilities as needed; abides by state collection and credit regulations; interprets and complies with state/federal regulations, laws and guidelines in reference to third party payers; processes VRs for billing; maintains current knowledge of Kaiser Health Plan benefits and policies; acts as Kaiser representative.


Essential Responsibilities:
  • Receives, reviews, and controls requests for medical information, visit records, nurse/doctor notes and other pertinent documents; verifies completeness and accuracy; ensures efficiency in processing of claims; obtains medical charts and other data pertaining to request.
  • Audits, abstracts, and summarizes pertinent data from patient medical records, nurse/doctor notes and other documents; processes insurance claims and reports in compliance with state/federal regulations, laws, guidelines and Kaiser Health Plan policies; obtains physician signature and/or signs as provider representative; prepares service charge letters and invoices referring to fee schedule.
  • Performs follow-up with insurance companies, agencies, and/or patients; researches and takes action as required.
  • Prepares and audits visit records and nurse/doctor notes using various fee schedules; prepares documents (e.g. charges, payments, adjustments) with Charge Description Master codes, required billing coding conventions, and batch totals.
  • Communicates and corresponds effectively with insurance carriers, intermediaries, members, doctors, outside providers and patients; provides in-service orientation to other departments/personnel; obtains complete and valid information; ensures collectability and maximum reimbursement of revenues.
  • Maintains familiarity and open communication with state, federal and community agencies, insurance carriers, intermediaries and others within Kaiser organization; ensures proper and adequate exchange/interpretation of data and information; ensures maximization of payments.
  • Collects monies owing for all inpatient/outpatient services rendered from third party payers, employers, patients, guarantors.
  • Contacts debtor by telephone and/or correspondence; arranges for collection of monies owing; abides by state and federal collection and credit rules and regulations; schedules interviews with debtors; maintains positive customer relationships through effective communication and follow-up.
  • Analyzes history of delinquent accounts; determines whether account is collectible; prepares write-off accounts; attaches pertinent information to assist outside attorney; submits to supervisor for review; summarizes monthly write-off report (e.g. breakdown of amounts, service locations, date of service, H.P./N.P. and over-all reason for assignment to collection agency).
  • Generates appropriate adjustments; researches documents and all other available sources to determine validity of adjustment; submits back-up to data entry for key punching function.
  • Documents and records on HPMS computer system all collection action taken on individual accounts (e.g. data collection contact made, data of promised payment, credit arrangements, insurance filing date and all other pertinent information); maintains tickler system to monitor promised payment.
  • Skip traces all mail returns; researches internal and external documents and all other sources to determine whereabouts of debtor; generates file maintenance to update patient demographics for rebilling when different address obtained.
    Basic Qualifications:
    Experience
  • Minimum one (1) year collections or medical insurance claims processing experience.
    Education
  • High school diploma.
    License, Certification, Registration
  • N/A.


    Additional Requirements:
  • Demonstrated ability to perform diversified clerical functions and basic accounting procedures.
  • Demonstrated ability to motivate debtors to pay.
  • Demonstrated knowledge of and skill in adaptability, change management, conflict resolution, customer service, influence, interpersonal relations, oral communication, problem solving, teamwork, and written communication.


    Preferred Qualifications:
  • Two (2) years collections experience in healthcare field.
  • Knowledge of medical terminology, CPT-4 and ICD-9-CM coding.
  • Knowledge of mainframe collections applications.
  • 10-key by touch.
  • Demonstrated knowledge of and skill in word processing, spreadsheet, and database PC applications.
  • Post high school coursework in accounting.

  • Experience billing for government payers (Medicare, Medicaid, etc.) highly desirable.


Dec 13, 2018: The WIRE Project at Cornell: A Crucible for Student Research in the Emerging Field of Computational Art History at Boyce Thompson Institute

    Sparked by prior research into matching manufactured patterns in historic papers, and motivated by a desire to bring together engineering and humanities students to generate novel research useful in the prints and drawings field, the WIRE project has since 2015 worked toward the objective of developing an online identification tool for the watermarks on Rembrandt’s etching papers, based on the concept of the decision tree. WIRE has proceeded with important oversight from colleagues at the Rijksmuseum, Amsterdam, the Metropolitan Museum of Art, and New York University’s Institute of Fine Arts, and engaged curators and paper conservators at over a dozen northeast institutions, assessing their Rembrandt watermarks with the aim of uniting them in an expanding database. Johnson and Weislogel will discuss the WIRE project’s motivation, background, and technical concept, describe its development through a succession of research seminars and its role in a recent traveling Rembrandt print exhibition, and share some of the project’s results and offshoots. [NOTE TIME CHANGE: come anytime after 2:30 for refreshments]

    View on site | Email this event


Financial Consultant - (Pleasanton, California, United States)
Seeking a team oriented Financial Consultant who will be responsible for providing financial support to Kaiser Permanente's Information Technology Group (KP-IT). Must be a highly organized professional with proven accounting and analytical skills. Primary focus is the proper and timely recording of financial transactions in the general ledger in accordance with GAAP. Responsible for account analysis, account reconciliations, resolving reconciling items utilizing problem solving skills. Ability to work independently and with other team members or departments to resolve accounting problems/issues. Responsible for meeting monthly due dates to ensure accurate and timely reporting of KP-IT financial data.

    Essential Functions:
  • Responsible for the proper and timely recording of financial transactions in the general ledger in accordance with GAAP and Kaiser Policy.
  • Perform monthly analysis and reconciliation of balance sheet accounts, including inter-regional reconciliations.
  • Perform monthly and ad hoc reporting to Program Office (PO), KP-IT management and KP-IT finance community in an accurate and timely manner.
  • Perform detailed variance analysis on multiple tasks/projects.
  • Create and generate financial reports.
  • Perform responsibilities in a team environment.
  • Document and maintain financial desktop procedures/systems.
  • Research and prepare accounting solutions for a variety of problems of moderate scope and complexity.
  • Act as information resource (SME) to the finance community.
  • Assist in audits.
  • Follow internal controls and SOX requirements.
  • Lead Process Improvement Initiatives.
  • Identifies key business issues and designs analytical approaches/solutions.
  • Provides on-going coaching, enabling team members to develop and improve skills and capabilities that support the effectiveness of the department/function.
  • Serves as a technical/professional mentor to team members.
  • May have formal supervisory operational responsibilities.
  • All other duties as assigned by Manager.

    Basic Qualifications:
    Experience
  • Minimum eight (8) years of financial analysis or related experience.
    Education
  • Bachelor's degree, OR four (4) years of experience in a directly related field.
  • High School Diploma or General Education Development (GED) required.
    License, Certification, Registration
  • N/A

    Additional Requirements:
  • Regularly contributes to the development of new financial analysis concepts, techniques, and standards.
  • Considered expert in field within KP.
  • Frequently contributes to the development of new financial analysis theories and methods.
  • Expert proficiency in PC based word processing and spreadsheet applications, including advanced functions such as graphics, pivot tables, macros and database management.
  • Thorough knowledge of financial analysis policies, practices and systems.
  • Extensive knowledge of several or all of the following: general finance theories and methodologies, discounted cash flow analysis, cost/benefit analysis, feasibility studies, large scale business planning, financial modeling and project management.

    Preferred Qualifications:
  • Bachelor's degree in Accounting or Finance
  • A minimum of 8 years of progressive accounting experience.
  • Experience in general ledger, balance sheet reconciliations, and journal entries.
  • Strong analytical, communication and problem-solving skills.
  • Working knowledge of GAAP and internal controls.
  • Effective interpersonal skills to work with various levels of staff to deliver a high level of customer support.
  • Ability to prioritize work, meet deadlines and perform well under demanding timelines and pressure.
  • Excellent verbal and written communication skills.
  • CPA
  • Proven work-in-process (WIP) accounting experience.
  • Proven project life cycle management experience.
  • Strong organizational, communication and problem solving skills.
  • Ability to evaluate and recommend solutions on projects/problems.
  • Demonstrated ability to create and maintain reports.
  • Ability to work independently and proactively with minimal supervision.
  • Strong service orientation and team focus.
  • Experience in a large organization.
  • Experience with Business Objects, PeopleSoft and Microsoft Access.
  • Knowledge of Kaiser's accounting processes, systems and procedures.


Legal Administrative Assistant III - (Pasadena, California, United States)
    Provides administrative and operational clerical support to Legal & Government Relations department attorneys, managers and/or staff. Responsibilities may include answering phones and relaying messages/information to both division staff and callers; scheduling/calendaring meetings and conferences; maintaining filing systems; ordering/stocking office supplies; opening/sorting mail. Types/proofreads/composes correspondence; creates graphs and presentations; researches issues as needed.

    Essential Functions:
  • Provides complex administrative and/or project support to the division attorneys and managers. Processes sensitive and confidential information with the utmost judgment and tact, recognizing any legal privilege issues, time constraints and political implications.
  • Answers phones, takes messages, screens calls, and greets KP visitors or vendors. Uses appropriate telephone etiquette, routes calls using independent judgment; may research issues if needed and ensures follow-up. Interfaces daily with KP employees across multiple organizations and external parties as a liaison for the department.
  • Manages multiple calendars and schedules/plans meetings, and anticipates managers' upcoming work. May research, plan and arrange events including hotel and conference facilities. Makes travel arrangements.
  • Drafts routine correspondence for attorneys and other staff from brief notes or verbal requests. Organizes and assembles complex legal and other documents required for briefs, responses to state and federal agencies and subpoenas. Checks mailings to ensure that all relevant parties are included.
  • Creates documents, reports, and presentation materials with charts & illustrations, and proofs content for accuracy.
  • Gathers and inputs data, maintains established databases and document management system. Maintains physical and electronic files in accordance with accepted legal practices.
  • Researches and collects information needed to complete project tasks or reports.
  • Tracks outside counsel expenditures and process for payment; verifies recharges across business units. Manages expense reports through KPERS.
  • Ensures coverage for absent staff members. Performs other department specific duties as assigned. Participates in cross-training efforts with other support staff members.

    Basic Qualifications:
    Experience
  • Minimum five (5) years of administrative assistant experience supporting at the manager/director level, or related experience.
    Education
  • High School Diploma or General Education Development (GED) required.
    License, Certification, Registration
  • N/A

    Additional Requirements:
  • Has substantial understanding of the job, and applies knowledge and skills to complete a wide range of tasks.
  • Ability to learn and apply a thorough understanding of the organization and its functional policies and processes.
  • Strong writing skills to create difficult and more detailed correspondence.
  • Basic to intermediate knowledge of two or more of Microsoft Office Suite applications: Word, Excel, PowerPoint and/or Access, depending upon department or business needs.
  • Working knowledge of email and office equipment (fax, phone, copier, etc.).
  • Ability to coordinate multiple and difficult calendars and arrange meetings.
  • Must be able to work in a Labor/Management Partnership environment.

    Preferred Qualifications:
  • Legal experience preferred.
  • Bachelor's degree preferred.

Senior Database Developer - OSI Systems, Inc. - Madhapur, Hyderabad, Telangana
    Overview Rapiscan Systems, a wholly-owned subsidiary of OSI Systems, Inc. designs, manufactures and markets security and inspection systems worldwide. Our...
    From OSI Systems, Inc. - Mon, 06 Aug 2018 05:41:37 GMT - View all Madhapur, Hyderabad, Telangana jobs
AWS Architect - Insight Enterprises, Inc. - Chicago, IL
    Database architecture, Big Data, Machine Learning, Business Intelligence, Advanced Analytics, Data Mining, ETL. Internal teammate application guidelines:....
    From Insight - Thu, 12 Jul 2018 01:56:10 GMT - View all Chicago, IL jobs
Incomedia WebSite X5 Professional 13.1.8.23 Multilingual 180913

    Incomedia WebSite X5 Professional 13.1.8.23 Multilingual 180913
    [center]
    http://www.hostpic.org/images/1711270006110105.jpg

    Incomedia WebSite X5 Professional 13.1.8.23 Multilingual | 175.6 Mb

    WebSite X5 is the most versatile and complete software that lets you create attractive, professional and functional websites, blogs and online stores. You don't need any programming skills to create a website, all you need is a mouse! The software is easy to use, flexible and open to your customization. You work with a fully-visual intuitive interface, with plenty of previews of your work that are constantly updated in real time.
    [/center]

    [center]
    Incomedia WebSite X5 guarantees simplicity of use, flexibility and maximum customization so that you can create exactly the website you want.

    Browse through more than 400,000 exclusive and royalty-free photos, buttons and graphic libraries, a gallery of ready-to-use widgets, and much more.

    WebSite X5 provides a gallery of 1,500 templates. With such a vast choice available, you're sure to find the right solution for your website.

    Top Features:
    Sites with Mobile App to share news you publish
    Online store with credit card management, product availability, promotions and coupons
    Dynamic content that can be updated directly online
    Integration with database and data management using an online Control Panel
    Advanced Project Analysis and SEO optimization functions

    Working with Incomedia WebSite X5 Evolution 13 is easy. Just follow the tutorial to create and publish your very own website online. The tutorial shows the basic steps. Setting up the project, laying out the website map, creating pages, defining advanced features. And, finally, publishing your website online.

    Incomedia WebSite X5 Professional 13 is unique software for Web experts, an incredible combination of power and simplicity.

    The secret of WebSite X5's success is that you don't have to spend time learning to use complicated software. All you have to do is follow the 5 easy steps to create top quality websites. Each step has been designed to help you obtain professional results with the minimum effort.

    There's a specific tool for every job. From editing images and photos, to creating buttons, to automatically generating menus, right up to going online with the built-in FTP engine. You don't need any other software - this has it all. Save time and effort, because this software includes everything you need to create eye-catching and fully-comprehensive websites.

    Tech Specs
    Perfect for Windows 7 SP1, 8, 10 | 2 GB RAM | Min. Screen Resolution: 1024 x 600
    Compatible Windows, Linux, Unix PHP 5.x, MySQL (only for certain advanced features) servers
    Internet connection and e-mail account required for activation

    Home Page -
    http://www.websitex5.com/en/professional.html

    Buy a premium  to download file with fast speed
    thanks
    Rapidgator.net
    https://rapidgator.net/file/7f5753188e6 … l.rar.html
    alfafile.net
    http://alfafile.net/file/yoMH/mzog9.Inc … ingual.rar
    [/center]


Incomedia WebSite X5 Professional 14.0.4.1 Multilingual 180913

    Incomedia WebSite X5 Professional 14.0.4.1 Multilingual 180913
    [center]
    http://www.hostpic.org/images/1711270005380117.jpg

    Incomedia WebSite X5 Professional 14.0.4.1 Multilingual | 164.2 Mb

    WebSite X5 is the most versatile and complete software that lets you create attractive, professional and functional websites, blogs and online stores. You don't need any programming skills to create a website, all you need is a mouse! The software is easy to use, flexible and open to your customization. You work with a fully-visual intuitive interface, with plenty of previews of your work that are constantly updated in real time. Incomedia WebSite X5 guarantees simplicity of use, flexibility and maximum customization so that you can create exactly the website you want.
    [/center]

    [center]
    Browse through more than 400,000 exclusive and royalty-free photos, buttons and graphic libraries, a gallery of ready-to-use widgets, and much more.

    WebSite X5 provides a gallery of 1,500 templates. With such a vast choice available, you're sure to find the right solution for your website.

    Top Features:
    Sites with Mobile App to share news you publish
    Online store with credit card management, product availability, promotions and coupons
    Dynamic content that can be updated directly online
    Integration with database and data management using an online Control Panel
    Advanced Project Analysis and SEO optimization functions

    Main Features WebSite X5 Professional 14:
    Includes All Features of WebSite X5 Evolution 14 and much more:
    Enhanced E-Commerce tools: integration with payment processing gateways, add Coupon& Discount codes, manage inventory and orders online, store optimized for Search Engines
Apps included to monitor and manage all your sites from your iOS or Android devices in real time: receive stats, process store orders, check inventory and comments on your blog.
    Dynamic Content to edit your site directly online.
    The secret of WebSite X5's success is that you don't have to spend time learning to use complicated software. All you have to do is follow the 5 easy steps to create top quality websites. Each step has been designed to help you obtain professional results with the minimum effort.

    There's a specific tool for every job. From editing images and photos, to creating buttons, to automatically generating menus, right up to going online with the built-in FTP engine. You don't need any other software - this has it all. Save time and effort, because this software includes everything you need to create eye-catching and fully-comprehensive websites.

    Tech Specs
    Perfect for Windows 7 SP1, 8, 10 | 2 GB RAM | Min. Screen Resolution: 1024 x 600
    Compatible Windows, Linux, Unix PHP 5.x, MySQL (only for certain advanced features) servers
    Internet connection and e-mail account required for activation

    Home Page -
    http://www.websitex5.com/en/professional.html

    Buy a premium  to download file with fast speed
    thanks
    Rapidgator.net
    https://rapidgator.net/file/f65d5103123 … l.rar.html
    alfafile.net
    http://alfafile.net/file/yoMz/hkmqm.Inc … ingual.rar
    [/center]


Navicat Premium 12.1.7 Full Version

Navicat Premium is an advanced multi-connection database administration tool that allows you to simultaneously connect to all kinds of databases easily. Navicat enables you to connect to MySQL, MariaDB, Oracle, PostgreSQL, SQLite, and SQL Server databases from a single application, making database administration easy. You can easily and quickly build, manage and maintain your […]

    Go to the original content here!
    Navicat Premium 12.1.7 Full Version


Column Stats
    A little while ago I added a postscript about gathering stats on a virtual column to a note I’d written five years ago and then updated with a reference to a problem on the Oracle database forum that complained that stats collection had taken much longer after the addition of a function-based index. The problem […]
18c startup: Image consistency checking encountered an error, checking disabled
Brothers, I'm so worried I can't eat!
Here is what shows up in the log when starting an 18c database. Any ideas on how to tell whether this is fatal for the instance or not, and what is actually going on here?

    **********************************************************************
    2018-09-11T10:44:25.990306+05:00
    Errors in file /orahome/diag/rdbms/db/OBLAKOSID/trace/OBLAKOSID_ora_1219.trc:
    ORA-27167: Attempt to determine if Oracle binary image is stored on remote server failed
    ORA-27300: OS system dependent operation:parse_df failed with status: 2
    ORA-27301: OS failure message: No such file or directory
    ORA-27302: failure occurred at: parse failed
    ORA-27303: additional information: Файловая система 1K-блоков Использовано Доступно Использовано% Cмонтировано в
    rpoo
    2018-09-11T10:44:25.990410+05:00
    Image consistency checking encountered an error, checking disabled
    LICENSE_MAX_SESSION = 0
    LICENSE_SESSIONS_WARNING = 0
    Initial number of CPU is 8
    Number of processor cores in the system is 4
    Number of processor sockets in the system is 1
    Capability Type : Network
    capabilities requested : 1 detected : 0 Simulated : 0
    Capability Type : Runtime Environment
    capabilities requested : 400000FF detected : 40000000 Simulated : 0
    Capability Type : Engineered Systems
    capabilities requested : 3 detected : 0 Simulated : 0
    Using LOG_ARCHIVE_DEST_1 parameter default value as USE_DB_RECOVERY_FILE_DEST
    Autotune of undo retention is turned on.
    IMODE=BR
    ILAT =55
    LICENSE_MAX_USERS = 0
    SYS auditing is enabled
    NOTE: remote asm mode is local (mode 0x1; from cluster type)
    NOTE: Using default ASM root directory ASM
    NOTE: Cluster configuration type = NONE [2]
    Oracle Database 18c Enterprise Edition Release 18.0.0.0.0 - Production
    Version 18.3.0.0.0.
    ORACLE_HOME: /orahome/product/18.0.0/db_1
    System name: Linux
    Node name: db.localdomain
    Release: 4.15.18-2-pve
    Version: #1 SMP PVE 4.15.18-20 (Thu, 16 Aug 2018 11:06:35 +0200)
    Machine: x86_64
    Using parameter settings in server-side spfile /orahome/product/18.0.0/db_1/dbs/spfileOBLAKOSID.ora
    System parameters with non-default values:
    processes = 320
    nls_language = "RUSSIAN"
    nls_territory = "RUSSIA"
    sga_target = 0
    memory_target = 8G
    memory_max_target = 8G
    control_files = "/orahome/oradata/DB/control01.ctl"
    control_files = "/orahome/recovery_area/DB/control02.ctl"
    db_block_size = 8192
    compatible = "18.0.0"
    db_recovery_file_dest = "/orahome/recovery_area"
    db_recovery_file_dest_size= 50G
    undo_tablespace = "UNDOTBS1"
    remote_login_passwordfile= "EXCLUSIVE"
    db_domain = "localdomain"
    dispatchers = "(PROTOCOL=TCP) (SERVICE=OBLAKOSIDXDB)"
    local_listener = "LISTENER_OBLAKOSID"
    audit_file_dest = "/orahome/admin/db/adump"
    audit_trail = "DB"
    db_name = "db"
    open_cursors = 300
    diagnostic_dest = "/orahome"
    enable_pluggable_database= TRUE
    NOTE: remote asm mode is local (mode 0x1; from cluster type)
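For what it's worth, the log above reads as non-fatal: the instance reports "Image consistency checking encountered an error, checking disabled" and then carries on with a normal startup, parameter listing and all. A hedged guess at the trigger, based on the ORA-27300 parse_df failure and the Russian df header in ORA-27303: Oracle shells out to `df` and its parser may have stumbled on localized output. A quick diagnostic sketch (an assumption, not an official diagnosis):

```shell
# ORA-27300 says parse_df failed; the ORA-27303 payload is a Russian-localized
# `df` header, so compare what df prints in the current locale with the
# untranslated C-locale header a parser would normally expect.
df -k / | head -n 1            # current locale; may be localized
LC_ALL=C df -k / | head -n 1   # untranslated header: "Filesystem 1K-blocks ..."
```

If the locale is indeed the trigger, starting the instance from an environment with LANG/LC_ALL set to an English or C locale is a common workaround to try; confirm with Oracle Support before treating it as a fix.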
How much does it cost to keep a dog?

    We’re a nation of dog lovers, but our furry friends come with a pretty hefty price tag, with an average cost of around £21,000 over their lifetime.

    But some dogs, particularly large, pedigree breeds could set you back an eye-watering £33,000.

    Average cost of buying a dog

    There are far too many options to give a definitive answer, but if we break down the cost of buying a dog, you start to get an idea about how quickly prices can go up.

    A dog from a registered breeder or rescue centre generally costs a tail-wagging £50 to £150. But popular small breeds, like a pug, or a hybrid such as a labradoodle, can cost about £1,000.

    Large pedigree dogs and some rarer breeds can cost you several thousand. The most expensive dog in the world is rumoured to be a Tibetan Mastiff, which cost the owner in China about £1 million.

    Average cost of dog food

    You’re probably looking at around £200 to £400 a year to feed your dog, which means an average cost of dog food per month of around £25, but there are a lot of factors to take into consideration.

    Larger dogs might cost a lot more to feed, while smaller dogs will cost a bit less.

    There is also a massive range in the cost of pet food. While there are a number of cheaper options on the market, these are often bulked out with cheaper fillers like oats.

    The more expensive options should contain more meat, which is better for your dog, but you should check the ingredients to make sure you get what you’re paying for.

    Average cost of dog toys and bedding

    Assuming you’re not buying your dog designer jackets and its own four poster bed, there are a lot of ways to save in this area.

    But even so, there are a lot of things your dog will need and there’s a good chance you will need to replace them on a regular basis.

    If we’re taking a wide range, you’re probably looking at between £200 and £400 for toys, bedding, leads and the various other things your dog will need.

    Average dog insurance cost

    Pet insurance is a complicated subject and you need to think carefully and take a lot of things into consideration.

    Research by Which? found the average cost of a lifetime policy for a dog is £472 a year.

    You could save some money by just getting a one-year policy. But there’s a catch. While this might be cheaper in the early years, it will start to get much, much more expensive when your dog reaches six or seven.

    By the time your dog gets to eight or nine, or even younger for some breeds, you will not be able to get any dog insurance at all. And this is just the time they’re likely to start developing health problems.

    But if you want to look at a cost that’ll really get the fur flying, check out how much the vet bills can be.

    Average vet bills and medical costs for your dog

    Vet bills can be really expensive. Surgery for broken limbs costs on average £1,500, while more significant treatments, like chemotherapy, can be as much as £5,000.

    But often it’s the ongoing costs of long-term illnesses, like diabetes, which can really set you back.

    Hopefully your furry friend keeps itself out of the worst kinds of scrapes, but even common problems can set you back hundreds.

    Hip dysplasia, a regular issue with popular larger dogs like German Shepherds, Golden Retrievers and Labradors, can cost £500.

    Average puppy and dog vaccinations cost

    The first round of jabs for your puppy will set you back about £100, but you will also need annual booster vaccinations, which cost about £50 a go.

    If you want to avoid this initial cost, then many rescue shelters will have your puppy vaccinated before you’re allowed to take them home.

    Average dog neutering cost or cost to spay a dog

    On average, spaying or neutering your dog will cost between £60 and £180. However, larger dogs might cost quite a bit more.

    Average dog teeth cleaning costs

    Regularly cleaning your dog's teeth is vital for their overall health and can save you lots of money in the long run.

    A tube of dog toothpaste (don’t use human toothpaste; it contains fluoride, which is poisonous to dogs) only costs a few quid.

    A full clean and descale at the vet's is probably going to set you back between £100 and £200 depending on the size of the dog, as they need to be put under anaesthetic.

    But the prices really start to go up if your dog needs to start having serious dental work or teeth removed. You can easily be looking at £400 to £500 or even more.

    Average dog microchipping cost

    This is a cost you can’t avoid. Microchipping your dog is a legal requirement and if you don’t, you can be hit with a fine of up to £500. Luckily, it’s not very expensive, with an average cost of between £15 and £20 to add your furry friend to the dog microchip database.

    Average cost of flea and worming treatments for your dog

    This is going to set you back about £10 a month. Worming treatments are more expensive at £10 to £15, but only need to be done every three months, while flea treatments cost about £5 a month.

    How much does a dog walker cost?

    If you’re working, you might want someone to come in and walk your dog for you during the day.

    There are a large number of companies and individuals offering this service and dog walking prices can vary depending on how long they take your dog for and how much exercise they need.

    A rough average price would be £10 for an hour’s walk, but obviously you’ll be looking at more if you’re after a day-care service.

    Average dog passport cost

    A pet passport for your dog is going to cost between £150 and £250, by the time you factor in the extra jabs required. But there are other things to consider.

    First, you will need to have them treated for tapeworm before taking them back into the UK and this must be done no later than 24 hours before you travel.

    Second, if you’re abroad, your pet insurance might not cover you.

    Average cost of dog boarding kennels

    Does your dog really want to go on holiday with you? Well if you decide to leave them at home, you need to be prepared to pay some hefty kennel fees.

    On average, you’re probably looking at around £17 a day, plus extras for a decent boarding kennel. A live-in pet sitter is probably going to be a bit more expensive at around £25 a day, but your dog will be safe and sound in their own home.

    How much does it cost to put a dog down?

    When it’s time to say goodbye to your dog, you will usually be charged between £30 and £50 to have them put to sleep at the vets.

    Some vets might do home visits, and this will set you back between £70 and £100.

    Dog cremation costs

    Cremation costs vary depending on whether or not you want the ashes back.

    An individual cremation, which means you do get the ashes back, will cost between £150 and £300.

    If you don’t want the ashes back, then you should only be paying about £50.

    You can get a combined euthanasia and cremation service with some vets, but this will not save you much money.


          Forum Post: RE: AX 2012 POS database size is increasing day by day
    If you are not taking backups, check the SQL database settings. If the recovery model is set to Simple, SQL Server truncates the log automatically and does not keep a log chain. Under the Full recovery model, you need to perform a log backup before the log space can be reclaimed. See also this article: docs.microsoft.com/.../recovery-models-sql-server
          Forum Post: RE: Mandatory property set as yes not working for a form control which is not linked to data source
    Hi Palak, in case it is an unbound control (as I understand from the title), you have to write your own validation logic. The mandatory check is only applied to database fields, upon saving a record.
          Database Administrator - L3 - Promaxis Systems Inc. - Ottawa, ON
    Promaxis is located in Ottawa, Ontario, Canada. To be considered for similar jobs, fill out a general application on the Promaxis careers page....
    From Promaxis Systems Inc. - Wed, 05 Sep 2018 06:28:44 GMT - View all Ottawa, ON jobs
          Radenso XP Radar & Laser Detector with GPS Lockout and Red Light/Speed Camera Voice Alerts
    • Top-flight sensitivity delivers radar alerts up to several miles away, while best-in-class blind spot monitor/traffic monitor filtering prevents false alerts. The Radenso XP is the quietest radar detector available.
    • GPS Lockout capability lets the Radenso XP remember common false alerts along your regularly driven routes so you never have to listen to the same false alert twice. Simply press and hold a button to add a GPS lockout.
    • Additional features include automatic muting below a user-selectable speed, automatic sensitivity adjustment based on current speed and a built in red light and speed camera database with free updates.
    • Telephone and e-mail support from the USA - 1 Year Manufacturer Warranty. Full money back guarantee for 30 days, no questions asked!
    • Backed by the noLimits Enterprises Radar Ticket Free Guarantee!

          The Biggest Sin of Commercial Open Source?

    Redis is a popular open source database. Its proprietor, Redis Labs, recently announced that some add-on modules will no longer be open source. The resulting outcry led to a defense and explanation of this decision that is telling. I have two comments and a lesson about product management of commercial open source.

    The two comments are about messaging, both ways: What Redis Labs is telling the world and what the open source world is telling Redis Labs and the rest of the world.

    Firstly, by restricting usage of these add-on modules, Redis Labs is admitting that someone else is successfully competing with them at their own game. The bogeyman is Amazon, in this case, as called out in the defense of the Redis Labs decision. Usually, the original proprietor of some open source software is in a prime position to profit from it, being the most trustworthy choice for customers and being able to charge a premium. Amazon has to white-label the software (remove any of Redis Labs' trademarks) and is still a serious threat. Admitting this is as embarrassing as it gets.

    Secondly, rather than taking any future development of the add-on modules private, Redis Labs decided to fiddle with the license by adding a rider that makes it non-open-source but still permits many open-source-like behaviors. The intent may have been to stay as close to open source as possible, but this behavior has nevertheless annoyed many open source enthusiasts, leading to the aforementioned outcry. Thus, the open source world is telling Redis Labs that of the many sins you can commit as a commercial open source company, the worst is to go back on your legal (read: license) open source promise.

    You may like reading this older article about how the single-vendor commercial open source business model works. Also, there are other ways of curtailing your competitors, if you must: Read more about the kitchen cabinet of open source poisons.

    Read on about the main product management challenge that all commercial open source companies are facing.


          Running Apache Cassandra on Kubernetes

    As Kubernetes becomes the de facto solution for container orchestration, more and more developers (and enterprises) want to run Apache Cassandra databases on Kubernetes. It's easy to get started, especially given the capabilities that Kubernetes' StatefulSets bring to the table. Kubernetes, though, certainly has room to improve when it comes to storing data in-state and understanding how different databases work.

    For example, Kubernetes doesn't know if you're writing to a leader or a follower database, or to a multi-sharded leader infrastructure, or to a single database instance. StatefulSets (workload API objects used to manage stateful applications) offer the building blocks required for stable, unique network identifiers; stable persistent storage; ordered, graceful deployment, scaling, deletion, and termination; and automated rolling updates. However, while getting started with Cassandra on Kubernetes might be easy, it can still be a challenge to run and manage.

    To overcome some of these hurdles, we decided to build an open source Cassandra operator that runs and operates Cassandra within Kubernetes; you can think of it as Cassandra-as-a-Service on top of Kubernetes. We've made this Cassandra operator open source and freely available on GitHub. It remains a work in progress by our Instaclustr team and our partner contributors, but it is functional and ready for use. The Cassandra operator supports Docker images, which are open source and also available from the project's GitHub repository.


    While it's possible for developers to build scripts for managing and running Cassandra on Kubernetes, the Cassandra operator offers the advantage of providing the same consistent, reproducible environment, as well as the same consistent, reproducible set of operations through different production clusters. (This is true across development, staging, and QA environments.) Also, because best practices are already built into the operator, development teams are spared operational concerns and can focus on their core capabilities.

    What is a Kubernetes operator?

    A Kubernetes operator consists of two components: a controller and a custom resource definition (CRD). The CRD allows devs to create Cassandra objects in Kubernetes. It's an extension of the Kubernetes API that lets us define custom objects or resources, which our controller can then watch for any changes to the resource definition. Devs can define an object in Kubernetes that contains configuration options for Cassandra, such as cluster name, node count, JVM tuning options, etc.: all the information you want to give Kubernetes about how to deploy Cassandra.
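    As a sketch of what such a resource might carry, here is a hypothetical Cassandra custom resource written as a Python dict that mirrors the YAML manifest a dev would apply. The apiVersion, kind and field names are assumptions based on the description above, not the operator's actual schema:

```python
# Hypothetical Cassandra custom resource, expressed as a Python dict mirroring
# the YAML manifest. The group/version, kind and spec fields below are assumed
# for illustration; check the operator's GitHub repository for the real schema.
cassandra_cluster = {
    "apiVersion": "cassandraoperator.example.com/v1alpha1",  # assumed group/version
    "kind": "CassandraDataCenter",                           # assumed kind name
    "metadata": {"name": "example-dc", "namespace": "cassandra"},
    "spec": {
        "cluster": "example-cluster",        # cluster name
        "nodes": 3,                          # node count the controller reconciles toward
        "jvmOptions": ["-Xms1g", "-Xmx1g"],  # JVM tuning options
    },
}

# The controller watches resources of this kind and creates a matching StatefulSet.
print(cassandra_cluster["spec"]["nodes"])
```

    Changing `spec.nodes` in the applied manifest is all a dev would do to scale the cluster; the controller takes care of the rest.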

    You can isolate the Cassandra operator to a specific Kubernetes namespace, define what kinds of persistent volumes it should use, and more. The Cassandra operator's controller listens to state changes on the Cassandra CRD and will create its own StatefulSets to match those requirements. It will also manage those operations and can ensure repairs, backups, and safe scaling as specified via the CRD. In this way, it leverages the Kubernetes concept of building controllers upon other controllers in order to achieve intelligent and helpful behaviors.

    So, how does it work?

    Architecturally, the Cassandra controller connects to the Kubernetes Master. It listens to state changes and manipulates pod definitions and CRDs. It then deploys them, waits for changes to occur, and repeats until all necessary changes complete fully.

    The Cassandra controller can, of course, perform operations within the Cassandra cluster. For example, want to scale down your Cassandra cluster? Instead of manipulating the StatefulSet to handle this task, the controller will see the CRD change. The node count will change to a lower number (say from six to five). The controller will get that state change, and it will first run a decommission operation on the Cassandra node that will be removed. This ensures that the Cassandra node stops gracefully and redistributes and rebalances the data it holds across the remaining nodes. Once the Cassandra controller sees this has happened successfully, it will modify that StatefulSet definition to allow Kubernetes to decommission that pod. Thus, the Cassandra controller brings needed intelligence to the Kubernetes environment to run Cassandra properly and ensure smoother operations.
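    That scale-down sequence can be summarised as a reconcile loop. The sketch below is pure pseudologic in Python under stated assumptions: the real controller talks to the Kubernetes API and triggers Cassandra's decommission on the node itself, neither of which is modelled here, and `decommission` and `set_statefulset_replicas` are hypothetical callbacks:

```python
def reconcile_scale_down(desired_nodes, current_nodes,
                         decommission, set_statefulset_replicas):
    """One reconcile pass for scaling down: decommission the highest-ordinal
    Cassandra node first, then shrink the StatefulSet so Kubernetes removes
    the pod. Illustrative sketch only, not the operator's actual code."""
    while current_nodes > desired_nodes:
        victim = current_nodes - 1            # StatefulSets remove the highest ordinal
        decommission(victim)                  # drain and rebalance data off the node
        current_nodes -= 1
        set_statefulset_replicas(current_nodes)  # let Kubernetes delete the pod
    return current_nodes

# Simulate scaling from six nodes to five, recording the calls made.
decommissioned, replica_updates = [], []
reconcile_scale_down(5, 6, decommissioned.append, replica_updates.append)
print(decommissioned, replica_updates)
```

    The key ordering property, as described above, is that the Cassandra-level decommission always happens before the StatefulSet shrinks.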

    As we continue this project and iterate on the Cassandra operator, our goal is to add new components that will continue to expand the tool's features and value. A good example is Cassandra SideCar (shown in the diagram above), which can take responsibility for tasks like backups and repairs. Current and future features of the project can be viewed on GitHub. Our goal for the Cassandra operator is to give devs a powerful, open source option for running Cassandra on Kubernetes with a simplicity and grace that has not yet been all that easy to achieve.

    About the author
    Ben Bromhead

    Ben Bromhead is Chief Technology Officer and Co-Founder at Instaclustr , an open source-as-a-service company. Ben is located in Instaclustr's California office and is active in the Apache Cassandra community. Prior to Instaclustr, Ben had been working as an independent consultant developing NoSQL solutions for enterprises, and he ran a high-tech cryptographic and cyber security formal testing laboratory at BAE Systems and Stratsec.

          5 Ways to Tackle Big Graph Data with KeyLines and Neo4j

    5 Ways to Tackle Big Graph Data with KeyLines and Neo4j

    By Dan Williams, Product Manager, Cambridge Intelligence | September 11, 2018

    Reading time: 6 minutes

    Understanding big graph data requires two things: a robust graph database and a powerful graph visualization engine. That’s why hundreds of developers have combined Neo4j with the KeyLines graph visualization toolkit to create effective, interactive tools for exploring and making sense of their graph data.

    But humans are not big data creatures. Given that most adults can hold only between four and seven items in their short-term memory, loading an overwhelming quantity of densely connected items into a chart won’t generate insight.

    That presents a challenge for those of us building graph analysis tools.

    How do you decide which subset of data to present to users? How do they find the most important patterns and connections?

    That’s what we explore in this blog post. You’ll discover that, with some thoughtful planning, big data doesn’t have to be a big problem.

    The Challenge of Massive Graph Visualization

    For many organizations, “big data” means collecting every bit of information available and then figuring out how to use it later. One of the many problems with this approach is that it’s incredibly challenging to go beyond aggregated analysis to understand individual elements.


    20,000 nodes visualized in KeyLines. Pretty, but pretty useless if you want to understand specific node behavior. Data from The Cosmic Web Project.

    To provide your users with something more useful, you need to think about the data funnel. Through different stages of backend data management and front-end interactions, the funnel reduces billions of data points into something a user can comprehend.


    The data funnel to bring big data down to a human scale.

    Let’s focus on the key techniques you’ll apply at each stage of the funnel:

    1. Filtering in Neo4j: ~1,000,000+ nodes

    There’s no point visualizing your entire Neo4j instance. You want to remove as much noise as possible, as early as possible. Filtering with Cypher queries is an incredibly effective way to do this.

    KeyLines’ integration with Cypher means giving users some nice visual ways to create custom filtering queries, like sliders, tick-boxes or selecting from a list of cases.

    In the example below, we’re using Cypher queries to power a “search and expand” interaction in KeyLines:

    MATCH (movie:Movie{title: $name})<-[rel]-(actor:Actor)
    RETURN *, { id: actor.id, degree: size((actor:Actor) --> (:Movie)) } as degree

    First, we’re matching Actors related to a selected Movie before returning them to be added to our KeyLines chart:

    [IMAGE 3]

    There’s no guarantee that filtering through search is enough to keep data points at a manageable level. Multiple searches might return excessive amounts of information that’s difficult to analyze.

    Filtering is effective, but it shouldn’t be the only technique you use.

    2. Aggregating in Neo4j: ~100,000 nodes

    Once filtering techniques are in place, you should consider aggregation. There are two ways to approach this.

    First, there’s data cleansing to remove duplicates and errors. This is often time-consuming but, again, Cypher is your friend. Cypher functions like “count” make it really easy to aggregate nodes in the backend:

    MATCH (e1:Employee)-[m:MAILS]->(e2:Employee) RETURN e1 AS sender, e2 AS receiver, count(m) AS sent_emails

    Second, there’s a data modeling step to remove unnecessary clutter from entering the KeyLines chart in the first place.

    Questions to ask in terms of decluttering: Can multiple nodes be merged? Can multiple links be collapsed into one?

    It’s worth taking some time to get this right. With a few simple aggregation decisions, it’s possible to reduce tens of thousands of nodes into a few hundred.
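    The collapsing step can be sketched in plain Python: parallel links between the same pair of nodes become a single link carrying a count, mirroring what the count(m) Cypher query above returns. This is illustrative only; in practice the aggregation happens inside Neo4j before the data reaches the chart:

```python
from collections import Counter

def aggregate_links(edges):
    """Collapse parallel links into one weighted link per (source, target) pair,
    mirroring the count(m) aggregation done in Cypher. Illustrative sketch."""
    counts = Counter(edges)  # counts duplicates, preserving first-seen order
    return [(src, dst, n) for (src, dst), n in counts.items()]

# Three emails from e1 to e2 and one from e2 to e3 become two weighted links.
emails = [("e1", "e2"), ("e1", "e2"), ("e1", "e2"), ("e2", "e3")]
print(aggregate_links(emails))
```

    The same idea applies to node merging: pick a key that identifies duplicates, then keep one representative per key.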


    Using link aggregation, we’ve reduced 22,000 nodes and links into a much more manageable chart.

    3. Create a Clever Visual Model: ~10,000 nodes

    By now, Neo4j should have already helped you reduce 1,000,000+ nodes to a few hundred. This is where the power of data visualization really shines. Your user’s visualization relies on a small proportion of what’s in the database, but we may then use visual modelling to simplify it further.

    The below chart shows graph data relating to car insurance claims. Our Neo4j database includes car and policyholders, phone numbers, insurance claims, claimants, third parties, garages and accidents:



    Loading the full data model is useful, but with some carefully considered re-modelling, the user may select an alternative approach suited to the insight they need.

    Perhaps they want to see direct connections between policyholders and garages:

    [IMAGE 6]

    Or the user may want a view that removes unnecessary intermediate nodes and shows connections between the people involved:

    [IMAGE 7]

    The ideal visual data model will depend on the questions your users are trying to answer.

    4. Filters, Combining and Pruning: ~1,000 nodes

    Now that your users have the relevant nodes and links in their chart, you should give them the tools to declutter and focus on their insight.

    A great way to do this is filtering: adding or removing subsets of the data on demand. For better performance, present them with a filtered view first, but give the user control options to bring in data. There are plenty of ways to do this: tick boxes, sliders, the time bar or “expand and load.”

    Another option is KeyLines’ combos functionality. Combos allow the users to group certain nodes, giving a clearer view of a large dataset without actually removing anything from the chart. It’s an effective way to simplify complexity, but also to offer a “detail on demand” user experience that makes graph insight easier to find.


    Combos clear chart clutter and clarify complexity.

    A third example of decluttering best practices is to remove unnecessary distractions from a chart. This might mean giving users a way to “prune” leaf nodes, or making it easy to hide “super nodes” that clutter the chart and obscure insight.

    [IMAGE 9]

    Leaf, orphan and super nodes rarely add anything to your graph data understanding, so give users an easy way to remove them.
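    As an illustration, leaf pruning on a simple edge list can be sketched like this. This is plain Python, not the KeyLines API, and the node names are made up:

```python
from collections import Counter

def prune_leaves(edges):
    """Drop degree-1 (leaf) nodes and the links that touch them.
    Illustrative sketch of the pruning idea, not a KeyLines call."""
    degree = Counter()
    for src, dst in edges:
        degree[src] += 1
        degree[dst] += 1
    leaves = {node for node, d in degree.items() if d == 1}
    return [(src, dst) for src, dst in edges if src not in leaves and dst not in leaves]

# A triangle a-b-c with one leaf node d hanging off c: pruning removes d.
chart = [("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")]
print(prune_leaves(chart))
```

    One pass removes the current leaves; since pruning can expose new leaves (think of a chain), apply it repeatedly if you want to strip whole branches.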

    KeyLines offers plenty of tools to help with this critical part of your graph data analysis. This video on managing chart clutter explains a few more.

    5. Run a Layout: ~100 nodes

    By this point, your users should have a tiny subset of your original Neo4j graph data in their chart. The final step is to help them uncover insight. Automated graph layouts are great for this.

    A good force-directed layout goes beyond simply detangling links. It should also help you see the patterns, anomalies and clusters that direct the user towards the answers they’re looking for.


    KeyLines’ latest layout: the organic layout. By spreading the nodes and links apart in a distinctive fan-like pattern, the underlying structure becomes much clearer.

    With an effective, consistent and powerful graph layout, your users will find that answers start to jump out of the chart.

    Bonus Tip: Talk to Your Users

    This blog post is really just a starting point. There are plenty of other tips and techniques to help you solve big graph data challenges (we’ve not even started on temporal analysis or geospatial visualization).

    Probably the most important tip of all is this: Take time to talk to your users.

    Find out what data they need to see and the questions they’re trying to answer. Use the data funnel to make that process as simple and fast as possible, and use the combined powers of Neo4j and KeyLines to turn the biggest graph datasets into something genuinely insightful.

    Visit our website to learn more about graph visualization best practices or get started with the KeyLines toolkit.

    Cambridge Intelligence is a Gold Sponsor of GraphConnect 2018. Use code CAM20 to get 20% off your ticket to the conference and training sessions, and we’ll see you in New York!

    Meet graph experts from around the globe working on projects just like this one when you attend GraphConnect 2018 on September 20-21. Grab the discount code above and get your ticket today.

          Re: Add a question from question bank
    by Henrique F Machado.  

    Thanks for everyone that helped. We also faced this issue at work and this discussion helped a lot.

    One thing that I found out is that the question's corruption in the database prevents us from editing and deleting it, but not from moving it. So, while we prepare the backups and everything we need to go directly into the database, we created a "Garbage questions (don't use)" category below "System" and moved the problematic questions there. Now there are no more corrupted questions inside the course's question bank, so inserting questions from the bank into a quiz works. I don't know if there are any adverse side effects to this, though.

    Using Moodle 3.5.1 (Build: 20180709)


          Effective Internal Risk Models FRTB: The Importance of Risk Model Approval

    Effective Internal Risk Models FRTB: The Importance of Risk Model Approval

    By Navneet Mathur, Senior Director of Global Solutions, Neo4j | September 10, 2018

    Reading time: 4 minutes

    Sweeping regulations are changing the way banks handle risk. The Fundamental Review of the Trading Book (FRTB) represents an important shift designed to provide a firm foundation for the future. While laws passed after the financial crisis offered a patchwork of fixes, the FRTB gives banks a motivation to put a strong infrastructure in place for the years ahead.

    In this series on the FRTB, we explore what it takes to create effective internal risk models using a graph database like Neo4j. This week, we’ll look at the major areas impacted by the FRTB, including raising risk reserves, the trading desk, and the role and approval of internal risk models.

    What Is the FRTB?

    Fundamental Review of the Trading Book (FRTB) regulations are part of the upcoming Basel IV set of reforms and create specific capital-reserve requirements for bank trading desks based on investment-risk models. The new regulations require banks to reserve sufficient capital to maintain solvency through market downturns and avoid the need for governmental bailouts.

    Banks are using FRTB mandates as an opportunity to build a firm foundation for future risk management and compliance applications that lowers development and staffing expenses, optimizes reserve ratios, maximizes available capital and drives investment profits.

    FRTB Raises Basel Reserve Requirements

    In the financial crisis a decade ago, banks worldwide held large risk exposures in their trading books without sufficient capital reserves to weather the length and depth of the plunge in investment markets. As a result, regulators created new data management and capital-reserve requirements to avoid another market meltdown.

    In turn, banks created risk compliance models that were tested and approved by regulators. But at many institutions, those models were not maintained, and as time passed, market and internal changes exposed the banks to new investment risks.

    Today, risk-compliance problems are addressed by BCBS 239 (Basel Committee on Banking Supervision standard 239) and FRTB (Fundamental Review of the Trading Book) regulations. BCBS 239 puts forth principles for risk-data governance, aggregation and reporting, and associated IT infrastructure.

    FRTB standards, which are part of BCBS regulations and the upcoming Basel IV set of reforms, create specific capital-reserve requirements for bank trading desks based on investment-risk models.

    Intense Focus on the Trading Desk

    FRTB regulators develop guidelines that require banks to reserve sufficient capital to maintain solvency through market downturns and avoid the need for governmental bailouts.

    The reserve requirements for trading books are higher than for banking books, tempting institutions to engage in regulatory arbitrage: moving assets between books to affect reserve requirements, a practice that is now tightly scrutinized and regulated.


    The Role of Internal Risk Models

    FRTB regulations include default reserve calculations that result in measurably higher capital requirements designed to account for new levels of trading-book risk unaccounted for by the Basel II risk framework. The higher capital requirements translate directly to lower levels of investment capital, flexibility, revenues and profits.

    Banks may accept BCBS’s reserve calculations or develop their own internal risk models to calculate capital-reserve requirements. To use internal-model results, banks must obtain the approval of national regulators by proving how well models represent risk in the banks’ investment strategies.

    The approval process requires a bank to forecast hypothetical profits and losses using its model’s calculated capital reserves as well as to backtest the model with real pricing and holdings data. FRTB also requires that internal models implement expected shortfall calculations to address outlying tail risks in investment strategies.
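    The expected shortfall idea can be illustrated in a few lines of Python. This is a plain historical ES at the 97.5% level, not the FRTB-prescribed calculation (which adjusts for liquidity horizons); it simply averages the losses beyond the VaR cutoff:

```python
def expected_shortfall(pnl, alpha=0.975):
    """Average loss in the worst (1 - alpha) tail of a historical P&L series.
    Illustrative sketch only; FRTB mandates a liquidity-adjusted ES at 97.5%."""
    losses = sorted((-p for p in pnl), reverse=True)  # largest losses first
    k = max(1, int(len(losses) * (1 - alpha)))        # number of tail observations
    tail = losses[:k]
    return sum(tail) / len(tail)

# 40 daily P&L observations: one bad day of -100, the rest small gains.
# At the 97.5% level the tail here is the single worst day.
pnl = [-100] + [1] * 39
print(expected_shortfall(pnl))
```

    Unlike VaR, which reports only the cutoff itself, ES accounts for how bad losses are beyond the cutoff, which is why the FRTB framework favours it for tail risk.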

    The Importance of Risk Model Approval

    To satisfy supervisory authorities of the accuracy of an internally developed risk model, banks must prove all of the following:

    • Their data is complete, accurate and consistent, and the components of the risk model can be traced back to original, authoritative data sources
    • There is sufficient pricing and transaction history to test the model back to 2007
    • Their aggregation rules are accurate and comply with BCBS regulations
    • Their risk models are sufficiently realistic and robust to represent market realities in normal and emergency situations
    • Their framework models historical, current and what-if market scenarios
    • Their policies and procedures for data governance, aggregation and validation are complete and consistently enforced
    • Their IT infrastructure handles inter-day fair-market evaluations, scheduled reports, and ad hoc requests from internal and external risk supervisors

    If a bank fails the regulatory audit, regulators use standard BCBS formulas to determine substantially higher amounts of capital that the bank must reserve to cover potential losses.

    If the internal model passes the audit, the model’s calculated capital reserve requirements replace regulators’ default reserve requirements as well as traditional value-at-risk (VaR) measures.


    Conclusion

    Internal risk model approval leads to lower reserves and higher levels of investment capital, flexibility, revenue and profits. FRTB mandates higher default reserve requirements than those calculated by banks’ internal risk models.

    Implementing this demands the ability to trace data dependencies through many levels of complexity. A graph database offers an effective way to capture all these connections at scale. Neo4j is the world’s leading graph platform.
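As a minimal illustration of why lineage is a graph problem, the sketch below traces a reported figure back to its sources in plain Python (this is not Neo4j's API, and the node names are invented for the example):

```python
# Toy illustration (plain Python, not Neo4j): tracing a risk figure back
# to its authoritative sources is a traversal over "derived from" edges.
# All node names below are invented for the example.

from collections import deque

# child -> list of parents it was derived from
DERIVED_FROM = {
    "var_report":           ["aggregated_positions", "market_prices"],
    "aggregated_positions": ["trading_book_a", "trading_book_b"],
    "market_prices":        ["vendor_feed"],
    "trading_book_a":       [],
    "trading_book_b":       [],
    "vendor_feed":          [],
}

def lineage(node, graph):
    """Return every upstream source reachable from `node` (breadth-first)."""
    seen, queue = set(), deque([node])
    while queue:
        current = queue.popleft()
        for parent in graph.get(current, []):
            if parent not in seen:
                seen.add(parent)
                queue.append(parent)
    return seen
```

In a graph database the same question becomes a single variable-length path query, which is what keeps it tractable as the number of derivation levels grows.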

    In the coming weeks, we’ll explore how to trace data lineage across data silos and how traditional technologies like spreadsheets, relational databases, and data warehouses fall short. We’ll dive into why banks need a modern graph platform as the foundation for effective internal risk models that meet FRTB requirements.

    Risk demands a strong foundation

    Effective Internal Risk Models Require a New Technology Foundation: Read the White Paper
          The Data Day: August 31, 2018

    AWS and VMware announce Amazon RDS on VMware. And more.

    For @451Research clients: On the Yellowbrick road: data-warehousing vendor emerges with funding and flash-based EDW https://t.co/shKUTosHlS By @jmscrts

    ― Matt Aslett’s The Data Day (@thedataday) August 31, 2018

    For @451Research clients: Automated analytics: the role of the machine in corporate decision-making https://t.co/3PkCXnGfhR By Krishna Roy

    ― Matt Aslett’s The Data Day (@thedataday) August 28, 2018

    For @451Research clients: @prophix does cloud and on-premises CPM, with machine learning up next https://t.co/8FKKvRrJDb By Krishna Roy

    ― Matt Aslett’s The Data Day (@thedataday) August 28, 2018

    AWS and VMware have announced Amazon Relational Database Service on VMware, supporting Microsoft SQL Server, Oracle, PostgreSQL, MySQL, and MariaDB. https://t.co/hy5F1g8dTA

    ― Matt Aslett’s The Data Day (@thedataday) August 27, 2018

    Cloudera has launched Cloudera Data Warehouse (previously Cloudera Analytic DB) as well as Cloudera Altus Data Warehouse as-a-service https://t.co/386z7HaT6Q and also Cloudera Workload XM, an intelligent workload experience management cloud service https://t.co/v5jGb3Hkp0

    ― Matt Aslett’s The Data Day (@thedataday) August 30, 2018

    Alteryx has announced version 2018.3 of the Alteryx analytics platform, including Visualytics for real-time, interactive visualizations https://t.co/8ewTXJqs5T

    ― Matt Aslett’s The Data Day (@thedataday) August 28, 2018

    Informatica has updated its Master Data Management, Intelligent Cloud Services and Data Privacy and Protection products with a focus on hybrid, multi-cloud and on-premises environments. https://t.co/eGGrA28trh

    ― Matt Aslett’s The Data Day (@thedataday) August 29, 2018

    SnapLogic has announced the general availability of SnapLogic eXtreme, providing data transformation support for big data architectures in the cloud. https://t.co/NijnMNLTx0

    ― Matt Aslett’s The Data Day (@thedataday) August 28, 2018

    VoltDB has enhanced its open source VoltDB Community Edition to support real-time data snapshots, advanced clustering technology, exporter services, manual scale-out on commodity servers and access to the VoltDB Management Console. https://t.co/tEHblf4J7v

    ― Matt Aslett’s The Data Day (@thedataday) August 30, 2018

    ODPi has announced the Egeria project for the open sharing, exchange and governance of metadata https://t.co/tEb0jRHV8F

    ― Matt Aslett’s The Data Day (@thedataday) August 28, 2018

    And that’s the data day


          Top Ambari Interview Questions and Answers 2018
    1. Ambari Interview Preparation

    In our last article, we discussed Ambari Interview Questions and Answers Part 1. Today, we will cover part 2 of the top Ambari Interview Questions and Answers. This part contains technical and practical Ambari interview questions designed by Ambari specialists. If you are preparing for an Ambari interview, you should go through both parts of this series. These researched questions will definitely help you move ahead.

    Still, if you face any confusion in these frequently asked Ambari Interview Questions and Answers, we have provided links to the relevant topics. These links will help you learn more about Apache Ambari.


    Top Ambari Interview Questions and Answers 2018


    2. Best Ambari Interview Questions and Answers

    Following are the most asked Ambari Interview Questions and Answers, which will help both freshers and experienced candidates. Let's discuss these questions and answers for Apache Ambari.

    Que 1. What are the purposes of using Ambari shell?

    Ans. The Ambari Shell supports:

    All the functionalities available through the Ambari web app
    Context-aware availability of commands
    Tab completion
    Optional and required parameter support

    Que 2. What action do you need to perform if you opt for scheduled maintenance on the cluster nodes?

    Ans. Ambari offers a Maintenance Mode option for all the nodes in the cluster. Hence, before performing maintenance, we can enable Maintenance Mode to avoid triggering alerts.
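As a hedged illustration, the sketch below only assembles (and does not send) the REST call that flips a host into maintenance mode. The endpoint shape, the X-Requested-By header and the "maintenance_state" field follow the Ambari v1 REST API as best recalled here; verify them against your Ambari version's documentation:

```python
# Sketch only: building the PUT request that puts one host into
# maintenance mode before scheduled work. Nothing is sent here -- we
# only assemble the URL, headers and JSON body. Field names are from
# the Ambari v1 REST API; check them against your version's docs.

import json

def maintenance_request(base_url, cluster, host, state="ON"):
    """Build the URL, headers and JSON body for a maintenance-mode PUT."""
    url = "%s/api/v1/clusters/%s/hosts/%s" % (base_url, cluster, host)
    headers = {"X-Requested-By": "ambari"}  # Ambari rejects write calls without it
    body = json.dumps({
        "RequestInfo": {"context": "Scheduled maintenance window"},
        "Body": {"Hosts": {"maintenance_state": state}},
    })
    return url, headers, body
```

The returned triple would then be handed to an HTTP client, e.g. requests.put(url, headers=headers, data=body, auth=("admin", "admin")).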

    Que 3. What is the role of “ambari-qa” user?

    Ans. The ‘ambari-qa’ user account, which Ambari creates on all nodes in the cluster, performs service checks against cluster services.

    Que 4. Explain future growth of Apache Ambari?

    Ans. With the increasing demand for big data technologies like Hadoop, we have seen massive growth in data analysis, which has put huge clusters in place. Hence, for better visibility, companies are leaning towards technologies like Apache Ambari to manage these clusters with enhanced operational efficiency.

    In addition, Hortonworks is working on making Ambari more scalable. Thus, gaining knowledge of Apache Ambari is an added advantage alongside Hadoop.

    Que 5. State some Ambari components which we can use for automation as well as integration?

    Ans. For automation and integration, the important components of Ambari are separated into three pieces:

    Ambari Stacks
    Ambari Blueprints
    Ambari API

    Ambari was built from scratch to make sure that it deals with automation and integration problems carefully.

    Que 6. In which language is the Ambari Shell developed?

    Ans. The Ambari Shell is developed in Java. It is based on the Ambari REST client and the Spring Shell framework.

    Que 7. State benefits of Hadoop users by using Apache Ambari.

    Ans. We can definitely say that for individuals who use Hadoop in their day-to-day work, Apache Ambari is a great gift. The benefits of Apache Ambari:

    Simplified installation process
    Easy configuration and management
    Centralized security setup process
    Full visibility into cluster health
    Extensibility and customizability

    Que 8. Name some independent extensions that contribute to the Ambari codebase?

    Ans.They are:

    1. Ambari SCOM Management Pack

    2. Apache Slider View

    Ambari Interview Questions and Answers for freshers: Q. 1, 2, 4, 6, 7, 8
    Ambari Interview Questions and Answers for experienced: Q. 3, 5

    Que 9. Can we use Ambari python Client to use of Ambari API’s?

    Ans.Yes.

    Que 10. What is the process of creating an Ambari client?

    Ans.To create an Ambari client, the code is:

    from ambari_client.ambari_api import AmbariClient
    headers_dict = {'X-Requested-By': 'mycompany'}  # Ambari needs the X-Requested-By header
    client = AmbariClient("localhost", 8080, "admin", "admin", version=1, http_header=headers_dict)
    print client.version
    print client.host_url
    print "\n"

    Que 11. How can we see all the clusters that are available in Ambari?

    Ans.In order to see all the clusters that are available in Ambari , the code is:

    all_clusters = client.get_all_clusters()
    print all_clusters.to_json_dict()
    print all_clusters

    Que 12. How can we see all the hosts that are available in Ambari?

    Ans.To see all the hosts that are available in Ambari, the code is:

    all_hosts = client.get_all_hosts()
    print all_hosts
    print all_hosts.to_json_dict()
    print "\n"

    Que 13. Name the three layers Ambari supports.

    Ans. Ambari supports three layers:

    Core Hadoop
    Essential Hadoop
    Hadoop Support

    Learn More about Hadoop

    Que 14. What are the different methods to set up local repositories?

    Ans. There are two ways to deploy local repositories:

    Mirror the packages to the local repository
    Download the entire repository tarball and build the local repository from it

    Que 15. How to set up local repository manually?

    Ans. To set up a local repository manually, the steps are:

    First, set up a host with Apache httpd.
    Then download a tarball copy of the entire contents of every repository.
    Once downloaded, extract the contents.

    Ambari Interview Questions and Answers for freshers: Q. 13, 14, 15
    Ambari Interview Questions and Answers for experienced: Q. 10, 11, 12

    Que 16. How is recovery achieved in Ambari?

    Ans.Recovery happens in Ambari in the following ways:

    Based on actions

    In Ambari, every action is persisted, so after a restart the master checks for pending actions and reschedules them. Since the cluster state is also persisted in the database, the master rebuilds its state machines when it restarts. There is a race condition in which an action completes but the master crashes before recording its completion; as a special consideration, actions are designed to be idempotent, and the master restarts any action that is not marked as complete (or that has failed) in the DB. These persisted actions can be seen in the redo logs.
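The action-based recovery idea can be sketched as follows (a toy model, not Ambari source code; the action names and statuses are invented). Anything not recorded as completed is rescheduled, and idempotency makes re-running an action that actually finished harmless:

```python
# Toy model of action-based recovery (illustrative, not Ambari source):
# every action is persisted with a status; on master restart, anything
# not recorded as COMPLETED is rescheduled. Because actions are
# idempotent, re-running one that finished but crashed before its
# completion was recorded converges to the same cluster state.

def reschedule_after_restart(persisted_actions):
    """Return the ids of actions the restarted master must run again."""
    return [a["id"] for a in persisted_actions if a["status"] != "COMPLETED"]

actions = [
    {"id": "install-datanode", "status": "COMPLETED"},
    {"id": "start-namenode",   "status": "IN_PROGRESS"},  # crashed mid-flight
    {"id": "restart-hbase",    "status": "PENDING"},
]
```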

    Based on the desired state
          UCLA's infatuation with diversity is a costly diversion from its true mission

    By Heather Mac Donald
    http://www.latimes.com/opinion/op-ed/la-oe-mac-donald-diversity-ucla-20180902-story.html
    September 2, 2018



    Students who support the passage of a new diversity requirement proposal rallied at Meyerhoff Park in 2015.

    If Albert Einstein applied for a professorship at UCLA today, would he be hired? The answer is not clear. Starting this fall, all faculty applicants to UCLA must document their contributions to “equity, diversity and inclusion.” (Next year, existing UCLA faculty will also have to submit an “equity, diversity and inclusion statement” in order to be considered for promotion, following the lead of five other UC campuses.) The mandatory statements will be credited in the same manner as the rest of an applicant’s portfolio, according to UCLA’s equity, diversity and inclusion office.

    A contemporary Einstein may not meet the suggested evaluation criteria. Would his “job talk” — a presentation of one’s scholarly accomplishments — reflect his contributions to equity, diversity and inclusion? Unlikely. Would his research show, in the words of the evaluation template, the “potential to understand the barriers facing women and racial/ethnic minorities?” Also unlikely. Would he have participated in “service that applies up-to-date knowledge to problems, issues and concerns of groups historically underrepresented in higher education?” Sadly, he may have been focusing on the theory of general relativity instead. What about “utilizing pedagogies addressing different learning styles” or demonstrating the ability to “effectively teach and attract students from underrepresented communities”? Again, not at all guaranteed.

    As the new mandate suggests, UCLA and the rest of the University of California have been engulfed by the diversity obsession. The campuses are infatuated with group identity and difference. Science and the empirical method, however, transcend just those trivialities of identity that UC now deems so crucial: “race, ethnicity, gender, age, religion, language, abilities/disabilities, sexual orientation, gender identity and socioeconomic status,” to quote from the university’s Diversity Statement. The results of that transcendence speak for themselves: an astounding conquest of disease and an ever-increasing understanding of the physical environment. Unlocking the secrets of nature is challenge enough; scientists (and other faculty) should not also be tasked with a “social justice” mission.

    But such a confusion of realms currently pervades American universities, and UC in particular. UCLA’s Intergroup Relations Office offers credit courses and “co-curricular dialogues” that encourage students to, you guessed it, “explore their own social identities (i.e. gender, race, nationality, religion/spirituality, sexual orientation, social class, etc.) and associated positions within the campus community.” Even if exploring your social identity were the purpose of a college education (which it is not), it would be more fruitful to define that identity around accomplishments and intellectual passions — “budding mathematician,” say, or “history fanatic” — rather than gender and race.

    Intergroup Relations is just the tip of the bureaucratic diversity iceberg. In 2015, UCLA created a vice chancellorship for equity, diversity and inclusion, funded at $4.3 million, according to figures published by the Millennial Review in 2017. (The EDI vice chancellor’s office did not have its current budget “at the ready,” a UCLA spokesman said, nor did Intergroup Relations.) Over the last two years, according to the Sacramento Bee’s state salary database, the diversity vice chancellor’s total pay, including benefits, has averaged $414,000, more than four times many faculty salaries. Besides his own staff, the vice chancellor for equity, diversity and inclusion presides over the Discrimination Prevention Office; BruinX, the “research and development arm of EDI”; faculty “equity advisors”; UCLA’s Title IX office; and a student advisory board. Various schools at UCLA, including medicine and dentistry, have their own diversity deans, whose job includes making sure that the faculty avoid “implicit bias in the hiring process,” in the words of the engineering school’s diversity dean.

    These bureaucratic sinecures are premised on the idea that UCLA is rife with discrimination, from which an ever-growing number of victim groups need protection. The Intergroup Relations Office scours the horizon for “emerging social-identity-based intergroup conflicts,” according to its website. It has been hiring undergraduates and graduate students to raise their peers’ self-awareness of their “experiences with privilege and oppression.” These “diversity peer educators,” whose internship salaries come out of mandatory student fees, will host workshops on “toxic masculinity” and “intersectional identities” this fall. If UCLA is putting a comparable effort into organizing campus-wide workshops on the evolution of constitutional government or the significance of Renaissance humanism, it is keeping the effort out of sight.

    Reality check: UCLA and the University of California are among the most tolerant, welcoming environments in human history for all races, ethnicities and genders. Every classroom, library and scientific laboratory is open to all qualified students on an equal basis. Far from discriminating against underrepresented minorities in admissions, UCLA and UC have sought tirelessly to devise surrogates for the explicit racial preferences banned in 1996 by Proposition 209. UCLA’s proportion of black undergraduates — 5% in 2016 — is less than one percentage point below the black share of California’s public high school graduates.

    In 2016, 4% of UCLA’s faculty were black, 6.6% were Latino, 66% were white, and 18.6% were Asian. This distribution reflects the hiring pipeline, not hiring bias.

    Blacks made up 4.7% of all doctorate recipients nationwide in 2006, 4.9% in 2010, and 5.2% in 2016, according to the National Science Foundation. But black PhDs have historically been concentrated in education; in the sciences, which make up a large proportion of the UCLA faculty, less so. In 2016, for example, 1% of all PhDs in computer science went to blacks, or 17 out of 1,659 doctorates, according to the Computing Research Assn. Many fields — nuclear physics, geophysics and seismology and neuropsychology, for instance — had no black PhDs at all.

    Given such numbers, it is unrealistic to assume that every academic department at UCLA will perfectly mirror the state’s demographic makeup, absent discrimination. And yet the equity, diversity and inclusion office puts every member of a faculty search committee through time-consuming implicit bias training.

    The ultimate solution to any absence of proportional representation in higher education is to close the academic skills gap. In 2015, only 14% of black eighth graders in California and 13% of Latino eighth graders scored as proficient or above on the National Assessment of Educational Progress math test, compared with 57% of Asians and 43% of whites. In reading, 16% of black eighth graders and 18% of Latino eighth graders were proficient or above, compared with 50% of Asians and 44% of whites. Such gaps have been constant over many decades.

    It does not do UCLA’s students any favors to teach them to see bias where there is none. UC’s diversity bureaucracy is a costly diversion from the true mission of higher education: passing on to students, with joy and gratitude, the treasures of our cultural inheritance and expanding the boundaries of knowledge.

    Heather Mac Donald is the Thomas W. Smith fellow at the Manhattan Institute. Her latest book, “The Diversity Delusion,” goes on sale Tuesday.

          Healthcare BPO Market Trends, Drivers, Strategies, Applications and competitive Landscape 2023
    (EMAILWIRE.COM, September 13, 2018) “Global Healthcare BPO Consumption Market Report 2018-2023” has been newly added to the Researchformarkets.com database. This report covers leading key company profiles with information such as business overview, regional analysis, consumption, revenue and specification. Healthcare...
          Knowledge Process Outsourcing Market: Key Company Profiles, Production Revenue, Product Picture and Specification 2025
    (EMAILWIRE.COM, September 13, 2018) “Global Knowledge Process Outsourcing Market Size, Status and Forecast 2018-2025” has been newly added to the Researchformarkets.com database. This report covers leading key company profiles with information such as business overview, regional analysis, consumption, revenue...
          LAN as a Service Market Size, Production, Consumption, Import and Export Status and Forecast 2025
    (EMAILWIRE.COM, September 13, 2018) “Global LAN as a Service Market Size, Status and Forecast 2018-2025” has been newly added to the Researchformarkets.com database. This report covers leading key company profiles with information such as business overview, regional analysis, consumption, revenue and specification. Network...
          Oracle RAC Database Administrator - McLane Advanced Technologies - Fort Lee, VA
    The Oracle DBA will support day to day database operations of the Property Book Unit Supply Enhanced (PBUSE) system. Specifically, managing the back-end...
    From McLane Advanced Technologies - Mon, 13 Aug 2018 06:31:19 GMT - View all Fort Lee, VA jobs
          The Transformation of Gerald Baumgartner
    Video: The Transformation of Gerald Baumgartner
    Watch This Video!
    Studio: Celebrity Video Distribution
    Gerald Baumgartner (Randall Malin) is a fastidious, delusional man wrapped in an idealized image of a business man: ultra-organized, competent, logical...or so he thinks. Pleased with himself, Gerald trips through his life of routine and nth-degree organization until one day he discovers an entirely new vision of himself inspired by the beautiful, bohemian Christiana LaTierre (Melissa Fischer).

    Driven by this "new vision" Gerald sets into play a spectacular life transformation, which involves among other things engineering a seamless separation from his - "current situation" - his wife (Carolyn Koskan) - using sound principles of change management.

    Armed with project plans, databases of suitors for his "replacement", and bizarre decision-making skills only Gerald could conceive, he executes his plan.

    The results are disastrous...and hilarious.

    Stars: Carolyn Koskan, Melissa Fischer, Randall Malin

          Mechanical Design Analyst 3 - Exelon Corporation - Cordova, IL
    Maintain various design databases including PassPort, mechanical and electrical seal database. Must be able to manage and implement technical programs, has wide...
    From Exelon Corporation - Sun, 26 Aug 2018 05:19:22 GMT - View all Cordova, IL jobs
          Pathological, delusional, stupid or all three?
    Pathological, delusional, stupid or all three?

    by digby



    All three. This was as of last week:

    In the 592 days since he took the oath of office, President Trump has made 4,713 false or misleading claims, according to The Fact Checker's database that analyzes, categorizes and tracks every suspect statement uttered by the president.

    That's an average of about eight claims a day.

    When we first started this project for the president's first 100 days, he averaged 4.9 claims a day. But the average number of claims per day keeps climbing as the president nears the 600-day mark of his presidency.

    In fact, in the past three months, the president averaged 15.4 claims a day, so almost one-third of the total claims made as president have come in this period. At that pace, he will top 5,000 claims in September.

    On July 5, the president had reached a new daily high of 79 false and misleading claims, but he came close on Aug. 30 with 73 claims, when he held a campaign rally and had an extended interview with Bloomberg News. On a monthly basis, June and August rank in first and second place during Trump's presidency, with 534 and 469 claims, respectively. July is in third place, with 448 claims.

    He can't help lying even when he doesn't need to, he is living in a Fox News induced delusion about his administration and Trump induced delusion about himself --- and he is dumb as a post. I think 5th or 6th grade level of knowledge is just about right which, when combined with his arrogant narcissism, means he also refuses to learn. All of this is obvious. It's not like he's discreet.

    And all these former administration officials who obviously cooperated with Woodward are even more cowardly than that op-ed writer and the sycophants like Huckabee-Sanders. They are out. They can tell it like it is. And they are still bowing and scraping for this dishonest imbecile by saying the book isn't "accurate."

    Why? What the hell does Gary Cohn have to lose by standing by his portrayal of Trump in the book which has been backed up by voluminous reporting by other reporters and what we see before our eyes every single day? Why is he groveling to get back into Trump's good graces?

    Does Rob Porter think he has anything to gain by holding on to his reputation as a wife beating Trump toady? Is his future somehow secured by that?

    The president's approval rating is in free-fall which indicates that a fair number of Americans are finally admitting that something is very, very wrong with him. And yet the only former staffer who has the guts to come forward to personally tell the story that is being anonymously leaked by dozens of staffers every day is the opportunistic Omarosa.

    At this point I have to agree with Trump about one thing: the people who work for him are all a bunch of sniveling cowards.

    Like this from yesterday:
    President Donald Trump spent the morning bragging about the economy. At least one of his claims didn't come close to being true.

    "The GDP Rate (4.2%) is higher than the Unemployment Rate (3.9%) for the first time in over 100 years!" the president said in a tweet.

    The first two numbers are correct, although they measure completely different things, and in different ways.

    The overall US economy grew at a 4.2% annual rate in the second quarter. Unemployment was between 3.8% and 4% during the quarter, and it came in at 3.9% in August.

    That's all good news.

    "It's definitely better when it's true than when it's not," said Justin Wolfers, professor of economics at University of Michigan. "I like high GDP growth and low unemployment."

    But Trump got it wrong — way wrong — when he said it hasn't happened in a century. In the last 70 years, it's happened in at least 62 quarters, most recently in 2006.

    "He wasn't even in the neighborhood of right," Wolfers said in an interview.

    Kevin Hassett, the chairman of the White House Council of Economic Advisers, acknowledged to reporters later in the day that the president's tweet was incorrect. He pointed out it was the first time in 10 years that GDP growth exceeded the unemployment rate.

    "And at some point, somebody probably conveyed it to him, adding a zero to that, and they shouldn't have done that," he said.

    Seriously???


    .





          B2B Web Marketing - HP - Houston, TX
    Expertise and knowledge in the online space (web, email, search, database marketing, chat marketing, podcasting, blogging, privacy, e-business, etc), including...
    From HP - Wed, 20 Jun 2018 11:35:20 GMT - View all Houston, TX jobs
          stackous.com

    A fabulous name that kicks off with 'stack' and soars high.

    stackous.com
    Keywords: 
    stacks, heaps, mounds, piles, towers, databases, elevated, towering, turntables, djs, computers, programming

          Senior Database Specialist - Unisys - Boyce, VA
    Unisys has more than 23,000 employees serving clients around the world. Unisys is an Equal Opportunity Employer (EOE) - Minorities, Females, Disabled Persons,...
    From Unisys - Wed, 15 Aug 2018 08:01:00 GMT - View all Boyce, VA jobs
          Senior Database Developer - OSI Systems, Inc. - Madhapur, Hyderabad, Telangana
    And (c) Optoelectronics and Manufacturing, providing specialized electronic components and electronic manufacturing services for original equipment...
    From OSI Systems, Inc. - Mon, 06 Aug 2018 05:41:37 GMT - View all Madhapur, Hyderabad, Telangana jobs
          Data Entry Operator
    AZ-Phoenix, A Government Agency in Downtown Phoenix is looking for hard working and dedicated Data Entry Operator to join their growing team. This position is a long term temporary position, working hours Monday - Friday 8:00 am - 5:00 pm. The salary is $10.50 an hour Responsibilities: Entering information into computer databases for effective record keeping. Organizing files and collecting data to be entered
          schemacrawler (15.01.01)
    Free database schema discovery and comprehension tool.

          SQL Database Developer
    IA-West Des Moines, Job Title :SQL Database Developer Zip Code :50266 City :West Des Moines State :IA Needed: -5-7+ years of SQL Database development -5+ years' experience designing, maintaining, and supporting high available, high volume production systems. - Experience with Data Migration, specifically with Unix is required (Unix DMF - Data Migration Framework is nice to have) - Experience with SQL tables - loading
          Senior IT Security Specialist
    IA-Grimes, CTG has a Senior IT Security Specialist DIRECT HIRE Permanent position located in Grimes, IA. This individual will utilize business knowledge and technical experience to: Continually protect information assets and brand image. Monitor and configure systems and logs related to IT security. Work with system, network, database, and development administrators; making recommendations on how to better s
          Accounting Clerk - State of Wyoming - Powell, WY
    Searches and disburses fiscal information pertaining to processes documents. Maintains a segment of a fiscal system and multiple databases.... $12.60 - $15.75 an hour
    From State of Wyoming - Sat, 18 Aug 2018 08:50:34 GMT - View all Powell, WY jobs
          HVAC Controls Technician - LONG Building Technologies - Cody, WY
    Program databases for all supported systems to meet specified sequences of operation and equipment manufacturers recommendations, in accordance with LONG...
    From LONG Building Technologies - Tue, 21 Aug 2018 22:49:45 GMT - View all Cody, WY jobs
          Data entry work hp
    We are looking for a Data Entry operator to update information on your company database (Budget: $2 - $8 USD, Jobs: Data Entry)
          Butlin’s data breach affects 34,000 customers

    Butlin’s has admitted that up to 34,000 of its customers may have been affected by a data breach. Managing Director Dermot King confirmed that Butlin’s database had been put at risk following “a phishing attack via an unauthorised email”.

    In a notice posted on Butlin’s website, King said: “We would like to assure all our guests that your payment details are secure and have not been compromised. Your Butlin’s usernames and passwords are also secure. The data which may have been accessed includes booking reference numbers, lead guest names, holiday arrival dates, postal and email addresses and telephone numbers. Our investigations have not found any evidence of fraudulent activity related to this event, but our data security experts will continue to work around the clock and have improved a number of our security processes.”

    No financial data compromised

    Although it may be comforting to know that payment details were not compromised, this type of data breach can have wide-reaching implications. Customers may no longer wish to take their holiday and leave their homes unattended, knowing that their holiday arrival dates and postal addresses might have been compromised. Cancelling a holiday, requesting a refund and possibly making new arrangements is time-consuming and disappointing. There’s also that uneasy feeling knowing that your details are ‘out there’ and could be used in other ways.

    At a company level, as well as loss of revenue through cancellations, reputational damage and loss of trust can have long-term consequences especially for an organisation known for being family-focused. Butlin’s will need a business continuity plan to minimise the damage.

    Greatest single point of weakness

    This data breach might have been avoided with staff training and awareness, but phishing attacks are on the rise and are increasingly sophisticated, which makes them harder to spot especially if people are busy and don’t really look before opening an attachment or clicking a link. So, what can be done to ensure your systems are as safe as they can be, and that you are prepared to deal with a breach?

    Cyber attack preparation

    To help prepare for a cyber attack, organisations should implement an ISMS (information security management system). ISO 27001 is the international standard that describes best practice for an ISMS. Achieving certification to ISO 27001 demonstrates to existing and potential customers that an organisation has defined and put in place best-practice information security measures and processes.

    How vsRisk helps organisations prepare for ISO 27001 certification

    You could invest time, effort and money in designing and deploying a manual risk assessment methodology yourself, or have a consultant design and deploy one for you. Or save yourself a lot of time (up to 80%) and money by deploying our risk assessment software tool, vsRisk, instead.

    Save time and money this September with Vigilant Software

    We have special offers on our software tool vsRisk: purchase the ISO 27001 ISMS Documentation Toolkit, vsRisk and one year’s support to save up to £400.

    With vsRisk you can produce consistent, robust and reliable risk assessments year after year. It is fully aligned with ISO 27001 and will help you systematically identify, evaluate and analyse risks without feeling overwhelmed.

    The offers are: vsRisk Standalone package is now only £2,040, saving £350

    With this package you have access to vsRisk Standalone (single user), the ISO 27001 ISMS Documentation Toolkit and one year’s support.

    vsRisk Multi-user package is now only £3,790, saving £400

    With this package you have access to vsRisk Multi-user (for up to ten users), the ISO 27001 ISMS Documentation Toolkit and one year’s support.

    These offers are available until the end of September. The ISO 27001 ISMS Documentation Toolkit contains useful documents that can help you with policies, procedures, processes, work instructions, forms and records. It is based on real-world best practice and experience that will significantly aid implementation projects.

    Buy online today and save!

    For further information and to sign up for a demo, please click here .

    *** This is a Security Bloggers Network syndicated blog from Vigilant Software Blog authored by Nicholas King. Read the original post at: https://www.vigilantsoftware.co.uk/blog/butlins-data-breach-affects-34000-customers/


              Report Finds Government and Military Employees Use Weak Passwords

    WatchGuard Technologies' Internet Security Report for Q2 2018 states that, based on an analysis of the data leaked from LinkedIn in 2012, more than 50% of military and government employees use weak passwords.

    According to their research, after analyzing passwords associated with 355,023 government (.gov) and military (.mil) accounts from a database of 117 million hashed passwords stolen from LinkedIn, over 50% of them were crackable in less than two days.

    Furthermore, even though all government security training programs ask employees to use complex passwords to avoid providing hackers with an easy-to-exploit attack vector, the most common passwords throughout the analyzed database were "123456," "password," "linkedin," "sunshine," and "111111."
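
    The 2012 LinkedIn dump stored passwords as unsalted SHA-1 hashes, which is why such common choices fall so quickly: an attacker hashes a wordlist once and simply looks the leaked hashes up. A minimal sketch, using just the five passwords named above as the wordlist:

```python
import hashlib

# The five most common passwords named in the report.
COMMON_PASSWORDS = ["123456", "password", "linkedin", "sunshine", "111111"]

def crack_unsalted_sha1(leaked_hashes):
    """Return {hash: plaintext} for every leaked hash whose plaintext is in
    the wordlist. One hashing pass over the wordlist is enough because the
    hashes are unsalted, so identical passwords share one hash."""
    table = {hashlib.sha1(p.encode()).hexdigest(): p for p in COMMON_PASSWORDS}
    return {h: table[h] for h in leaked_hashes if h in table}

if __name__ == "__main__":
    leaked = [hashlib.sha1(b"linkedin").hexdigest(),
              hashlib.sha1(b"correct horse battery staple").hexdigest()]
    # Only the dictionary word is recovered; the long passphrase survives.
    print(crack_unsalted_sha1(leaked))
```

    Real cracking rigs run the same idea over billions of candidates with GPUs and rule-based mutations, which is how 178,580 of these accounts fell in under two days.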

    Granted, the dataset analyzed by the Threat Lab team comes from a six-year-old leak published online two years ago, but knowing what other research teams have found out about the passwords exposed in multiple other leaks in the past few years, the statistics most probably still hold up.

    Researchers advise organisations to implement multi-factor authentication solutions

    The WatchGuard researchers also say that if the chosen passwords were at least medium-strength and not your run-of-the-mill "security codes," the time needed to crack them would have increased exponentially, from a few hours to weeks and even years for strong passwords.
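
    That exponential gap is plain keyspace arithmetic: the worst-case time to exhaust a random password is alphabet_size ** length divided by the guess rate. A quick illustration (the 10-billion-guesses-per-second rate is an assumed figure for an offline attack on a fast hash, not from the report):

```python
def worst_case_seconds(alphabet_size: int, length: int, guesses_per_sec: float) -> float:
    """Time to exhaust the full keyspace of a random password."""
    return alphabet_size ** length / guesses_per_sec

RATE = 1e10  # assumed guesses/second; real rigs vary enormously

# 8 lowercase letters: ~21 seconds to exhaust the keyspace.
print(worst_case_seconds(26, 8, RATE))
# 12 mixed-case letters and digits: ~10,000 years.
print(worst_case_seconds(62, 12, RATE) / (86400 * 365))
```

    Adding four characters and widening the alphabet turns seconds into millennia, which is the whole argument for length and variety over cleverness.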

    "These findings further illustrate the need for stronger passwords for everyone, and a higher standard for security among public service employees that handle potentially sensitive information," says WatchGuard's report .

    Moreover, the research team adds that besides better training of government employees in choosing stronger passwords, both state and privately-held organizations must use multi-factor authentication to bring down the prevalence of security incidents due to brute force attacks.
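
    The most widely deployed second factor is a time-based one-time password (TOTP, RFC 6238), which derives a short code from a shared secret and the current 30-second interval, so a stolen password alone no longer grants access. A compact sketch of the algorithm:

```python
import base64, hmac, struct, time

def totp(secret_b32, at=None, digits=6, step=30):
    """RFC 6238 TOTP: HOTP (RFC 4226) over the current 30-second counter,
    using HMAC-SHA1 and dynamic truncation."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

if __name__ == "__main__":
    # RFC 6238 Appendix B test vector: ASCII secret "12345678901234567890",
    # time 59 s, 8 digits -> "94287082".
    secret = base64.b32encode(b"12345678901234567890").decode()
    print(totp(secret, at=59, digits=8))  # -> 94287082
```

    A verifier typically accepts the codes for the current and adjacent time steps to tolerate clock drift; production systems should also rate-limit code attempts.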

    WatchGuard also found that over 75% of all malware attacks are performed over the web via HTTP/HTTPS, with brute force login attacks in fourth place.

    Threat actors use substantial numbers of login attempts in the hope of breaking into Internet-facing systems, which can lead to disastrous credential exfiltration and significant losses over time.
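
    On the defender's side, the cheapest mitigation against online guessing (short of MFA) is throttling: lock an account after a handful of failures inside a sliding window. A minimal in-memory sketch; the thresholds are illustrative, and a real deployment would persist counters and throttle per source IP as well as per account:

```python
import time
from collections import defaultdict, deque

class LoginThrottle:
    """Sliding-window failed-login counter: lock an account out once it has
    max_failures failures within the last window_s seconds."""
    def __init__(self, max_failures=5, window_s=300.0):
        self.max_failures = max_failures
        self.window_s = window_s
        self._failures = defaultdict(deque)   # user -> failure timestamps

    def record_failure(self, user, now=None):
        self._failures[user].append(time.time() if now is None else now)

    def is_locked(self, user, now=None):
        now = time.time() if now is None else now
        q = self._failures[user]
        while q and now - q[0] > self.window_s:   # forget stale failures
            q.popleft()
        return len(q) >= self.max_failures

if __name__ == "__main__":
    t = LoginThrottle(max_failures=3, window_s=60)
    for ts in (0, 1, 2):
        t.record_failure("alice", now=ts)
    print(t.is_locked("alice", now=3))    # True: 3 failures inside the window
    print(t.is_locked("alice", now=120))  # False: the failures have aged out
```

    Throttling slows a brute-force deluge to a crawl, but it does nothing against offline cracking of a stolen database, which is why it complements rather than replaces MFA.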


              UIDAI dismisses reports about the alleged hack of Aadhaar enrolment software

    The Unique Identification Authority of India (UIDAI) on 11 September dismissed reports of hacking of Aadhaar enrolment software as 'completely incorrect and irresponsible' and said some vested interests were deliberately trying to create confusion among people.

    The denial came after an investigation by HuffPost India revealed that the Aadhaar database, which contains the biometrics and personal information of over one billion Indians, ' had been compromised by a software patch which disables critical security features of the software used to enrol new Aadhaar users'.

    According to the report, any unauthorised person from anywhere in the world can generate Aadhaar ID using the patch which is freely available for Rs 2,500.



    Woman using an iris scanner for UIDAI Aadhaar registration. Image: Reuters

    The UIDAI said the claims about Aadhaar being vulnerable to tampering lacked substance and were totally baseless.

    "Certain vested interests are deliberately trying to create confusion in the minds of people which is completely unwarranted," a statement issued by the organisation said.

    It added that the UIDAI matches all the biometrics (10 fingerprints and both irises) of a resident enrolling for Aadhaar with the biometrics of all Aadhaar holders before issuing the unique ID.

    "UIDAI has taken all necessary safeguard measures spanning from providing standardized software that encrypts entire data even before saving to any disk, protecting data using tamper proofing, identifying every one of the operators in every enrolment, identifying every one of thousands of machines using a unique machine registration process, which ensures every encrypted packet is tracked," the statement said.

    #PressStatement

    UIDAI hereby dismisses a news report appearing in social and online media about Aadhaar Enrolment Software being allegedly hacked as completely incorrect and irresponsible. 1/n

    ― Aadhaar (@UIDAI) 11 September 2018

    It said all measures to ensure end-to-end security of resident data were taken including full encryption of resident data at the time of capture, tamper resistance, physical security, access control, network security, stringent audit mechanism, round the clock security and fraud management system monitoring.

    Earlier, a report by HuffPost said a software patch available for as little as Rs 2,500 lets a user bypass critical security features such as biometric authentication of enrolment operators to generate unauthorised Aadhaar numbers. It said the patch also disables the GPS security feature of the software allowing anyone from any location to enrol users.

    UIDAI clarified that no operator can make or update Aadhaar unless the resident himself gives his biometrics.

    "Any enrolment or update request is processed only after biometrics of the operator is authenticated and resident's biometrics is de-duplicated at the backend of UIDAI system," it said.

    It added that as part of its 'stringent' enrolment and updation process, UIDAI checks the enrolment operator's biometrics and other parameters before processing the enrolment or update, and only after all checks are found to be successful is the enrolment or update of a resident further processed.

    "Therefore it is not possible to introduce ghost entries into Aadhaar database."

    UIDAI said that even in a hypothetical situation where a ghost enrolment or update packet is sent to the UIDAI by some 'manipulative attempt', the same is identified by the robust back-end system and all such enrolment packets get rejected and no Aadhaar is generated.

    "Also, the concerned enrolment machines and the operators are identified, blocked and blacklisted permanently from the UIDAI system. In appropriate cases, police complaints are also filed for such fraudulent attempts," it said.

    If an operator is found violating UIDAI’s strict enrolment and update processes or if one indulges in any type of fraudulent or corrupt practices, UIDAI blocks and blacklists them and imposes financial penalty upto Rs.1 lakh per instance. 20/n

    ― Aadhaar (@UIDAI) September 11, 2018

    UIDAI said that the reported claim of "anybody is able to create an entry into Aadhaar database, then the person can create multiple Aadhaar cards" is completely false.

    "If an operator is found violating UIDAI's strict enrolment and update processes or if one indulges in any type of fraudulent or corrupt practices, UIDAI blocks and blacklists them and imposes financial penalty upto Rs 1 lakh per instance. It is because of this stringent and robust system that as on date more than 50,000 operators have been blacklisted," UIDAI added.

    It said that it keeps adding new security features in its system as required from time-to-time to thwart new security threats by unscrupulous elements.


              New Security Research Reveals Password Inadequacy Still a Top Threat

    WatchGuard’s Q2 2018 Internet Security Report uncovers heightened use of credential-focused attacks and continued prevalence of malicious Office documents

    12 September 2018. New research from the WatchGuard Threat Lab shows the emergence of the Mimikatz credential-stealing malware as a top threat and the growing popularity of brute force login attacks against web applications. The research also reveals that 50 percent of government and military employee LinkedIn passwords, largely from the US, were weak enough to be cracked in less than two days, underscoring the reality that passwords alone can’t offer sufficient protection and the need for multi-factor authentication (MFA) solutions. WatchGuard’s Internet Security Report for Q2 2018 explores the latest security threats affecting small to midsize businesses (SMBs) and distributed enterprises.



    Corey Nachreiner

    “Authentication is the cornerstone of security and we’re seeing overwhelming evidence of its critical importance in the common trend of password- and credential-focused threats throughout Q2 2018,” said Corey Nachreiner, chief technology officer at WatchGuard Technologies. “Whether it’s an evasive credential-stealing malware variant or a brute force login attack, cyber criminals are laser-focused on hacking passwords for easy access to restricted networks and sensitive data. At WatchGuard, these trends are driving new innovative defences within our product portfolio, including AuthPoint, our Cloud-based multi-factor authentication solution and our IntelligentAV service, which leverages three malware detection engines to prevent malware strains that evade traditional signature-based antivirus products. Every organisation should seek out vendor and solution provider partners that offer layered protection against these ever-evolving attack techniques.”

    The insights, research and security best practices included in WatchGuard’s quarterly Internet Security Report are designed to help organisations of all sizes understand the current cyber security landscape and better protect themselves, their partners and customers from emerging security threats. The top takeaways from the Q2 2018 report include:

    Mimikatz was the most prevalent malware variant in Q2. Representing 27.2 percent of the top 10 malware variants listed last quarter, Mimikatz is a well-known password and credential stealer that has been popular in past quarters but has never been the top strain. This surge in Mimikatz’s dominance suggests that authentication attacks and credential theft are still major priorities for cyber criminals: another indicator that passwords alone are inadequate as a security control and should be fortified with MFA services that make hackers’ lives harder by requiring additional authentication factors in order to successfully log in and access the network.

    Roughly half of government and military employee passwords are weak. After conducting a thorough analysis of the 2012 LinkedIn data dump to identify trends in user password strength, WatchGuard’s Threat Lab team found that half of all passwords associated with “.mil” and “.gov” email address domains within the database were objectively weak. Of the 355,023 largely US government and military account passwords within the database, 178,580 were cracked in under two days. The most common passwords used by these accounts included “123456,” “password,” “linkedin,” “sunshine,” and “111111.” Conversely, the team found that just over 50 percent of civilian passwords were weak. These findings further illustrate the need for stronger passwords for everyone, and a higher standard for security among public service employees that handle potentially sensitive information. In addition to better password training and processes, every organisation should deploy multi-factor authentication solutions to reduce the risk of a data breach.

    More than 75 percent of malware attacks are delivered over the web. A total of 76 percent of threats from Q2 were web-based, suggesting that organisations need an HTTP and HTTPS inspection mechanism to prevent the vast majority of attacks. Ranked as the fourth most prevalent web attack in particular, “WEB Brute Force Login -1.1021” enables attackers to execute a massive deluge of login attempts against web applications, leveraging an endless series of random combinations to crack user passwords in a short period of time. This attack in particular is another example of cyber criminals’ heightened focus on credential theft and shows the importance of not only password security and complexity, but the need for MFA solutions as a more effective preventative measure.

    Cryptocurrency miners earn spot as a top malware variant. As anticipated, malicious cryptominers are continuing to grow in popularity as a hacking tactic, making their way into WatchGuard’s top 10 malware list for the first time in Q2. Last quarter, WatchGuard uncovered its first named cryptominer, Cryptominer.AY, which matches a JavaScript cryptominer called “Coinhive” and uses its victims’ computer resources to mine the popular privacy-focused cryptocurrency, Monero (XMR). The data shows that victims in the United States were the top geographical target for this cryptominer, receiving approximately 75 percent of the total volume of attacks.

    Cyber criminals continue to rely on malicious Office documents. Threat actors continue to booby-trap Office documents, exploiting old vulnerabilities in the popular Microsoft product to fool unsuspecting victims. Interestingly, three new Office malware exploits made WatchGuard’s top 10 list, and 75 percent of attacks from these exploits targeted EMEA victims, with a heavy focus on users in Germany specifically.

    The complete Internet Security Report features an in-depth analysis of the EFail encryption vulnerability, along with insights into the top attacks in Q2 and defensive strategies SMBs can use to improve their security posture. These findings are based on anonymized Firebox Feed data from nearly 40,000 active WatchGuard UTM appliances worldwide, which blocked nearly 14 million malware variants (449 per device) and more than 1 million network attacks (26 per device) in Q2 2018.

    For more information, download the full report here https://www.watchguard.com/wgrd-resource-center/security-report-q2-2018 . To access live, real-time threat insights by type, region and date, visit WatchGuard’s Threat Landscape data visualization tool today. Subscribe to The 443 Security Simplified podcast at Secplicity.org , or wherever you find your favorite podcasts.

    About WatchGuard Technologies

    WatchGuard Technologies, Inc. is a global leader in network security, secure Wi-Fi, multi-factor authentication, and network intelligence. The company’s award-winning products and services are trusted around the world by nearly 10,000 security resellers and service providers to protect more than 80,000 customers. WatchGuard’s mission is to make enterprise-grade security accessible to companies of all types and sizes through simplicity, making WatchGuard an ideal solution for distributed enterprises and SMBs. The company is headquartered in S
              Generally Disclosing Pretty Rapidly: GDPR strapped a jet engine on hacked Britis ...

    Analysis: If Equifax's mother-of-all-security-disasters last year underlined one thing, it was that big companies think they can weather just about anything cybercriminals and regulators can throw at them.

    One unpatched web server, 147 million mostly US customer records swiped, and a political beating that should pulverise a company’s reputation for good (“one of the most egregious examples of corporate malfeasance since Enron,” said US Senate Democratic leader Chuck Schumer), and yet Equifax is not only still standing but perhaps even thriving.

    While it’s true the full financial consequences are yet to unfold, it’s hard not to notice that its shares last week rode back to within spitting distance of where they were before the breach was made public.

    It all stands in fascinating contrast to what is happening in the UK and Europe, where the mood over database security breaches is darkening. It’s not that there are necessarily more of them so much as the speed with which they are being revealed.

    Last week’s British Airways hack makes an interesting case study, not simply because of the technically embarrassing fact that cybercriminals were able to skim up to 380,000 transactions in real time, but because of the speed with which the company owned up to the calamity.

    Confessions

    According to BA, the attack began at 22:58 BST on August 21, and was stopped at 21:45 BST on September 5. This meant BA had taken 15 days to notice hackers were grabbing its customers’ card numbers, but under 24 hours to tell the world via Twitter and email: a contender for a world record for computer security breach confessions.

    Security analysts RiskIQ have speculated that the same gang was behind June’s Ticketmaster web breach, which took a still fairly rapid five days to surface after being discovered on June 23. Perhaps the best example of how the security breach atmosphere is changing is T-Mobile US, which uncovered miscreants slurping account records of 2.2 million customers on August 20 and revealed that fact only four days later.

    Compare this haste to Equifax, which detected its breach on July 29 last year, but only told the world months later on September 7.

    Why the sudden hurry? In the case of BA, officially, the answer is Article 33 of Europe's GDPR, under which cyber-break-ins involving personal data must be reported within 72 hours. Security breaches are now understood as having their own lifecycle. At the user end, a recent report from EMW Law LLP found that complaints to the UK's Information Commissioner after May’s GDPR launch reached 6,281, a doubling compared to the same period in 2017.



    “This is definitely due to the awareness and the run up to the GDPR,” agreed Falanx Group senior data protection and privacy consultant Lillian Tsang. But there’s more to it than that. “Reporting a breach shows awareness, the notion of “doing” something even if the breach cannot be mitigated quick enough. It does show pragmatism, rather than a reactive stance of yesteryears.”

    Breaches will never become just another battle scar to be marked up to experience; they are too serious and expensive for that, no matter what the shareholders think when share prices recover. What is becoming stressful is the speed of disclosure.

    “Crisis management is a relatively new yet vitally important area to focus on. As more chief staff realise that it’s a case of when rather than if a breach occurs, it is highly possible that more businesses have a ready-made crisis procedure waiting for a potential strike,” said ESET security specialist, Jake Moore.

    As the breaches keep coming however, he believes an example will eventually be made of someone. “The ICO are likely to want to stick the GDPR message to a high-profile company to show its magnitude and therefore companies are ready to show that they are more compliant than ever before.”

    It could be that BA’s rapid breach disclosure has set the benchmark at the sort of uncomfortable standard many, including its competitors, will struggle to match.



              Jewish collector’s descendant gets Nazi-looted Renoir back

    NEW YORK – The granddaughter of a Jewish art collector whose paintings were stolen by the Nazis had a family reunion with one of the works on Wednesday after almost eight decades, an impressionist piece by Pierre-Auguste Renoir.

    Sylvie Sulitzer saw “Two Women in a Garden” for the first time at New York’s Museum of Jewish Heritage after unveiling it at a ceremony that included law enforcement officials representing the offices that helped get the painting back to her, her grandparents’ only living descendant.

    “I’m very thankful to be able to show my beloved family, wherever they are, that after what they’ve been through, there is justice,” Sulitzer said tearfully.

    The reunion, though, will probably be short-lived. She will likely auction off the painting to pay back compensation she previously got for missing artwork.

    She was joined by Geoffrey Berman, the U.S. attorney for Manhattan, and William Sweeney Jr., the assistant director in charge of the New York office of the FBI.

    Sulitzer’s grandfather, Alfred Weinberger, was an art collector in Paris. Sulitzer said he fled the city to avoid being pressed into service by the Nazis for his art expertise.

    He put some of his paintings in a bank vault before fleeing the Nazis, who took possession of the works in December 1941. The Nazis made a regular practice of looting artworks and other items of cultural and financial significance, and in the decades since World War II, efforts have been made to find the objects and return them to their owners if possible, with varying levels of success.

    Weinberger died when Sulitzer, now 59, was a teenager, without ever getting the Renoir and a handful of other paintings returned to him. She had no idea of the paintings’ existence, Sulitzer said, since they weren’t discussed in her family.

    “The war was a taboo subject; we never talked about that,” said Sulitzer, who owns a delicatessen in the south of France near where she lives in Roquevaire.

    But Weinberger had registered his missing property with authorities, and it was included in a database that had gone online in 2010 of looted art, based on records compiled by the Nazis themselves of what they had amassed.

    Sulitzer learned in 2013 that the painting, which had surfaced periodically through the decades at various auctions, was once again up for auction. Her attorneys contacted the auction house, which in turn went to the FBI division that looks into situations of this sort.

    The painting had been all over the world in the years since the Nazis took hold of it, including Johannesburg, London and Zurich, said Sweeney.

    “The extraordinary journey this small work of art has made around the globe and through time ends today,” he said.

    The owner of the piece voluntarily gave it up to be returned to Sulitzer, officials said.

    The painting is on display at the museum through Sunday and will go back to Sulitzer’s possession after that. She wasn’t sure for how long, though – she has to pay back some money from the French and German governments she got in connection with the stolen works, since one of them has been returned, and she said she can’t afford to and will likely auction off the painting.

    Despite that, she said, she was thrilled to have it back, saying it was important for the memory of her family, and she thought her grandfather would consider it justice.

    “I would have loved him to be here, instead of me,” Sulitzer said.

    She wished other families looking for their own lost works to be as lucky as she has been.

    “I hope everybody will, one day or another, have the justice as I had,” she said.


              Database Technologies - Accenture - Bengaluru, Karnataka
    Accenture Technology powers our clients’ businesses with innovative technologies—established and emerging—changing the way their people and customers experience...
    From Accenture - Wed, 12 Sep 2018 19:46:48 GMT - View all Bengaluru, Karnataka jobs
              stellarium 0.18.2-2 x86_64
    A stellarium with great graphics and a nice database of sky-objects
              Fragment – Running Multiple Services, such as Jupyter Notebooks and a Postgres Database, in a Single Docker Container
    Over the last couple of days, I’ve been fettling the build scripts for the TM351 VM, which typically uses vagrant to build a VirtualBox VM from a set of shell scripts, so they can be used to build a single Docker container that runs all the TM351 services, specifically Jupyter notebooks, OpenRefine, PostgreSQL and MongoDB. Docker […]
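
    The usual trick for packing several services into one container is a small supervisor process running as PID 1 that starts everything and exits when any child dies (supervisord is the conventional choice; the sketch below is a hand-rolled Python equivalent, and the service commands shown are illustrative rather than the actual TM351 invocations):

```python
import subprocess
import time

def supervise(commands):
    """Start every command, block until the first one exits, then terminate
    the rest and return that first exit code, so the container stops as a
    unit instead of limping on with a dead service."""
    procs = [subprocess.Popen(cmd) for cmd in commands]
    try:
        while True:
            for p in procs:
                code = p.poll()
                if code is not None:
                    return code
            time.sleep(0.2)
    finally:
        for p in procs:
            if p.poll() is None:
                p.terminate()
                p.wait()

# A container entrypoint would call something like (paths are hypothetical):
#   supervise([
#       ["postgres", "-D", "/var/lib/postgresql/data"],
#       ["jupyter", "notebook", "--ip=0.0.0.0", "--no-browser"],
#   ])
```

    Compared with one-process-per-container orthodoxy this trades isolation for convenience, which is a reasonable bargain for a self-contained teaching VM image.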
              Cruising Systems Analyst - BC Ministry of Forests, Lands, Natural Resource Operations and Rural Development - Victoria, BC
    Tourism & Immigration. Preference may be given to candidates with experience with cruise compilation programs and Ministry databases such as ECAS, FTA, RESULTS,... $56,479 - $64,338 a year
    From Canadian Forests - Wed, 29 Aug 2018 03:13:24 GMT - View all Victoria, BC jobs
              Syndicate Coordinator
    Requisition ID: 33602

    Join the Global Community of Scotiabankers to help customers become better off.

    Day-to-day responsibilities
    • New issue administrator responsible for deal documentation, deal entry, and deal co-ordination.
    • Updating Microsoft Excel and Access based models, and emailing marketing material internally/externally.
    • Assist in the dissemination of the desk’s marketing initiatives (internal and external).
    • Setting up daily conference calls
    • Act as ‘investor - issuer relations’ coordinator for roadshows and presentations, from concept stage to execution.
    • Producing PowerPoint presentations, and taking initiative in designing slides and graphics to best represent data
    • Design queries to extract info and derive conclusions from raw data
    • Maintain databases and scheduled tasks
    • Perform daily account reconciliations and other administrative tasks tied to accounting and settlement
    • Prepare
              Vacancies at Workforce Management and Consultancy, September 2018
    Job Description
    Job title: Accountant
    Location: Tanzania
    Industry: Transport and Logistics

    Key Responsibilities

    • Accurate processing of invoices
    • Track expenses and process expense reports
    • Prepare and process electronic transfers and payments
    • Reconcile accounts payable and accounts receivable transactions
    • Monitor accounts to ensure payments are up to date
    • Maintain vendor files, correspond with vendors and respond to inquiries

    • Produce monthly reports, and assist with month end closing
    • Develop and Monitor all internal control procedures

    • Provide supporting documentation for audits
    • Post customer payments by recording cash and cheque transactions.
    • Ensure timely collection of payments
    • Prepare monthly, quarterly, annual and ad-hoc forecasting reports
    • Organize records of invoices, bills and deposits
    • Updating customer records and issuing monthly customer statements
    • Ensure high-quality invoicing and collection procedures that comply with the law
    Academic qualifications and Work experience

    Essential
    • BCom degree in Accounting/Finance.
    • ACCA, CPA holder
    • At least 3 years’ of proven work experience as an Accounts Receivable Clerk
    • Hands-on experience with accounting software
    • Familiarity with advanced formulas in MS Excel

    Skills
    • Trustworthiness, and have a sense of integrity
    • Attention to detail, strong analytical and problem-solving skills
    • Proactive, confident, assertive, to have team management and negotiation skills
    • Strong people management
    • Ability to work under pressure and within deadlines time.
    • Strong communication and administration skills
    • Independent worker
    • Computer Literacy

    Applications: Send your CV to; cv@workforceconsult.com

    Deadline: 19th September 2018
    ===========

    Job Description
    Job title: Human Resources and Administrative Officer.
    Location: Dar-es-Salaam
    Industry: Transportation and Logistics

    Role Purpose
    • To assist in coordinating all day-to-day activities in the area and provide support for efficient operations of the business.
         
    Key Responsibilities
    • Ensuring the company complies with Employment and Labour Relations Act
    • Drawing up, negotiating and administering employee contracts
    • Overseeing orientation and on the job staff development and training 
    • Maintaining HR and employee records while ensuring database quality and accuracy
    • Oversee the company’s recruitment, interview, selection and hiring Process.
    • Administering payroll in close liaison with the Accounts Department.

    • Overseeing staff welfare and benefit programs.
    • Handling staff grievance and disciplinary matters
    • Must be able to provide equality and diversity as part of the culture of the company
    • Conducting performance appraisal to assess employee productivity
    • Preparing Human Resources reports for the Area
    • Overseeing employees’ health and safety.
    • Performing any other duties assigned by the Management.
    • Providing Administrative support to the company

    Academic qualifications
    Essential
    • Bachelor’s degree in Human resources or related field.

    Work Experience and Skills
    • 2+ years’ experience in related field.
    • Self-motivated, accurate and detail-oriented
    • Must possess excellent communication, organization and prioritization skills
    • Critical thinking and problem-solving skills
    • Comfortable with Microsoft Office
    • Great interpersonal skills
    • Knowledge of Tanzania Labour Law
    • Have the ambition to help drive the growth of a young company.



    Applications: Send your CV to; cv@workforceconsult.com

    Deadline: 19TH September 2018
    ==========

    Job Description
    Job title: Operations Manager
    Location: Tanzania
    Industry: Transportation and Logistics

    Key Responsibilities
    • Maintain absolute customer focus, keeping the customer fully informed on progress and to effectively manage their expectations through accurate tracking reports and constant liaison which is relevant, reliable, reactive and convenient.
    • Deal with customer complaints rapidly and professionally.
    • Accurate Journey Management financial control and planning ensuring timely cash flow to drivers and all associated service providers.

    • Accurate fuel allocation and accounting in accordance with Company Procedures.
    • Ensure drivers are correctly briefed, managed and led throughout the journey and all security risks are assessed.
    • Ensure clear and concise vehicle fault reporting, ensure immediate and thorough accident/incident investigations are carried out.
    • Diligent driver management, ensuring driver KPIs are accurately assessed and recorded, while ensuring all areas for improvement are fully investigated and rectified in accordance with company procedures.

    • Trip monitoring and vehicle tracking is to be carried out diligently and consistently, ensuring there are no avoidable delays and drivers are well informed on their progress and adherence to the journey management plan.
    • SAP and TMS online management systems are to be completed correctly, updated accurately and on time.
    • Ensure all trips are legally compliant, all documentation requirements are met and documentation required for invoicing is processed correctly.
    Academic qualifications and Work experience

    Essential
    • Bachelor's Degree in any relevant discipline 
    • 5 + years of experience in Transport and Logistics service support
    • Experience in Sales, working within logistics, transportation, and fleet management.

    Skills
    • Strong Project management skills
    • Leadership and Problem-solving skills
    • Excellent, interpersonal, communicative and negotiation skills
    • Proficient in Microsoft Office software including Excel, Word, and PowerPoint
    • Proficient at equipment distribution and warranty
    • Exceptional team development and organizational skills
    • Health and Safety accreditations 

    Applications: Send your CV to; cv@workforceconsult.com

    Deadline: 19TH September 2018
    ===========

    Job Description
    Job title: Workshop Manager
    Location: Dar-es-Salaam
    Industry: Transportation and Logistics

    Key Responsibilities
    • Prepare daily reports and supervise mechanics and other subordinates
    • Ensure adequate inventory of physical equipment and supplies, and maintain records
    • Ensure the smooth execution of the day to day operations of the Workshop, including turnaround and quality
    • Increase productivity, assess capacity and flexibility while minimizing unnecessary costs and maintaining high quality standards.

    • Ensuring the fleet is maintained in compliance with Company Fleet Management Standards
    • Provide expert and accurate advice on technical matters
    • Adhere to all compliance requirements including supporting paperwork
    • Manage all aspects of customer service to achieve quality outcomes
    • Manage the communication flow between direct reports and management and ensure people management policies are applied

    • Champion a safety culture within the team and drive zero incidents in the workplace
    Work Experience and Skills
    • 5+ years’ experience in Vehicle/Fleet Maintenance
    • Leading and supervising skills
    • Ability to operate all tools and equipment involved.

    • Solid knowledge of and experience with Trucks and Trailer maintenance
    • Demonstrated understanding, and the ability to apply quality control techniques
    • Prior experience leading and managing a team in a similar environment
    • Sound computer knowledge.

    Applications: Send your CV to; cv@workforceconsult.com

    Deadline: 19TH September 2018

              Credit Analyst Sr - United Bank - Morgantown, WV      Cache   Translate Page      
    Perform online research of public records including, but not limited to, assessors databases, county register’s databases, and Secretary of State database for...
    From United Bank - Tue, 31 Jul 2018 11:24:55 GMT - View all Morgantown, WV jobs
              Database Administrator - iQmetrix - Winnipeg, MB      Cache   Translate Page      
    Flexibility and the ability to adapt to an evolving environment will go a long way at iQmetrix. iQmetrix has rated among the Top 50 Best Small & Medium...
    From iQmetrix - Wed, 25 Jul 2018 22:31:15 GMT - View all Winnipeg, MB jobs
              QA Inspector - ICS/QAD/QAI/ES - ST Electronics (Info-comm Systems) Pte Ltd - Ang Mo Kio      Cache   Translate Page      
    Posting of inspection results to SAP & IQC database. Perform IQC inspection & testing of incoming parts....
    From Singapore Technologies Electronics - Fri, 06 Jul 2018 06:27:56 GMT - View all Ang Mo Kio jobs
              Database Administrator - iQmetrix - Regina, SK      Cache   Translate Page      
    Last year’s iQmetrix Odyssey trip. Top 10 Reasons to Join iQmetrix (view the full list). Our YouTube channel including numerous videos on iQmetrix and what we...
    From iQmetrix - Wed, 25 Jul 2018 16:31:10 GMT - View all Regina, SK jobs
              Production Planning/Procurement Coordinator - Samuel, Son & Co. - Cambridge, ON      Cache   Translate Page      
    Experience in manufacturing indirect supply chain. Knowledge of Microsoft Office applications and databases....
    From Indeed - Tue, 28 Aug 2018 16:29:19 GMT - View all Cambridge, ON jobs
              Moon rock hunter closes in on tracking down missing stones      Cache   Translate Page      

    SALT LAKE CITY – A strange thing happened after Neil Armstrong and the Apollo 11 crew returned from the moon with lunar rocks: Many of the mementos given to every U.S. state vanished. Now, after years of sleuthing, a former NASA investigator is closing in on his goal of locating the whereabouts of all 50.

    In recent weeks, two of the rocks that disappeared after the 1969 mission were located in Louisiana and Utah, leaving only New York and Delaware with unaccounted-for souvenirs.

    Attorney and moon rock hunter Joseph Gutheinz says it “blows his mind” that the rocks were not carefully chronicled and saved by some of the states that received them. But he is hopeful the last two can be located before the 50th anniversary of the Apollo 11 mission next summer.

    “It’s a tangible piece of history,” he said. “Neil Armstrong’s first mission … was to reach down and grab some rocks and dust in case they needed to make an emergency takeoff.”

    President Richard Nixon’s administration presented the tiny lunar samples to all 50 states and 135 countries, but few were officially recorded and most disappeared, Gutheinz said.

    Each state got a tiny sample encased in acrylic and mounted on a wooden plaque, along with the state flag. Some were placed in museums, while others went on display in state capitols.

    But almost no state entered the rocks collected by Armstrong and fellow astronaut Buzz Aldrin into archival records, and Gutheinz said many lost track of them.

    When Gutheinz started leading the effort to find them in 2002, he estimates, 40 states had lost track of the rocks.

    “I think part of it was, we honestly believed that going back to the moon was going to be a regular occurrence,” Gutheinz said.

    But there were only five more journeys before the last manned moon landing, Apollo 17, in 1972.

    Of the Apollo 11 rocks given to other countries, about 70 percent remain unaccounted for, he said.

    The U.S. government also sent out a second set of goodwill moon rocks to the states and other nations after the Apollo 17 mission, and many of those are missing as well, he said.

    NASA did not track their whereabouts after giving them to the Nixon administration for distribution, said chief historian Bill Barry, but added the space agency would be happy to see them located.

    Gutheinz began his career as an investigator for NASA, where he found illicit sellers asking millions for rocks on the black market. Authentic moon rocks are considered national treasures and cannot legally be sold in the U.S., he said.

    He became aware while at NASA that the gifts to the states were missing, but only began his hunt after leaving the agency.

    Now a lawyer in the Houston area, he’s also a college instructor who’s enlisted the help of his students. They record their findings of the whereabouts of the discovered moon gems in a database.

    Many of the Apollo 11 rocks have turned up in unexpected places: with ex-governors in West Virginia and Colorado, in a military-artifact storage building in Minnesota and with a former crab boat captain from TV’s “Deadliest Catch” in Alaska.

    In New York, officials who oversee the state museum have no record of that state’s Apollo 11 rock. In Delaware, the sample was stolen from its state museum on Sept. 22, 1977. Police were contacted, but it was never found.

    The U.S. Virgin Islands territory, meanwhile, cannot confirm that it ever received a goodwill rock, though the University of the Virgin Islands later received Apollo 11 rocks for scientific research, said chief conservator Julio Encarnacion III.

    In other states, though, Gutheinz has recently hit pay dirt. The Advocate newspaper in Baton Rouge located Louisiana’s Apollo 11 moon rock in early August after a call from Gutheinz.

    In Utah, the division of state history had no record of the sample, but The Associated Press confirmed it was in storage at Salt Lake City’s Clark Planetarium.

    Officials there may bring it out as part of celebrations recognizing the Apollo 11 anniversary next year, something Gutheinz hopes to see everywhere.

    “The people of the world deserve this,” he said. “They deserve to see something that our astronauts accomplished and be a part of it.”


              Blog Installation and database development      Cache   Translate Page      
    Please do chat me up for more details about the job (Budget: $750 - $1500 USD, Jobs: Database Development, Database Programming, HTML, Website Design, WordPress)
              Build me an app with a database      Cache   Translate Page      
    We want to build an employment database where carers, nurses and cleaners can upload their details about their ability to work. Details would include cv alongside other materials and their available times... (Budget: £100 - £10000 GBP, Jobs: Database Administration, Mobile App Development, MySQL, PHP, User Interface / IA)
              Romania-Arad: Medical equipment      Cache   Translate Page      
    Contracting authority: Spital Clinic Județean de Urgență Arad (Arad County Emergency Clinical Hospital), Str. Andrenyi Karoly nr. 2–4, Arad 310031; contact: Ec. Diana Lina, +40 257211233, scjuarad.bap@gmail.com; www.scjarad.ro; www1.e-licitatie.ro. Type: hospital with inpatient beds.

    Subject: purchase of medical equipment and devices for medical use (25 lots) for S.C.J.U. Arad. The purchase also covers installation, commissioning, staff training in operating the purchased equipment, and service during the warranty period. The procedure is divided into the following lots:
    — Lot 1 – Cardiotocograph (2 units — Obstetrics-Gynecology ward),
    — Lot 2 – 3D Doppler ultrasound scanner (1 unit — Cardiology ward),
    — Lot 3 – Portable ultrasound scanner (1 unit — Neurology ward),
    — Lot 4 – Vital-signs monitor (1 unit — Neurology),
    — Lot 5 – Stationary ultrasound scanner (1 unit — Endocrinology ward),
    — Lot 6 – Surgical motor set for osteosynthesis of small and medium bones (1 unit — Plastic Surgery ward),
    — Lot 7 – Electrolyte analyzer and consumables (1 unit — Diabetes ward),
    — Lot 8 – Electrocardiograph (1 unit — Diabetes ward),
    — Lot 9 – Surgical aspirator (2 units — Orthopedics ward),
    — Lot 10 – Bipolar electrocautery (2 units — Orthopedics ward),
    — Lot 11 – Portable electrocardiograph (1 unit — Orthopedics ward),
    — Lot 12 – ECG Holter (1 unit — Internal Medicine ward),
    — Lot 13 – Blood-pressure Holter (1 unit — Internal Medicine ward),
    — Lot 14 – Surgical light system (3 units — Obstetrics-Gynecology ward),
    — Lot 15 – Blood-warming thermostat (3 units — ICU ward),
    — Lot 16 – Mobile X-ray unit for newborns (1 unit — Neonatology ward),
    — Lot 17 – Steam sterilizer for baby bottles (1 unit — Neonatology ward),
    — Lot 18 – Professional camera and accessories (1 unit — Forensic Medicine ward),
    — Lot 19 – DP500 paraffin dispenser (1 unit — Forensic Medicine ward),
    — Lot 20 – Ultrasonic homogenizer (1 unit — Forensic Medicine ward),
    — Lot 21 – Full-HD closed-circuit video surveillance system (1 unit — Forensic Medicine ward),
    — Lot 22 – Syringe pump (12 units — Cardiology and ICU wards),
    — Lot 23 – Adult ICU ventilator (4 units — ICU ward),
    — Lot 24 – Adult mechanical ventilator (1 unit — ICU ward),
    — Lot 25 – Vital-signs monitor (3 units — ICU).

    Clarifications may be requested up to 15 days before the deadline for submission of offers; the contracting authority will answer all clarification requests clearly and completely by the 10th day before that deadline. Estimated value: RON 1,894,774.00. Prior notice: 2018/S 080-180106 of 25.04.2018.

    Contract awards (all dated 04.09.2018; number of offers received in parentheses; values in RON, estimated / awarded):
    — 30708, Lot 1, Cardiotocograph: Liamed S.R.L., Brașov (7) — 20,000.00 / 8,008.00
    — 30717, Lot 13, Blood-pressure Holter: Biotechnics Implant S.R.L., București (6) — 5,000.00 / 1,600.00
    — 30718, Lot 14, Surgical light system: Euroexpand Impex S.R.L., Oradea (1) — 207,000.00 / 117,000.00
    — 30719, Lot 16, Mobile X-ray unit for newborns: Medist Imaging & P.O.C. S.R.L., București (1) — 448,500.00 / 448,000.00
    — 30722, Lot 22, Syringe pumps: Fresenius Kabi România S.R.L., Brașov (7) — 50,400.00 / 43,200.00
    — 30721, Lot 19, DP500 paraffin dispenser: Tunic Prod S.R.L., București (1) — 20,000.00 / 20,000.00
    — 30716, Lot 11, Portable electrocardiograph: Rombiomedica S.R.L., București (9) — 10,500.00 / 3,565.00; S.C. Biosintex S.R.L. subcontracts 50 % (installation, commissioning, training and warranty service)
    — 30723, Lot 23, Adult ICU ventilators: Drager Medical România S.R.L., București (1) — 224,000.00 / 184,000.00
    — 30706, Lot 24, Adult mechanical ventilator: Gemedica S.R.L., București (1) — 251,000.00 / 239,000.00
    — 30724, Lot 25, Vital-signs monitors: Medicomplex S.R.L., București (6) — 31,431.00 / 31,431.00
    — 30707, Lot 2, 3D Doppler ultrasound scanner: Supermedical S.R.L., Craiova (2) — 378,000.00 / 363,000.00
    — 30709, Lot 3, Portable ultrasound scanner: Healthtim S.R.L., Timișoara (1) — 135,600.00 / 132,800.00
    — 30720, Lot 17, Steam sterilizer for baby bottles: Sterisystems S.R.L., București (3) — 121,000.00 / 120,000.00
    — 30710, Lot 4, Vital-signs monitor: Brokmed S.R.L., Sfântu Gheorghe (9) — 5,040.00 / 3,280.00
    — 30711, Lot 5, Stationary ultrasound scanner: Medist Imaging & P.O.C. S.R.L., București (4) — 138,000.00 / 107,400.00
    — 30712, Lot 6, Surgical motor set for osteosynthesis: Rafi International General Commerce S.R.L., București (3) — 58,000.00 / 52,118.00
    — 30713, Lot 7, Electrolyte analyzer and consumables: X-Lab Solutions S.R.L., Cluj-Napoca (2) — 10,440.00 / 10,390.00
    — 30714, Lot 8, Electrocardiograph: Luan Vision S.R.L., Oradea (11) — 7,500.00 / 2,650.00
    — 30715, Lot 9, Surgical aspirators: Liamed S.R.L., Brașov (5) — 9,000.00 / 7,332.00

    Notes: 1) If two or more offers share the lowest price, the tie is broken by new financial proposals from the offerors concerned; 2) The European Single Procurement Document can be completed by interested economic operators at https://ec.europa.eu/growth/toolsdatabases/espd/filter; 3) Viewing the award documentation uploaded to SEAP requires software capable of displaying electronically signed files (see the electronic-signature providers' sites); 4) Because the procedure is divided into lots with differing delivery terms, section II.3) of the procedure data sheet states a maximum contract duration, with per-lot delivery periods set out in Annex B in accordance with each product's specifications; 5) Communication rules: clarification requests regarding the award documentation must be submitted exclusively through SEAP (www.elicitatie.ro), section "Intrebari" (Questions), where answers are also published, alongside the Documentation, clarifications and decisions section of the contract notice; post-submission exchanges with the evaluation committee likewise run through SEAP in electronic form, signed with an electronic signature per Law no. 455/2001, in accordance with art. 64(1) of Law no. 98/2016 on public procurement; 6) The contracting authority reserves the right to cancel the procedure (for one or more lots) between initiation and award if circumstances arise that make it impossible to secure funding for the equipment concerned.

    Appeals body: Consiliul Național de Soluționare a Contestațiilor, Str. Stavropoleos nr. 6, sector 3, București 030084, office@cnsc.ro, +40 213104641, http://www.cnsc.ro. Further information: Spitalul Clinic Județean de Urgență Arad — Public Procurement Department, Str. Andreny Karoly nr. 2–4, Arad 310037, scjuarad.bap@gmail.com, www.scjarad.ro. 07.09.2018
              Tableau Dashboard Consultant - DeWinter Group - San Francisco, CA      Cache   Translate Page      
    Expertise in visually representing complex databases and data modeling to develop relevant charts, graphs & tabular data....
    From DeWinter Group - Thu, 13 Sep 2018 00:15:38 GMT - View all San Francisco, CA jobs
              Oilfield Chemicals Industry Key Manufacturers, Analysis and 2023 Forecasts for Global Market      Cache   Translate Page      
    (EMAILWIRE.COM, September 13, 2018 ) “Global Oilfield Chemicals Consumption Market Report 2018-2023” has been newly added to the Researchformarkets.com database. This report covers leading key company profiles with information such as business overview, regional analysis, consumption, revenue and specification. Oilfield...
              Social Services Worker - State of Wyoming - Sheridan, WY      Cache   Translate Page      
    Computer skills, including word processing and the WYCAPS database. Under close supervision, provides investigative, protective and social service intervention... $19.93 - $24.91 an hour
    From State of Wyoming - Fri, 17 Aug 2018 20:50:40 GMT - View all Sheridan, WY jobs
              Preservation Contractor - Silver State Asset Protection - Buffalo, WY      Cache   Translate Page      
    All of our contractors are given free accounts to our online database and our mobile app. We are looking for Independent Contractors to complete property...
    From Indeed - Tue, 28 Aug 2018 18:47:49 GMT - View all Buffalo, WY jobs
              Program Analyst (APHC) - Knowesis Inc. - Aberdeen, MD      Cache   Translate Page      
    Prepares and presents programmatic budget reports for investigators and management. Maintains the database for the tracking and distribution of all task area...
    From Knowesis Inc. - Thu, 17 May 2018 22:01:27 GMT - View all Aberdeen, MD jobs
              Hubspot tutoring      Cache   Translate Page      
    Hello, I need help building our first (free) hubspot database. We have contacts and companies in excel form and would like to move them to Hubspot correctly. We also have minor customization, such as changing the default lifecycle stages... (Budget: $15 - $25 USD, Jobs: App Developer, CRM, Database Administration)
              Oracle database administrator      Cache   Translate Page      
    I need some projects/contracts to work as an Oracle DBA. Please find my Linkedin profile. http://linkedin.com/in/gujjula-paramesh-b97167143 (Budget: ₹100 - ₹400 INR, Jobs: Database Administration, Oracle)
              Blog Installation and database development      Cache   Translate Page      
    Please do chat me up for more details about the job (Budget: $750 - $1500 USD, Jobs: Database Development, Database Programming, HTML, Website Design, WordPress)
              Customer Service Representative      Cache   Translate Page      
    AR-Russellville, Job Description Staffmark currently has an opening for a talented individual to fill the role of Customer Service Representative . As a Customer Service Representative, you will be responsible for resolving product or service problems, opening customer accounts, maintaining customer records, answering inquiries, fulfilling customer requests, maintaining call center database, taking and entering or
              How does "Server Generated" differ from NULL?      Cache   Translate Page      
    IDS 12.10.FC3
    Solaris 10 1/13

    A table in our production system has a SERIAL column, so I think it can't be
    NULL. But if I execute "SELECT * FROM <table> WHERE <serial_col> IS NULL",
    I get a couple thousand rows in which that column seems, effectively, to be
    NULL.

    If I use dbaccess, it displays as just white space.

    If I use Server Studio, it displays as greyed out "Server Generated" (not
    greyed out "NULL", as one usually sees in Server Studio). And, it appears to
    function as if it were NULL.

    How does "Server Generated" happen, and how is it different from NULL?

    BTW, it breaks dbimport. I can dbexport that database (including that table),
    but dbimport blows up when it encounters it during the import process.
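    For anyone triaging the same symptom, a minimal diagnostic sketch follows. The table and column names ("orders", "order_num", "orders_fixed") are hypothetical placeholders, not from the post; the idea is to compare rows the engine reports as NULL against rows holding an explicit zero, since Informix treats an inserted 0 in a SERIAL column as a request to generate the next serial value.

    ```sql
    -- Hypothetical table "orders" with SERIAL column "order_num".

    -- Rows the server reports as NULL (the symptom described above):
    SELECT COUNT(*) FROM orders WHERE order_num IS NULL;

    -- Rows holding an explicit zero, for comparison:
    SELECT COUNT(*) FROM orders WHERE order_num = 0;

    -- Inserting 0 into a SERIAL column makes the engine assign the next
    -- serial value, so copying the affected rows with 0 in that column
    -- (column list shortened here for illustration) is one way to
    -- repopulate it before retrying dbimport:
    INSERT INTO orders_fixed (order_num, customer_id)
      SELECT 0, customer_id FROM orders WHERE order_num IS NULL;
    ```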

    DG




    *******************************************************************************

    To post a response via email (IIUG members only):

    1. Address it to ids@iiug.org
    2. Include the bracketed message number in the subject line: [41310]

    *******************************************************************************

              RESOLICITATION - UH 72 MAINTENANCE      Cache   Translate Page      
    This is a combined synopsis/solicitation for commercial items prepared IAW the format
    in FAR Subpart 12.6, as supplemented with additional information included in this
    notice. This announcement constitutes the only solicitation. Proposals are being
    requested and a written solicitation WILL NOT be issued. The solicitation
    W912LA18T0088 is issued as a Request for Quote. The solicitation document,
    incorporated provisions and clauses are those in effect through Federal Acquisition
    Circular 2005-101, effective 20 July 2018. This procurement is issued under NAICS
    code 488190. The small business size standard for this NAICS code is $15 Million. The
    California National Guard is soliciting proposals for Aircraft Mechanic services in
    Stockton, CA 95206.

     


    The Contractor will provide all personnel, equipment, supplies, transportation, tools,
    materials, supervision, and other items and non-personal services necessary to perform
    aviation maintenance services in accordance with this solicitation and the attached
    Performance Work Statement. This Request for Proposal is set aside 100% for small
    business. In accordance with Federal Acquisition Regulation (FAR) Subpart 19.5, any
    award resulting from this solicitation, will be made on a competitive basis from among
    all responsible business concerns submitting offers. This Request for Quote will result
    in a single Firm-Fixed-Price contract. Offers will be evaluated on a Lowest Price
    Technically Acceptable basis. Simplified Acquisition Procedures will be utilized.


     


    The requirement is for the following: The contractor shall staff one Contact Team to
    provide scheduled maintenance support to the Army National Guard UH-72 fleet
    located at Stockton Metropolitan Airport. The team will consist of 1 each A&P
    Mechanic III, and 1 each A&P Mechanic II with a maximum of 900 man hours per
    position. Department of Labor Wage rates apply.


     


     


    The following FAR provisions/clauses are incorporated:


    52.203-19, Prohibition on Contracting with Entities that Require Certain Internal
    Confidentiality Agreements or Statements;


    52.204-9, Personal Identity Verification of Contractor Personnel;


    52.204-10, Reporting Executive Compensation and First-Tier Subcontract Awards;


    52.204-16, Commercial and Government Entity Code Reporting;


    52.204-22, Alternative Line Item Proposal;


    52.209-6, Protecting the Government's Interest When Subcontracting with Contractors
    Debarred, Suspended, or Proposed for Debarment;


    52.209-10, Prohibition on Contracting with Inverted Domestic Corporations;


    52.209-11, Representation by Corporations Regarding Delinquent Tax Liability or a
    Felony Conviction under any Federal Law;


    52.209-12, Certification Regarding Tax Matters;


    52.212-1, Instructions to Offerors-Commercial Items;


    52.212-3 ALT I, Offeror Representations and Certifications - Commercial Items;
    52.212-4, Contract Terms and Conditions;


    52.212-5, Contract Terms and Conditions Required to Implement Statutes or Executive
    Orders - Commercial Items



    52.216-4, Notice of Price Evaluation Preference for HUBZone Small Business
    Concerns;


    52.216-31, Time-and-Materials/Labor-Hour Proposal Requirements-Commercial Item
    Acquisitions;


    52.217-8, Option to Extend Services;


    52.219-28, Post-Award Small Business Program Representation;


    52.222-3, Convict Labor;


    52.222-21, Prohibition of Segregated Facilities;


    52.222-26, Equal Opportunity;


    52.222-36, Equal Opportunity for Workers with Disabilities;


    52.222-41, Service Contract Labor Standards;


    52.222-42, Statement of Equivalent Rates for Federal Hires;


    52.222-50, Combating Trafficking in Persons;


    52.222-55, Minimum Wages Under Executive Order 13658;


    52.222-62, Paid Sick Leave Under Executive Order 13706;


    52.223-1, Biobased Product Certification;


    52.223-4, Recovered Material Certification;


    52.223-5, Pollution Prevention and Right to Know Information;


    52.223-17, Affirmative Procurement of EPA-designated Items in Service and
    Construction Contracts;


    52.223-18, Encouraging Contractor Policy to Ban Text Messaging While Driving;


    52.223-20, Aerosols;


    52.223-21, Foams;


    52.225-13, Restrictions on Certain Foreign Purchases;


    52.232-33, Payment by Electronic Funds Transfer-System for Award Management;


    52.232-39, Unenforceability of Unauthorized Obligations


    52.233-3, Protest After Award;


    52.233-4, Applicable Law for Breach of Contract Claim; and 52.252-2; Clauses
    incorporated by Reference;


    52.237-1, Site Visit;


    52.245-1, Government Property;


    52.245-9, Use and Charges;


    52.252-2, Clauses Incorporated by Reference.


     


    The provision at DFARS 252.212-7000, Offeror Representations and
    Certifications - Commercial Items, applies to this acquisition and must be fully
    completed and submitted with the offer; the clause at DFARS 252.212-7001 (Dev),
    Contract Terms and Conditions Required to Implement Statutes or Executive Orders
    Applicable to Defense Acquisitions of Commercial Items (Deviation), also applies.
    The following DFARS clauses are incorporated:


    252.201-7000, Contracting Officer's Representative;


    252.203-7002 ALT A, Requirement to Inform Employees of Whistleblower Rights
    (ALT A);


    252.203-7005, Representation Relating to Compensation of Former DoD Officials;


    252.204-7008, Compliance with Safeguarding Covered Defense Information Controls;


    252.204-7011, Alternative Line Item Structure;



    252.204-7012, Safeguarding Covered Defense Information and Cyber Incident
    Reporting;


    252.204-7015, Notice of Authorized Disclosure of Information for Litigation Support;


    252.211-7007, Reporting of Government Furnished Property;


    252.213-7000, Notice to Prospective Suppliers on Use of Past Performance
    Information Retrieval System - Statistical Reporting in Past Performance Evaluations;


    252.223-7008, Prohibition of Hexavalent Chromium;


    252.225-7000, Buy American - Balance of Payments Program Certificate;


    252.225-7031, Secondary Arab Boycott of Israel;


    252.225-7035, Buy American Act-Free Trade Agreements-Balance of Payments
    Program Certificate;


    252.225-7048, Export-Controlled Items;


    252.232-7003, Electronic Submission of Payment Requests;


    252.232-7010, Levies on Contract Payments;


    252.237-7010, Prohibition on Interrogation of Detainees by Contractor Personnel;


    252.239-7001, Information Assurance Contractor Training and Certification;


    252.244-7000, Subcontracts for Commercial Items;


    252.245-7001, Tagging, Labeling, and Marking of Government-Furnished Property;


    252.245-7002, Reporting Loss of Government Property;


    252.245-7003, Contractor Property Management System Administration;


    252.245-7004, Reporting, Reutilization, and Disposal;


    252.247-7023 ALT III, Transportation of Supplies by Sea.


     


    Evaluation -- Commercial Items (Oct 2014)


    (a) The Government will award a contract resulting from this solicitation to the
    responsible offeror whose offer conforming to the solicitation will be most advantageous
    to the Government, price and other factors considered. The following factors shall be
    used to evaluate offers:



    [Contracting Officer shall insert the significant evaluation factors, such as


    (i) technical capability of the item offered to meet the Government
    requirement;


    (ii) price;


    (iii) past performance (see FAR 15.304);



    Technical and past performance, when combined, are __________ [Contracting Officer
    state, in accordance with FAR 15.304, the relative importance of all other evaluation
    factors, when combined, when compared to price.]


    (b) Options. The Government will evaluate offers for award purposes by adding the total
    price for all options to the total price for the basic requirement. The Government may
    determine that an offer is unacceptable if the option prices are significantly unbalanced.
    Evaluation of options shall not obligate the Government to exercise the option(s).


    (c) A written notice of award or acceptance of an offer, mailed or otherwise furnished to
    the successful offeror within the time for acceptance specified in the offer, shall result in
    a binding contract without further action by either party. Before the offer's specified
    expiration time, the Government may accept an offer (or part of an offer), whether or not
    there are negotiations after its receipt, unless a written notice of withdrawal is received
    before award.


    (End of Provision)
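The option-evaluation arithmetic described in paragraph (b) of the provision above is simple to state in code. The sketch below is purely illustrative; the prices are hypothetical, and nothing here is part of the solicitation:

```python
# Illustrative sketch of paragraph (b): the evaluated price is the total
# price for the basic requirement plus the total price of all options.
# All dollar figures below are hypothetical.

def evaluated_price(base: float, option_prices: list[float]) -> float:
    """Total price used for award evaluation: base plus every option."""
    return base + sum(option_prices)

base = 1_000_000.00                      # hypothetical base-period price
options = [1_020_000.00, 1_040_000.00]   # hypothetical option-year prices

total = evaluated_price(base, options)
print(f"Evaluated price: ${total:,.2f}")  # Evaluated price: $3,060,000.00
```

Evaluating the options this way does not obligate the Government to exercise them; it only determines the price used to compare offers.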


     


    All vendors must be registered in the System for Award Management (SAM) database.
    Proposals are due no later than 10:00 AM on September 14, 2018, by electronic means
    to stella.h.davis.civ@mail.mil.


     


    Set-aside code: Total Small Business. Contact: Stella H Davis, Contract Specialist, Phone 805-748-9164, Email stella.h.davis.civ@mail.mil
          Quality Assurance Inspector II - Sierra Nevada Corporation - Madison, WI
    Experience programming, setting up, and running the Coordinate Measuring Machine (CMM). Document information on the computer in various databases and other...
    From Sierra Nevada Corporation - Tue, 17 Jul 2018 23:06:25 GMT - View all Madison, WI jobs
          Senior Quality Assurance Technician - Sierra Nevada Corporation - Madison, WI
    Is one of our primary internal auditors. Document information on the computer in various databases and other internal programs....
    From Sierra Nevada Corporation - Wed, 13 Jun 2018 17:07:14 GMT - View all Madison, WI jobs
          Oracle's head of cloud left after butting heads with Larry Ellison, source says (ORCL)

    Thomas Kurian

    • A source tells Business Insider that 22-year Oracle veteran Thomas Kurian has butted heads with his boss, founder, CTO and executive chairman Larry Ellison.
    • Bloomberg reports that the issue concerns Oracle's cloud computing strategy.
    • Kurian may have been advocating for a cloud strategy that is as risky as it is wise.

    Last week, the Oracle ecosystem was surprised to learn that Thomas Kurian, one of the company's longest-serving executives and the man in charge of its all-important cloud business, had left for an extended leave of absence.

    Inside Oracle, word is that Kurian's departure came after butting heads with his boss Larry Ellison, a source tells Business Insider. This person says Kurian intended his good-bye as a resignation, although the company maintains he has not resigned and is simply "taking some time off. We expect him to return soon," a spokesperson said.

    The disagreement seems to have centered on the direction Oracle should take with its bet-the-company cloud computing business, reports Bloomberg.

    Sources told Bloomberg that Kurian was pushing Ellison to allow more of Oracle's software to run on clouds that compete with Oracle's, particularly those of market leaders Amazon and Microsoft.

    If true, this disagreement between the two strategies, and the two men, would not be surprising. Both of them are known for being tough, outspoken and opinionated — characteristics which describe a lot of Oracle's culture.

    A page ripped out of the Microsoft playbook

    If Kurian is pushing Oracle to embrace multiple clouds — even the clouds of its bitter enemies — the strategy would make a lot of sense.

    It's similar to what Microsoft CEO Satya Nadella has done. There was a time when Microsoft's Bill Gates and Steve Ballmer were protectionist about Windows. But, with the rise of cloud computing, Nadella recognized that the world had changed.

    It became far less important to push people to use Windows than to ensure that Microsoft's enormous catalog of software, particularly Office, could run on any device. So Microsoft built out its cloud to serve up Office 365 on any device; it made sure that its Windows Server software could run on other clouds; and it embraced competitive software, like Linux, on its own cloud.

    That way, Microsoft makes money when customers run its software on a competitive cloud (they still have to buy the software) or when they run a competitor's software on their own cloud (they have to pay for Microsoft cloud usage).

    Oracle is in a similar quandary but with one key difference: Amazon has become a major threat to Oracle.

    Amazon isn't just trying to get Oracle's customers to bring Oracle software to Amazon Web Services (which they can already do), it's trying to get customers to ditch Oracle's database and use Amazon's database instead. Amazon even built a tool to make it easier to move from an Oracle database to an Amazon one. Microsoft also has its own database and has been a bitter competitor with Oracle for years.

    So Ellison has been building an Oracle cloud that competes with Amazon's (and Microsoft's), insisting that Oracle's cloud is a faster, better way to run the database. If Oracle's customers don't stay within Oracle's own sphere, Oracle could lose them altogether.

    The clock is ticking

    The problem is, Oracle's cloud is years behind Amazon's in terms of features. It will take Oracle billions of dollars and several years to catch up, if it even can, because Amazon is adding features at an ever-increasing rate: hundreds or more per quarter. Microsoft is widely considered the No. 2 cloud.

    Enterprise customers are choosing their cloud providers now, based on the features they want and need now.

    Oracle may not have years to play catch-up. The person responsible for closing that gap is 22-year Oracle veteran Kurian and his team. Kurian is the president who heads engineering and product development; about a quarter of the company reports up to him.

    There have been signs, too, that Oracle's cloud business is not growing as fast as the company wants, putting Kurian on the hot seat. To be fair, Oracle has done a good job of getting many of its customers to sign up for certain parts of its cloud. They like the cloud versions of its HR, marketing and financial software (similar to how Microsoft moved people from MS Office to Office 365).

    Should Ellison allow more of that software to run on competitors' clouds? And should it partner with its rivals (assuming such partnerships were an option) to run their software on its own cloud?

    Probably yes. Other would-be Amazon competitors have either been crushed (Rackspace) or forced to eat crow and partner up (VMware). Once VMware got past the bitter taste, its partnership with Amazon proved fruitful, filling a need for enterprise customers who want their datacenters to work better with the Amazon cloud (and making Amazon more of a beast in the process).

    But there's no question it's risky, and Ellison certainly wouldn't be crazy for being wary.

    Oracle declined comment.



          Client Services Specialist - Colliers International - Seattle, WA
    You have experience working with a database such as CoStar or Loopnet. BE the expert....
    From Colliers International - Sat, 14 Jul 2018 04:32:16 GMT - View all Seattle, WA jobs
          Commercial Appraisal Reviewer - Union Bank & Trust - Richmond, VA
    Previous experience or working knowledge of databases like Costar, MLS, Loopnet.com preferred. This position is responsible for ordering and reviewing...
    From Union Bank & Trust - Mon, 13 Aug 2018 23:37:39 GMT - View all Richmond, VA jobs
          Senior Commercial Appraisal Reviewer - Union Bank & Trust - Richmond, VA
    Previous experience or working knowledge of databases like Costar, MLS, Loopnet.com preferred. This position is responsible for ordering and reviewing...
    From Union Bank & Trust - Mon, 13 Aug 2018 23:37:39 GMT - View all Richmond, VA jobs
          piecesphp/database
    Classes that interact with databases.
          MS SQL Server Database Admin Sr Advisor - PS3942
    VA-Norfolk, Description Your Talent. Our Vision. At Anthem, Inc., it's a powerful combination, and the foundation upon which we're creating greater access to care for our members, greater value for our customers, and greater health for our communities. Join us and together we will drive the future of health care. This is an exceptional opportunity to do innovative work that means more to you and those we serv
          Database Administrator - L3 - Promaxis Systems Inc. - Ottawa, ON
    Promaxis is located in Ottawa, Ontario, Canada. To be considered for similar jobs, fill out a general application on the Promaxis careers page....
    From Promaxis Systems Inc. - Wed, 05 Sep 2018 06:28:44 GMT - View all Ottawa, ON jobs
          Rush Limbaugh shares fake story that sharks are flying around in Hurricane Florence

    RUSH LIMBAUGH (HOST): Latest Hurricane Florence update: New reports from NOAA aircraft show sharks have been lifted into the hurricane. So those of you in the target path in North Carolina, South Carolina: In addition to the pig manure, in addition to the slop, in addition to the floods, in addition to the cars rolling around on the waters in front of your house, in addition to the mudslides and the landslides, now you might end up with a shark in your front yard. I’m telling you right -- you think I’m making this up? This appeared somewhere. “Florence now contains sharks.” I’m telling you. You want to tell me this story is not true, that this is the one thing I say -- let me just find the headline here, put it at the bottom of the stack. “Hurricane risks include toxic sludge and lagoons of pig manure.”

    This is, I’m telling you -- they’re getting ready to call this “Donald Trump’s Katrina.” How many times did they -- after George [W.] Bush had his Katrina, how many other Katrinas did Bush have for the remainder of his term? They’re setting this up. Now, there was a story about pig manure and slop during Katrina, and the state of Louisiana had to debunk it. There were not oozes and gobs of pig manure floating around in floodwaters in Louisiana after Hurricane Katrina. They had to debunk the myth. ... These are predictable things that the media has in their Nexis database. Hurricane hits, you go to the Nexis file, the Nexis database, and you look for stories that might look good to run and you run them. Just repeat them.

    “Florence now contains” -- so sharks are being lifted out of the Atlantic Ocean and dumped into the storm because it’s so strong it’s sucking them in there. And then they’re going to be in the waters. Of course the only water that might contain sharks would be storm surge. It isn’t going to be raining sharks. And that’s the predominant water source in a hurricane is rainfall.


          Application Developer C-Store
    Job description. Role: Manage and be responsible for the entire system, including troubleshooting and maintaining it at the optimum level to ensure smooth business operations; work closely with the software vendor/supplier on database management for the POS and BOS systems. Requirements: Bachelor's degree in IT/Computer Studies; 4 years of experience in retail
          Water Proof Coatings Market 2018 Global Analysis By Key Players – PPG Industries, Valspar, BASF, Flosil Chemicals, Dampney
    Water Proof Coatings Market 2018. Wiseguyreports.Com adds “Water Proof Coatings Market – Market Demand, Growth, Opportunities, Analysis of Top Key Players and Forecast to 2025” to its research database. Report Details: This report provides an in-depth study of the “Water Proof Coatings Market” using SWOT

          Data Entry Operator
    AZ-Phoenix, A government agency in downtown Phoenix is looking for hard-working and dedicated Data Entry Operators to join their growing team. This is a long-term temporary position, working hours Monday - Friday, 8:00 am - 5:00 pm. The pay is $10.50 an hour. Responsibilities: entering information into computer databases for effective record keeping; organizing files and collecting data to be entered
          Global Chemical Software Market Promising Business Growth Strategies and Scope 2018 to 2025: ANSYS, Frontline Data Solutions, RURO

    Brooklyn, NY -- (SBWIRE) -- 09/12/2018 -- Qyresearchreports include new market research report Global Chemical Software Market Size, Status and Forecast 2025 to its huge collection of research reports.

    This report studies the global Chemical Software market size, industry status and forecast, competition landscape and growth opportunity. This research report categorizes the global Chemical Software market by companies, region, type and end-use industry.

    Chemical software is used in the chemical industry for various purposes, such as chemical engineering, chemical mixing, building database, inventory management, International Standards of Organization (ISO) management, uncertainty analysis, practical tracking, visualization software, enterprise resource planning (ERP), and others.

    Download Free Sample Report With TOC: https://www.qyresearchreports.com/sample/sample.php?rep_id=1865324&type=S

    This report focuses on the global top players, covered
    ANSYS
    Frontline Data Solutions
    RURO
    SFS Chemical Safety

    Market segment by Regions/Countries, this report covers
    United States
    Europe
    China
    Japan
    Southeast Asia
    India

    Market segment by Type, the product can be split into
    Chemical process simulation
    ISO management
    Inventory management

    Market segment by Application, split into
    Large Company
    Medium Company
    Small Company

    Get more information from Table of Content: https://www.qyresearchreports.com/report/global-chemical-software-market-size-status-and-forecast-2025.htm/toc

    The study objectives of this report are:
    To study and forecast the market size of Chemical Software in the global market.
    To analyze the global key players, SWOT analysis, value and global market share for top players.
    To define, describe and forecast the market by type, end use and region.
    To analyze and compare the market status and forecast between China and other major regions, namely the United States, Europe, Japan, Southeast Asia, India and the Rest of World.
    To analyze the key regions' market potential and advantage, opportunity and challenge, restraints and risks.
    To identify significant trends and factors driving or inhibiting the market growth.
    To analyze the opportunities in the market for stakeholders by identifying the high-growth segments.
    To strategically analyze each submarket with respect to individual growth trends and their contribution to the market.
    To analyze competitive developments such as expansions, agreements, new product launches, and acquisitions in the market.
    To strategically profile the key players and comprehensively analyze their growth strategies.

    In this study, the years considered to estimate the market size of Chemical Software are as follows:
    History Year: 2013-2017
    Base Year: 2017
    Estimated Year: 2018
    Forecast Year 2018 to 2025

    For data by region, company, type and application, 2017 is considered the base year. Wherever data was unavailable for the base year, the prior year has been considered.
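The base-year fallback rule described above (use 2017 where available, otherwise the most recent prior year with data) can be sketched as follows; the helper name and the data values are illustrative, not taken from the report:

```python
# A minimal sketch of the base-year fallback rule: use the 2017 figure
# where available, otherwise fall back to the most recent prior year.
# The revenue figures below are invented for illustration.

def base_year_value(series: dict[int, float], base_year: int = 2017) -> float:
    """Return the base-year figure, falling back to earlier years with data."""
    for year in range(base_year, min(series) - 1, -1):
        if year in series:
            return series[year]
    raise KeyError("no data at or before the base year")

revenue = {2015: 88.0, 2016: 94.5}   # hypothetical: 2017 figure missing
print(base_year_value(revenue))      # 94.5 (prior year used)
```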

    To Browse a Complete Report Visit @ https://www.qyresearchreports.com/report/global-chemical-software-market-size-status-and-forecast-2025.htm

    Key Stakeholders
    Chemical Software Manufacturers
    Chemical Software Distributors/Traders/Wholesalers
    Chemical Software Subcomponent Manufacturers
    Industry Association
    Downstream Vendors

    About QYResearchReports.com
    QYResearchReports.com delivers the latest strategic market intelligence to build a successful business footprint in China. Our syndicated and customized research reports provide companies with vital background information of the market and in-depth analysis on the Chinese trade and investment framework, which directly affects their business operations. Reports from QYResearchReports.com feature valuable recommendations on how to navigate in the extremely unpredictable yet highly attractive Chinese market.

    Contact Us:

    Brooklyn, NY 11230
    United States
    Toll Free: 866-997-4948 (USA-CANADA)
    Tel: +1-518-621-2074
    Follow Us on LinkedIn: https://www.linkedin.com/company/qyresearchreports-com
    Web: https://www.qyresearchreports.com
    Email: sales@qyresearchreports.com
    Blog: https://reportanalysis.blogspot.in

    For more information on this press release visit: http://www.sbwire.com/press-releases/global-chemical-software-market-promising-business-growth-strategies-and-scope-2018-to-2025-ansys-frontline-data-solutions-ruro-1047216.htm

    Media Relations Contact

    Ivan Gary
    Manager
    qyresearchreports
    Telephone: 866-997-4948
    Email: Click to Email Ivan Gary
    Web: https://www.qyresearchreports.com/report/global-chemical-software-market-size-status-and-forecast-2025.htm



          Catalog Management Software Market Research 2018: Industry Growth Scenario, Application, Technology and Global Forecast 2025
    (EMAILWIRE.COM, September 13, 2018) ResearchForMarkets recently added “Catalog Management Software Market by Manufacturers, Regions, Type and Application, Forecast to 2025” to its database. This research report focuses on a complete assessment of the market and covers future trends, growth factors, attentive...
          Construction Administrative Assistant
    WA-Auburn, Job Description: My established client in Auburn, WA is seeking an Admin Assistant/Coordinator with a construction background for a direct-hire opportunity. This person will be responsible to: maintain the company's estimating database; coordinate dissemination of all required information to estimators; assist the sales staff in the preparation of estimates/proposals; assist the Sales Manager in setting up co
          States Have Paid $2.2 Billion to Exonerees
    The National Registry of Exonerations (NRE) will soon publish the second part of a study of all known false convictions in the U.S. since 1989. The 2,265 exonerees in the database served a combined 20,080 years behind bars.
          Syndicate Coordinator
    Requisition ID: 33602. Join the Global Community of Scotiabankers to help customers become better off. Day-to-day responsibilities: New issue administrator responsible for deal documentation, deal entry, and deal co-ordination. Updating Microsoft Excel and Access-based models, and emailing marketing material internally/externally. Assist in the dissemination of the desk’s marketing initiatives (internal and external). Setting up daily conference calls. Act as ‘investor - issuer relations’ coordinator for roadshows and presentations, from concept stage to execution. Producing PowerPoint presentations, and taking initiative in designing slides and graphics to best represent data. Design queries to extract info and derive conclusions from raw data. Maintain databases and scheduled tasks. Perform daily account reconciliations and other administrative tasks tied to accounting and settlement. Prepare
          Database Administrator - ITS - Careers | West Virginia University - Morgantown, WV
    Experience installing, configuring, and managing application, middleware, and web servers. Research new patches, upgrades, and configuration changes for...
    From West Virginia University - Wed, 29 Aug 2018 22:05:32 GMT - View all Morgantown, WV jobs
          Systems Engineer I - Synergy BIS - Kearneysville, WV
    Must possess experience of system engineering in one or more areas including telecommunications concepts, computer languages, operating systems, database/DBMS...
    From Synergy BIS - Tue, 03 Jul 2018 02:27:25 GMT - View all Kearneysville, WV jobs
          CAD Principal Designer - SSI (611376)
    IN-Warsaw, Job Summary Under the direction of a project leader, responsible for creating and revising detail drawings using 3-Dimensional modeling tools. Responsible for the integrity of the computer-graphic product and process model databases. These models must be suitable for internal concurrent manufacture, CMM data extraction and vendor translation. Participates in developing and implementing departmenta
          Software Engineer (ASP.NET / C# / SQL Server) - Aries Systems Corporation - North Andover, MA
    Proficiency with SQL Server and relational database design, ASP.NET (as well as Classic ASP), C#, Web Forms, JavaScript, CSS, XML, and general OOP concepts....
    From Dice - Thu, 16 Aug 2018 05:21:20 GMT - View all North Andover, MA jobs
          Duplicator Pro v3.7.6 - WordPress Site Migration & BackUp
    Create a backup of your WordPress files and database. Duplicate and move an entire site from one location to another in a few steps. Create a full snapshot of your site at any point in time.

    Demo: https://snapcreek.com/
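As a rough illustration of what a files-plus-database backup of this kind involves, the sketch below archives a site directory and builds a mysqldump command line. This is not Duplicator's own code; the directory, database name and user below are placeholders.

```python
# A minimal sketch (not Duplicator's implementation) of what a WordPress
# backup amounts to: archive the site files, dump the database. Paths,
# database name and credentials are placeholders.

import tarfile
from pathlib import Path

def backup_files(site_dir: str, archive_path: str) -> str:
    """Pack the WordPress directory into a gzipped tarball."""
    with tarfile.open(archive_path, "w:gz") as tar:
        tar.add(site_dir, arcname=Path(site_dir).name)
    return archive_path

def dump_db_command(db: str, user: str, out_file: str) -> list[str]:
    """Build a mysqldump invocation for the site database."""
    return ["mysqldump", "--single-transaction", "-u", user, db,
            "--result-file", out_file]

print(dump_db_command("wp_db", "wp_user", "db.sql"))
```

Restoring a snapshot is the reverse: unpack the tarball and feed the SQL dump back into a fresh database.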
          Romania-Arad: Cleaning and sanitation services
    Spital Clinic Județean de Urgență Arad, Romania, Str. Spitalului nr. 1, Arad 310031. Contact: Slagean Petcov, +40 257211233, scjuarad.bap@gmail.com, www.scjarad.ro, www.e-licitatie.ro. Hospital. Cleaning and disinfection services, 3519879_2018_PAAPD1029479.
    The contract covers cleaning and disinfection in all premises belonging to the Arad Emergency Clinical County Hospital, under the conditions and minimum requirements set out in the Tender Specifications with Annex 1, and with the contractor's strict observance and application of the hospital's operational procedures (code PO-005 SPIAAM, Protocol code 751-38-15 SPIAAM, Operational Procedure code PO-001 SPIAAM and Operational Procedure code PO-002 SPIAAM), and of Order 1225/2003 and Order 1226/2012 on the cleaning and disinfection of surfaces, as approved at hospital level. The surfaces to be cleaned and disinfected amount to 87,166.53 m²/month, as detailed in Annex 1 of the Tender Specifications; the procedure is intended to award a framework agreement with a duration of 24 months. The number of days by which clarifications may be requested before the deadline for submission of tenders/candidatures... details at www.e-licitatie.ro. Estimated value: RON 10,459,983.60. Place of performance: the Arad Emergency Clinical County Hospital with all its sites. The purchase of the above services answers the needs identified within the hospital for cleaning and disinfection services, as stated in Necessity Report no. 18616/28.6.2018 and in Tender Specifications no. 18616/28.6.2018 with Annexes 1 and 2. Award criteria: technical component 20, price 80. Duration: 24 months. Main financing and payment arrangements and/or reference to the relevant provisions: insurance.
    1) Requirement: tenderers shall prove that they do not fall under the situations provided in art. 59, 164, 165 and 167 of Law no. 98/2016 on public procurement. Means of proof: the European Single Procurement Document (DUAE) shall be completed with respect to the exclusion grounds provided by national law. Note 1: this is required of associated tenderers as well as of any supporting third party or subcontractor. Note 2: if the third party/parties or subcontractor(s) fall under one of the exclusion grounds of art. 59, 164, 165 and 167, the contracting authority shall request, once only, that the economic operator replace the supporting third party/parties or subcontractor(s). In view of art. 196 of Law 98/2016, the operators ranked in the first two places in the interim ranking drawn up on completion of the evaluation shall, at the contracting authority's request, complete and attach the following: (a) the criminal record certificate of the economic operator and of the members of its administrative, management or supervisory body and/or of those holding powers of representation, decision or control within it, as shown in the certificate issued by the Trade Register Office (ORC)/the articles of incorporation, or, as the case may be, documents demonstrating that the economic operator may benefit from the derogations provided in art. 166 para. (2), art. 167 para. (2) and art. 171 of Law no. 98/2016 on public procurement; in the case of an association, the certificate shall be submitted by each associate; (b) documents issued by the competent legal authorities attesting that the tenderer has fulfilled its obligations to pay taxes and duties to the consolidated general budget, as well as local taxes, in accordance with the legal provisions in force in Romania or in the country in which it is established: in original, certified copy or legible copy marked "true copy of the original" (scanned) for Romanian legal persons, and in original or certified true copy translated into Romanian (scanned) for foreign legal persons; in the case of an association, the certificates shall be submitted by each associate. Note: the tax attestation certificate/certificate on the payment of local taxes must show that the tenderer has no outstanding debts at the time of submission, further to the request sent by the contracting authority in accordance with art. 196 of Law no. 98/2016. Proof of payment of taxes is given by submitting the standard forms issued by the competent bodies. Tenders submitted by economic operators with unpaid debts to the state or local budget who have not obtained payment facilities/rescheduling for those debts (taking into account their due dates) will be disqualified/rejected.
    2) Requirement: tenderers shall prove that they do not fall under the situations provided in art. 60 of Law no. 98/2016 on public procurement. In order to prove the absence of a conflict of interest, the list of persons holding decision-making positions within the contracting authority with respect to the organisation, conduct and completion of the award procedure is communicated as follows: 1) Diana Lavinia Lina — President of the evaluation committee, without voting rights; 2) Emilia Otilia Popa — member of the evaluation committee; 3) Lenuta Timis — member of the evaluation committee; 4) Iuliana Bologan — member of the evaluation committee; 5) Eugenia Manea — alternate member of the evaluation committee; 6) Monica Adriana Mate — member of the evaluation committee; 7) Magdalena Gaja Lacrimioara — member of the evaluation committee; 8) Maria Ciorgan — member of the evaluation committee; 9) Oana Popa — alternate member of the evaluation committee; 10) ... details at www.e-licitatie.ro.
    Economic and financial standing: 1) tenderers shall prove that, over the last 3 years, they have recorded an average global turnover at least double the estimated value of the largest subsequent contract, namely RON 10,459,983.60 excluding VAT; 2) where tenderers invoke financial support, the supporting third party must not fall under the provisions of art. 59, 60, 164, 165 and art. 167 of Law no. 98/2016. Note: if the third party/parties do not meet the relevant capacity criteria or an exclusion ground is found against them, the contracting authority shall request, once only, that the economic operator replace the supporting third party/parties. The DUAE shall be completed and submitted with respect to the average global turnover over the last 3 years. In view of art. 196 of Law 98/2016, the contracting authority will ask the two operators ranked first in the interim ranking, on completion of the evaluation, to submit documents certifying the average turnover declared in the DUAE for the last three years. These documents may be: balance sheets or balance-sheet extracts registered with the competent bodies, audit reports (if applicable), or, where for objective reasons some of these are not available, other relevant documents showing the turnover for each year. Form no. 9 shall be attached together with the DUAE, in original (scanned), together with the documents annexed to the support commitment, showing how the support will actually materialise. The supporting documents (certificates/documents proving what was undertaken in the commitments) will be requested from the tenderers ranked in the first two places in the interim ranking, after application of the award criterion.
    Technical and professional capacity: 1) tenderers shall submit a list of the main services provided, showing values, periods of performance and beneficiaries/contracts, demonstrating that in the last 3 years they have provided similar services (hospital units); 2) where tenderers invoke technical-professional support, the supporting third party must not fall under the provisions of art. 60, 164, 165 and art. 167 of Law no. 98/2016; 3) where several economic operators participate jointly in the award procedure, fulfilment of the technical and professional capacity criteria is demonstrated by taking into account the resources of all group members, who shall be jointly and severally liable for the performance of the public procurement contract. Associates must submit the power of attorney authorising the leader of the association to represent it in the award procedure, showing who the leader of the association (consortium) is and the part of the contract each associate will perform (value and percentage of the contract value), including how technical and human resources will be used, as well as the obligation to maintain the association for the entire duration of the contract. If the association agreements are concluded in a language other than Romanian, a copy of the agreements and their Romanian translation shall be submitted; 4) the contracting authority asks the tenderer to specify in the tender the part(s) of the contract it intends to subcontract and the identification data of the proposed subcontractors. The contracting authority will verify that the proposed subcontractors are not in an exclusion situation under art. 164, art. 165 and art. 167 of Law no. 98/2016. Note 1: if an exclusion situation is identified, applying art. 174 of Law no. 98/2016 accordingly, the contracting authority will request the tenderer, once only, to replace any subcontractor found, upon verification, to be in that situation. Note 2: if the economic operator intends to subcontract part(s) of the contract, the DUAE shall also include the requested information on subcontractors; 5) tenderers shall provide information on the specialist technical staff/body at their disposal, or whose commitment to participate has been obtained, including supervisors, in accordance with the requirements of the Tender Specifications, chapter VI; 6) tenderers shall submit a list of the technical facilities specific to the performance of the services, plant and equipment, showing the tenderer's capacity to perform the contract.
Nota 1: Daca tertul/tertii nu indeplineste/indeplinesc criteriile relevante privind capacitatea, autoritatea contractanta solicita, o singura data, ca operatorul economic sa inlocuiasca tertul/tertii sustinator/sustinatori. 5) La momentul depunerii ofertei, operatorii economici participanti vor trebui sa declare/confirme prin DUAE indeplinirea cerintei de calificare. Avand in vedere prevederile art. 196 din Legea 98/2016, pentru a asigura desfasurarea corespunzatoare a procedurii, autoritatea contractanta va solicita operatorilor economici clasati pe primele 2 locuri in clasamentul intermediar, dupa aplicarea criteriului de atribuire, sa prezinte CV-uri, copii dupa diplomele de studii absolvite, cursurilor de formare profesionala/perfectionare urmate, precum si o fisa a postului in care sunt mentionate atributiile fiecaruia — in copie lizibila cu mentiunea conform cu originalul (scanate). Se va tine seama de cerintele privitoare la numarul de personal pe fiecare locatie in parte; 6) La momentul depunerii ofertei, operatorii economici participanti vor trebui sa declare/confirme prin DUAE indeplinirea cerintei de calificare. Documentele justificative care probeaza indeplinirea cerintei de calificare vor fi solicitate ofertantilor clasati pe primele 2 locuri in clasamentul intermediar, ca urmare a finalizarii evaluarii ofertelor. Se va avea in vedere, pentru indeplinirea cerintei, lista cu dotari minime de echipamente si consumabile din Caietul de sarcini cap. IV; 3) Se va atasa, odata cu DUAE, Formularul nr. 5 si Formularul nr. 6 — in original (scanat). Documentele justificative care probeaza cele asumate in angajamente/acorduri vor fi solicitate ofertantilor clasati pe primele 2 locuri in clasamentul intermediar intocmit la finalizarea evaluarii ofertelor; 1) Se va completa Documentul unic de achizitie European (DUAE), cu privire la experienta similara. Avand in vedere prevederile art. 
196 din Legea 98/2016, pentru a asigura desfasurarea corespunzatoare a procedurii, autoritatea contractanta va solicita operatorilor economici clasati pe primele 2 locuri in clasamentul intermediar, dupa aplicarea criteriului de atribuire, sa prezinte documente/certificate/contracte care sa releve faptul, ca in ultimii 3 ani, au mai prestat servicii similare (servicii de curatenie si dezinfectie la nivelul unitatilor spiatalicesti); 4) Se va atasa, odata cu DUAE, Formularul nr. 4 — in original (scanat). Documentele justificative care probeaza cele asumate in angajamente/acorduri vor fi solicitate dofertantilor clasati pe primele 2 locuri in clasamentul intermediar, ca urmare a finalizarii evaluarii ofertelor; 2) Se va atasa, odata cu DUAE, Formularul nr. 8 — în original (scanat), impreuna cu documentele anexe la angajament, transmise acestora de catre tertul sustinator, din care rezulta modul efectiv in care se va materializa sustinerea acestuia. Documentele justificative (certificate/documente care probeaza cele asumate in angajamente) vor fi solicitate ofertantilor clasati pe primele 2 locuri in clasamentul intermediar intocmit la finalizarea evaluarii ofertelor. 2 2018-10-12 15:00 2019-02-12 2018-10-12 15:00 In SEAP. Comisia de evaluare si expertii externi cooptati (daca este cazul). 1) Daca se depun doua sau mai multe oferte cu pret egal si totodata cel mai mic pret din totalul ofertelor depuse, departajarea ofertelor (cu pret egal) se va face prin depunerea de catre ofertantii aflati in aceasta situatie a unor noi propuneri financiare; 2) Documentul Unic de Achizitii European se va putea accesa, in vederea completarii de catre operatorii economici interesati, la adresa: https://ec.europa.eu/growth/tools databases/espd/filter 3) Pentru vizualizarea documentatiei de atribuire incarcate in SEAP, operatorii economici trebuie sa aiba un program necesar vizualizarii fisierelor semnate electronic (site-urile furnizorilor de semnatura electronica) conform art. 
60 alin. (4) din H.G. 395/2016; 4) Reguli de comunicare si transmitere a datelor: Solicitarile de clarificari referitoare la prezenta documentatie de atribuire se vor adresa in mod exclusiv In SEAP, la Sectiunea Intrebari din cadrul procedurii de atribuire derulate prin mijloace electronice, iar raspunsurile la acestea vor fi publicate in SEAP, atat la Sectiunea Intrebari, cat si la Sectiunea Documentatie, clarificari si decizii din cadrul anuntului de participare, autoritatea contractanta urmand sa nu dea curs solicitarilor adresate prin alta modalitate de comunicare decat cea stabilita in conformitate cu prevederile art. 64 alin. (1) din Legea nr. 98/2016 privind achizitiile publice. — Pentru transmiterea solicitarilor de clarificari privind documentatia de atribuire, operatorii economici se vor inregistra in SEAP (www.elicitatie.ro) ca operator economic si ca participant la procedura de atribuire, — Pentru comunicarile ulterioare depunerii ofertelor: Comisia de evaluare va transmite solicitarile de clarificare in legatura cu oferta prin utilizarea facilitatilor tehnice disponibile in SEAP (Sectiunea „Intrebari”), — Operatorii economici vor transmite raspunsurile la clarificari si eventualele documente solicitate pe parcursul evaluarii ofertelor prin intermediul SEAP. Consiliul Național de Soluționare a Contestațiilor Str. Stavropoleos nr. 6, sector 3 București 030084 +40 213104641 office@cnsc.ro +40 213104642 / +40 218900745 http://www.cnsc.ro Termenele de exercitare a cailor de atac sunt cele prevazute la art. 8 din Legea nr. 101/2016. Spitalul Clinic Județean de Urgență Arad Str. Andreny Karoly nr. 2–4 Arad 310037 +40 257211233 scjuarad.bap@gmail.com +40 257211233 www.scjarad.ro 2018-09-07
              New comment on Item for Geeklist "Unwanted Rejects - Thrift/Bargain Finds Left Behind 2018"       Cache   Translate Page      

    by hexahedron

    Related Item: Stack'em

    Kaffedrake wrote:

    I guess they said you can't add a version without an image? I do care about fleshing out the database, but there's not much I can do. :/


    First I was making versions for the ones that already had images uploaded without matching versions to associate them with. Rejected because visibly-distinct "previously uploaded images" didn't convince that guy that they actually existed, I don't know, seemed to not make much sense/consistency to me, but I don't care to argue about it and sure don't want any more trouble from it. I have health issues and a constant stream of other stressful things going on, so I don't deal with stuff like this very well, and certainly feel less enthusiastic about pursuing submissions in the future. I'm trying to stick to just one submission at a time from now on (and just for things that matter to me more than this lame game did), so I don't risk ending up with as much wasted effort and a pile of rejections in my inbox to dread looking at. :blush:

    Don't let my whining discourage anyone else's pursuit of new data; I applaud your efforts. :)
              Ecommerce Channel Associate | High-Tech Equipment - TechEQ, LLC - Phoenix, AZ      Cache   Translate Page      
    Listing new and used equipment to our database and ecommerce sites. We buy, sell and trade a wide range of high-tech equipment, including test equipment,... $13 - $16 an hour
    From Indeed - Tue, 07 Aug 2018 00:28:00 GMT - View all Phoenix, AZ jobs
              New comment on Item for Geeklist "Unwanted Rejects - Thrift/Bargain Finds Left Behind 2018"       Cache   Translate Page      

    by Kaffedrake

    Related Item: Stack'em

    I guess they said you can't add a version without an image? I do care about fleshing out the database, but there's not much I can do. :/
              Offer - http://allsupplement4u.com/raspberry-ketone-max-uk/ - UK      Cache   Translate Page      
    Raspberry Ketone Max When you think about it, often times we find particular triggers that generate these powerful yearnings for certain kinds of food. If you do not have this information, there are online databases that can give you an estimate. Cheryl Forberg, who served as the nutritionist for the Biggest Loser for 12 seasons, is a James Beard Award-winning chef, a registered dietitian and has worked on 13 cookbooks that focus on creating healthy and tasty meals. The joint will then be able to painlessly bear weight, but will have no flexibility.About weight:- http://allsupplement4u.com/raspberry-ketone-max-uk/
              QA Inspector - ICS/QAD/QAI/ES - ST Electronics (Info-comm Systems) Pte Ltd - Ang Mo Kio      Cache   Translate Page      
    Posting of inspection results to SAP & IQC database. Perform IQC inspection & testing of incoming parts....
    From Singapore Technologies Electronics - Fri, 06 Jul 2018 06:27:56 GMT - View all Ang Mo Kio jobs
              Microsoft SQL Server Database Administrator (DBA) - Base Camp Data Soutions - Hyderabad      Cache   Translate Page      
    Large data storage solution management. Knowledge of indexes, index management, and statistics. Experience with data management and data processing flowcharting...
    From Indeed - Mon, 27 Aug 2018 10:55:31 GMT - View all Hyderabad jobs
              Introduction to python web scraping and the Beautiful Soup library      Cache   Translate Page      
    https://linuxconfig.org/introduction-to-python-web-scraping-and-the-beautiful-soup-library

    Objective

    Learning how to extract information out of an html page using python and the Beautiful Soup library.

    Requirements

    • Understanding of the basics of python and object oriented programming

    Difficulty

    EASY

    Conventions

    • # - requires given linux command to be executed with root privileges either directly as a root user or by use of sudo command
    • $ - given linux command to be executed as a regular non-privileged user

    Introduction

    Web scraping is a technique that consists of extracting data from a website using dedicated software. In this tutorial we will see how to perform basic web scraping using Python and the Beautiful Soup library. We will use Python 3, targeting the homepage of Rotten Tomatoes, the famous aggregator of reviews and news for films and TV shows, as the source of information for our exercise.

    Installation of the Beautiful Soup library

    To perform our scraping we will make use of the Beautiful Soup Python library, therefore the first thing we need to do is install it. The library is available in the repositories of all the major GNU/Linux distributions, so we can install it using our favorite package manager, or by using pip, the Python-native way of installing packages.

    If the use of the distribution package manager is preferred and we are using Fedora:
    $ sudo dnf install python3-beautifulsoup4
    On Debian and its derivatives the package is called python3-bs4:
    $ sudo apt-get install python3-bs4
    On Arch Linux we can install it via pacman:
    $ sudo pacman -S python-beautifulsoup4
    If we want to use pip, instead, we can just run:
    $ pip3 install --user BeautifulSoup4
    By running the command above with the --user flag, we will install the latest version of the Beautiful Soup library only for our user, so no root permissions are needed. Of course you can decide to use pip to install the package globally, but personally I tend to prefer per-user installations when not using the distribution package manager.

    The BeautifulSoup object

    Let's begin: the first thing we want to do is create a BeautifulSoup object. The BeautifulSoup constructor accepts either a string or a file handle as its first argument. The latter is what interests us: we have the url of the page we want to scrape, therefore we will use the urlopen function of the urllib.request module (part of the standard library), which returns a file-like object:

    from bs4 import BeautifulSoup
    from urllib.request import urlopen

    with urlopen('http://www.rottentomatoes.com') as homepage:
        soup = BeautifulSoup(homepage.read(), 'html.parser')
    At this point, our soup is ready: the soup object represents the document in its entirety. We can begin navigating it and extracting the data we want using the built-in methods and properties. For example, say we want to extract all the links contained in the page: we know that links are represented by the a tag in HTML and the actual link is contained in the href attribute of the tag, so we can use the find_all method of the object we just built to accomplish our task:

    for link in soup.find_all('a'):
        print(link.get('href'))
    By using the find_all method and specifying a as the first argument, which is the name of the tag, we searched for all the links in the page. For each link we then retrieved and printed the value of the href attribute. In BeautifulSoup the attributes of an element are stored in a dictionary, therefore retrieving them is very easy. In this case we used the get method, but we could have accessed the value of the href attribute even with the following syntax: link['href']. The complete attributes dictionary itself is contained in the attrs property of the element. The code above will produce the following result:
    [...]
    https://editorial.rottentomatoes.com/
    https://editorial.rottentomatoes.com/24-frames/
    https://editorial.rottentomatoes.com/binge-guide/
    https://editorial.rottentomatoes.com/box-office-guru/
    https://editorial.rottentomatoes.com/critics-consensus/
    https://editorial.rottentomatoes.com/five-favorite-films/
    https://editorial.rottentomatoes.com/now-streaming/
    https://editorial.rottentomatoes.com/parental-guidance/
    https://editorial.rottentomatoes.com/red-carpet-roundup/
    https://editorial.rottentomatoes.com/rt-on-dvd/
    https://editorial.rottentomatoes.com/the-simpsons-decade/
    https://editorial.rottentomatoes.com/sub-cult/
    https://editorial.rottentomatoes.com/tech-talk/
    https://editorial.rottentomatoes.com/total-recall/
    [...]
    The list is much longer: the above is just an extract of the output, but it gives you an idea. The find_all method returns all the Tag objects that match the specified filter. In our case we just specified the name of the tag to be matched, and no other criteria, so all the links are returned: we will see in a moment how to further restrict our search.
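    As a quick, self-contained preview of such a restriction (using a made-up inline HTML fragment rather than the live Rotten Tomatoes page), an attribute filter can be combined with the tag name:

    ```python
    from bs4 import BeautifulSoup

    # A hypothetical HTML fragment standing in for a real page
    html = """
    <ul>
      <li><a href="/movies" class="nav">Movies</a></li>
      <li><a href="/tv" class="nav">TV</a></li>
      <li><a href="/about">About</a></li>
    </ul>
    """

    soup = BeautifulSoup(html, 'html.parser')

    # Only anchors whose class attribute matches 'nav' are returned
    nav_links = soup.find_all('a', {'class': 'nav'})
    print([link.get('href') for link in nav_links])  # ['/movies', '/tv']
    ```

    The third link has no class attribute, so it is filtered out of the result.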

    A test case: retrieving all "Top box office" titles

    Let's perform a more restricted scraping. Say we want to retrieve all the titles of the movies that appear in the "Top Box Office" section of the Rotten Tomatoes homepage. The first thing we want to do is analyze the page HTML for that section: doing so, we can observe that the elements we need are all contained inside a table element with the "Top-Box-Office" id:

    [HTML snippet of the "Top-Box-Office" table element, lost in the page extraction]
    We can also observe that each row of the table holds information about a movie: the title's scores are contained as text inside a span element with class "tMeterScore" inside the first cell of the row, while the string representing the title of the movie is contained in the second cell, as the text of the a tag. Finally, the last cell contains a link with the text that represents the box office results of the film. With those references, we can easily retrieve all the data we want:

    from bs4 import BeautifulSoup
    from urllib.request import urlopen

    with urlopen('https://www.rottentomatoes.com') as homepage:
        soup = BeautifulSoup(homepage.read(), 'html.parser')

        # first we use the find method to retrieve the table with the 'Top-Box-Office' id
        top_box_office_table = soup.find('table', {'id': 'Top-Box-Office'})

        # then we iterate over each row and extract the movie information
        for row in top_box_office_table.find_all('tr'):
            cells = row.find_all('td')
            title = cells[1].find('a').get_text()
            money = cells[2].find('a').get_text()
            score = row.find('span', {'class': 'tMeterScore'}).get_text()
            print('{0} -- {1} (TomatoMeter: {2})'.format(title, money, score))
    The code above will produce the following result:
    Crazy Rich Asians -- .9M (TomatoMeter: 93%)
    The Meg -- .9M (TomatoMeter: 46%)
    The Happytime Murders -- .6M (TomatoMeter: 22%)
    Mission: Impossible - Fallout -- .2M (TomatoMeter: 97%)
    Mile 22 -- .5M (TomatoMeter: 20%)
    Christopher Robin -- .4M (TomatoMeter: 70%)
    Alpha -- .1M (TomatoMeter: 83%)
    BlacKkKlansman -- .2M (TomatoMeter: 95%)
    Slender Man -- .9M (TomatoMeter: 7%)
    A.X.L. -- .8M (TomatoMeter: 29%)
    We introduced a few new elements; let's look at them. The first thing we did was retrieve the table with the 'Top-Box-Office' id, using the find method. This method works similarly to find_all, but while the latter returns a list containing the matches found (or an empty list if there is no match), the former always returns the first result, or None if an element with the specified criteria is not found.

    The first argument provided to the find method is the name of the tag to be considered in the search, in this case table. As the second argument we passed a dictionary in which each key represents an attribute of the tag together with its corresponding value. The key-value pairs provided in the dictionary represent the criteria that must be satisfied for our search to produce a match. In this case we searched for an id attribute with the value "Top-Box-Office". Notice that since each id must be unique in an HTML page, we could have omitted the tag name and used this alternative syntax:

    top_box_office_table = soup.find(id='Top-Box-Office')
    Once we retrieved our table Tag object, we used the find_all method to find all the rows and iterated over them. To retrieve the other elements, we used the same principles. We also used a new method, get_text: it returns just the text part contained in a tag or, if none is specified, in the entire page. For example, knowing that the movie score percentages are represented by the text contained in the span element with the tMeterScore class, we used the get_text method on that element to retrieve them.

    In this example we just displayed the retrieved data with a very simple formatting, but in a real-world scenario, we might have wanted to perform further manipulations, or store it in a database.
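    As a hedged sketch of that last point (the table name, column names, and sample rows below are illustrative assumptions, not part of the tutorial), storing the retrieved tuples with Python's built-in sqlite3 module could look like this:

    ```python
    import sqlite3

    # Hypothetical rows shaped like the (title, money, score) tuples scraped above;
    # the titles and figures are made up for illustration
    movies = [
        ('Example Movie A', '$10.0M', '90%'),
        ('Example Movie B', '$5.5M', '45%'),
    ]

    conn = sqlite3.connect(':memory:')  # use a file path instead to persist the data
    conn.execute('CREATE TABLE box_office (title TEXT, money TEXT, score TEXT)')
    # Parameterized queries keep quoting and escaping safe
    conn.executemany('INSERT INTO box_office VALUES (?, ?, ?)', movies)
    conn.commit()

    results = list(conn.execute('SELECT title, score FROM box_office'))
    print(results)  # [('Example Movie A', '90%'), ('Example Movie B', '45%')]
    conn.close()
    ```

    From here the same SELECT queries could feed further manipulation or reporting.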

    Conclusions

    In this tutorial we just scratched the surface of what we can do with Python and the Beautiful Soup library to perform web scraping. The library contains a lot of methods you can use for a more refined search or to better navigate the page: for these, I strongly recommend consulting the very well written official docs.

              Donations & Database Coordinator - Single Giving - The Fred Hollows Foundation - Tocantins      Cache   Translate Page      
    Alexandria location, Green Square station. Work/life balance. Data entry and customer service. A rare and exciting opportunity exists to join one of Australia’s...
    From The Fred Hollows Foundation - Fri, 07 Sep 2018 12:17:23 GMT - View all Tocantins jobs
              3 open source log aggregation tools      Cache   Translate Page      
    https://opensource.com/article/18/9/open-source-log-aggregation-tools

    Log aggregation systems can help with troubleshooting and other tasks. Here are three top options.

    How is metrics aggregation different from log aggregation? Can’t logs include metrics? Can’t log aggregation systems do the same things as metrics aggregation systems?
    These are questions I hear often. I’ve also seen vendors pitching their log aggregation system as the solution to all observability problems. Log aggregation is a valuable tool, but it isn’t normally a good tool for time-series data.
    A couple of valuable features in a time-series metrics aggregation system are the regular interval and the storage system customized specifically for time-series data. The regular interval allows a user to derive real mathematical results consistently. If a log aggregation system is collecting metrics in a regular interval, it can potentially work the same way. However, the storage system isn’t optimized for the types of queries that are typical in a metrics aggregation system. These queries will take more resources and time to process using storage systems found in log aggregation tools.
    So, we know a log aggregation system is likely not suitable for time-series data, but what is it good for? A log aggregation system is a great place for collecting event data. These are irregular activities that are significant. An example might be access logs for a web service. These are significant because we want to know what is accessing our systems and when. Another example would be an application error condition—because it is not a normal operating condition, it might be valuable during troubleshooting.
    A handful of rules for logging:
    • DO include a timestamp
    • DO format in JSON
    • DON’T log insignificant events
    • DO log all application errors
    • MAYBE log warnings
    • DO turn on logging
    • DO write messages in a human-readable form
    • DON’T log informational data in production
    • DON’T log anything a human can’t read or react to
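    To make the first two rules concrete, here is a minimal sketch (the field names are illustrative, not a standard) of emitting JSON-formatted log lines with timestamps using only Python's standard library:

    ```python
    import json
    import logging
    from datetime import datetime, timezone

    class JsonFormatter(logging.Formatter):
        """Render each log record as a single JSON object with a timestamp."""
        def format(self, record):
            return json.dumps({
                'timestamp': datetime.now(timezone.utc).isoformat(),
                'level': record.levelname,
                'message': record.getMessage(),
            })

    handler = logging.StreamHandler()
    handler.setFormatter(JsonFormatter())
    logger = logging.getLogger('demo')
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)

    logger.error('database connection refused')  # emits one JSON line on stderr
    ```

    One JSON object per line is exactly the shape most aggregators ingest without custom parsing rules.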

    Cloud costs

    When investigating log aggregation tools, the cloud might seem like an attractive option. However, it can come with significant costs. Logs represent a lot of data when aggregated across hundreds or thousands of hosts and applications. The ingestion, storage, and retrieval of that data are expensive in cloud-based systems.
    As a point of reference from a real system, a collection of around 500 nodes with a few hundred apps results in 200GB of log data per day. There’s probably room for improvement in that system, but even reducing it by half will cost nearly $10,000 per month in many SaaS offerings. This often includes retention of only 30 days, which isn’t very long if you want to look at trending data year-over-year.
    This isn’t to discourage the use of these systems, as they can be very valuable—especially for smaller organizations. The purpose is to point out that there could be significant costs, and it can be discouraging when they are realized. The rest of this article will focus on open source and commercial solutions that are self-hosted.
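    As a back-of-the-envelope check of those numbers (the per-GB rate here is an assumption for illustration; real SaaS pricing varies widely), the monthly cost scales linearly with daily volume:

    ```python
    # Assumed figures: 100 GB/day (the article's 200 GB/day halved) and a
    # hypothetical rate of $3.30 per ingested GB.
    gb_per_day = 100
    price_per_gb = 3.30

    monthly_cost = gb_per_day * 30 * price_per_gb
    print(f'${monthly_cost:,.0f} per month')  # roughly $9,900, i.e. "nearly $10,000"
    ```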

    Tool options

    ELK

    ELK, short for Elasticsearch, Logstash, and Kibana, is the most popular open source log aggregation tool on the market. It’s used by Netflix, Facebook, Microsoft, LinkedIn, and Cisco. The three components are all developed and maintained by Elastic. Elasticsearch is essentially a NoSQL, Lucene search engine implementation. Logstash is a log pipeline system that can ingest data, transform it, and load it into a store like Elasticsearch. Kibana is a visualization layer on top of Elasticsearch.
    A few years ago, Beats were introduced. Beats are data collectors. They simplify the process of shipping data to Logstash. Instead of needing to understand the proper syntax of each type of log, a user can install a Beat that will export NGINX logs or Envoy proxy logs properly so they can be used effectively within Elasticsearch.
    When installing a production-level ELK stack, a few other pieces might be included, like Kafka, Redis, and NGINX. Also, it is common to replace Logstash with Fluentd, which we’ll discuss later. This system can be complex to operate, which in its early days led to a lot of problems and complaints. These have largely been fixed, but it’s still a complex system, so you might not want to try it if you’re a smaller operation.
    That said, there are services available so you don’t have to worry about that. Logz.io will run it for you, but its list pricing is a little steep if you have a lot of data. Of course, you’re probably smaller and may not have a lot of data. If you can’t afford Logz.io, you could look at something like AWS Elasticsearch Service (ES). ES is a service Amazon Web Services (AWS) offers that makes it very easy to get Elasticsearch working quickly. It also has tooling to get all AWS logs into ES using Lambda and S3. This is a much cheaper option, but there is some management required and there are a few limitations.
    Elastic, the parent company of the stack, offers a more robust product that uses the open core model, which provides additional options around analytics tools, and reporting. It can also be hosted on Google Cloud Platform or AWS. This might be the best option, as this combination of tools and hosting platforms offers a cheaper solution than most SaaS options and still provides a lot of value. This system could effectively replace or give you the capability of a security information and event management (SIEM) system.
    The ELK stack also offers great visualization tools through Kibana, but it lacks an alerting function. Elastic provides alerting functionality within the paid X-Pack add-on, but there is nothing built into the open source system. Yelp has created a solution to this problem, called ElastAlert, and there are probably others. This additional piece of software is fairly robust, but it increases the complexity of an already complex system.

    Graylog

    Graylog has recently risen in popularity, but it got its start when Lennart Koopmann created it back in 2010. A company was born with the same name two years later. Despite its increasing use, it still lags far behind the ELK stack. This also means it has fewer community-developed features, but it can use the same Beats that the ELK stack uses. Graylog has gained praise in the Go community with the introduction of the Graylog Collector Sidecar written in Go.
    Graylog uses Elasticsearch, MongoDB, and the Graylog Server under the hood. This makes it as complex to run as the ELK stack and maybe a little more. However, Graylog comes with alerting built into the open source version, as well as several other notable features like streaming, message rewriting, and geolocation.
    The streaming feature allows for data to be routed to specific Streams in real time while they are being processed. With this feature, a user can see all database errors in a single Stream and web server errors in a different Stream. Alerts can even be based on these Streams as new items are added or when a threshold is exceeded. Latency is probably one of the biggest issues with log aggregation systems, and Streams eliminate that issue in Graylog. As soon as the log comes in, it can be routed to other systems through a Stream without being processed fully.
    The message rewriting feature uses the open source rules engine Drools. This allows all incoming messages to be evaluated against a user-defined rules file enabling a message to be dropped (called Blacklisting), a field to be added or removed, or the message to be modified.
    The coolest feature might be Graylog’s geolocation capability, which supports plotting IP addresses on a map. This is a fairly common feature and is available in Kibana as well, but it adds a lot of value—especially if you want to use this as your SIEM system. The geolocation functionality is provided in the open source version of the system.
    Graylog, the company, charges for support on the open source version if you want it. It also offers an open core model for its Enterprise version that offers archiving, audit logging, and additional support. There aren’t many other options for support or hosting, so you’ll likely be on your own if you don’t use Graylog (the company).

    Fluentd

    Fluentd was developed at Treasure Data, and the CNCF has adopted it as an Incubating project. It was written in C and Ruby and is recommended by AWS and Google Cloud. Fluentd has become a common replacement for Logstash in many installations. It acts as a local aggregator to collect all node logs and send them off to central storage systems. It is not a log aggregation system.
    It uses a robust plugin system to provide quick and easy integrations with different data sources and data outputs. Since there are over 500 plugins available, most of your use cases should be covered. If they aren’t, this sounds like an opportunity to contribute back to the open source community.
    Fluentd is a common choice in Kubernetes environments due to its low memory requirements (just tens of megabytes) and its high throughput. In an environment like Kubernetes, where each pod has a Fluentd sidecar, memory consumption increases linearly with each new pod, so a lightweight collector like Fluentd drastically reduces your system utilization. Memory overhead is becoming a real problem for tools written in Java that were designed to run one instance per node, where the overhead was never a major concern.
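    As a sketch of how Fluentd is commonly wired up as the node-level collector, the configuration fragment below tails container log files and forwards parsed records to Elasticsearch. The paths, the tag, the Elasticsearch host name, and the output plugin (fluent-plugin-elasticsearch) are illustrative assumptions, not something prescribed by the text above.

```
# Assumed log locations and service names; adjust for your cluster.
<source>
  @type tail
  path /var/log/containers/*.log
  pos_file /var/log/fluentd-containers.log.pos
  tag kubernetes.*
  <parse>
    @type json
  </parse>
</source>

# Requires the fluent-plugin-elasticsearch output plugin.
<match kubernetes.**>
  @type elasticsearch
  host elasticsearch.logging.svc
  port 9200
</match>
```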

    How To View A Particular Package Installed/Updated/Upgraded/Removed/Erased Date On Linux
    https://www.2daygeek.com/how-to-view-a-particular-package-installed-updated-upgraded-removed-erased-date-on-linux

    Installing, updating, and removing packages is one of the routine activities for any Linux administrator, who also has to push security updates to systems when required.
    The package manager plays the central role in all of this; we cannot perform any of these actions without one.
    If you would like to know when a package was installed, updated, or erased, this page shows you how to find that information.
    In this tutorial you will learn how to find a package's installed date, updated date, and removed/erased date, as well as who performed the action.
    All package managers do the same basic job, but each works a little differently from the others; we have covered each of them in earlier articles.
    Every package manager lets us install new packages, update existing packages, remove unwanted packages, erase obsolete packages, and so on.
    The best-known package managers for Linux are YUM/DNF and RPM (RHEL, CentOS, Fedora), APT/APT-GET and DPKG (Debian, Ubuntu, Linux Mint), Zypper (openSUSE), and Pacman (Arch Linux).

    How To View Package Installed/Updated/Erased Date In CentOS/RHEL Systems Using yum.log File

    RHEL and CentOS systems use the YUM package manager, so we can read the yum.log file and run the yum history command to get this information.
    YUM (Yellowdog Updater, Modified) is an open-source, command-line, front-end package-management utility for RPM-based systems such as Red Hat Enterprise Linux (RHEL) and CentOS.
    Yum is the primary tool for getting, installing, deleting, querying, and managing RPM packages from distribution repositories as well as third-party repositories.
    To check a package's installed date, run the following command, substituting the package you want to check. Here we check when the htop package was installed.
    # grep -i installed /var/log/yum.log | grep htop
    May 03 08:40:22 Installed: htop-1.0.3-1.el6.x86_64
    To view a package's updated date, run the following command.
    # grep -i updated /var/log/yum.log | grep java
    May 08 08:13:15 Updated: 1:java-1.8.0-openjdk-headless-1.8.0.171-3.b10.el6_9.x86_64
    May 08 08:13:15 Updated: 1:java-1.8.0-openjdk-1.8.0.171-3.b10.el6_9.x86_64
    To view a package's removed/erased date, run the following command.
    # grep -i erased: /var/log/yum.log | grep epel-release
    May 17 17:38:41 Erased: epel-release
    To see all of these together in a single output, run the following command.
    # grep "java" /var/log/yum.log
    Apr 19 03:47:53 Installed: tzdata-java-2018d-1.el6.noarch
    Apr 19 03:48:00 Installed: 1:java-1.8.0-openjdk-headless-1.8.0.161-3.b14.el6_9.x86_64
    Apr 19 03:48:00 Installed: 1:java-1.8.0-openjdk-1.8.0.161-3.b14.el6_9.x86_64
    May 08 08:13:15 Updated: 1:java-1.8.0-openjdk-headless-1.8.0.171-3.b10.el6_9.x86_64
    May 08 08:13:15 Updated: 1:java-1.8.0-openjdk-1.8.0.171-3.b10.el6_9.x86_64
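    In a script, these yum.log lines are easy to split into their parts with plain shell word splitting. A minimal sketch, using a sample line that mirrors the output above:

```shell
# Split a yum.log-style line into timestamp, action, and package name.
# The sample line mirrors the yum.log output shown above.
line='May 08 08:13:15 Updated: 1:java-1.8.0-openjdk-1.8.0.171-3.b10.el6_9.x86_64'
set -- $line                # word-split the line into $1..$5
stamp="$1 $2 $3"            # "May 08 08:13:15"
action="${4%:}"             # "Updated" (trailing colon stripped)
pkg="$5"                    # the full package name-version string
echo "$pkg was $action on $stamp"
```

The same splitting works for Installed: and Erased: lines, since the log format is identical.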

    How To View Package Installed Date In CentOS/RHEL Systems Using rpm Command

    Alternatively, we can check a package's latest installed date using the rpm command.
    RPM (RPM Package Manager, formerly Red Hat Package Manager) is a powerful package-management system for Red Hat Enterprise Linux (RHEL) as well as other Linux distributions such as Fedora, CentOS, and openSUSE. RPM maintains a database of installed packages and their files, so you can run powerful queries and verifications on your system.
    To view a package's latest installed date, run the following rpm command.
    # rpm -qi nano | grep "Install Date"
    Install Date: Fri 03 Mar 2017 08:57:47 AM EST Build Host: c5b2.bsys.dev.centos.org
    Alternatively, use rpm with the -qa --last options to view a package's latest installed date.
    # rpm -qa --last | grep htop
    htop-1.0.3-1.el6.x86_64 Thu 03 May 2018 08:40:22 AM EDT
    Alternatively, use rpm with the -q and --last options to view a package's latest installed date.
    # rpm -q epel-release --last
    epel-release-6-8.noarch Fri 18 May 2018 10:33:06 AM EDT

    How To View Package Installed/Updated/Erased Date In CentOS/RHEL Systems Using yum history Command

    We can also check when a package was installed, updated, removed, or erased using the yum history command.
    Use yum history if you want to see which packages were installed, updated, or erased on a particular date.
    # yum history
    Loaded plugins: fastestmirror, security
    ID | Login user | Date and time | Action(s) | Altered
    -------------------------------------------------------------------------------
    27 | root | 2018-07-22 00:19 | Install | 1
    26 | root | 2018-07-20 00:24 | Install | 1
    25 | root | 2018-05-18 10:35 | Install | 1
    24 | root | 2018-05-18 10:33 | Install | 1
    23 | root | 2018-05-17 17:38 | Erase | 3
    22 | root | 2018-05-10 04:12 | Install | 1
    21 | root | 2018-05-09 05:25 | Erase | 2
    20 | root | 2018-05-09 05:24 | Install | 2
    19 | root | 2018-05-09 05:19 | Install | 1
    18 | root | 2018-05-09 05:08 | Install | 2
    17 | root | 2018-05-09 05:05 | Erase | 1
    16 | root | 2018-05-08 08:18 | Install | 3
    15 | root | 2018-05-08 08:17 | Install | 8
    14 | root | 2018-05-08 08:13 | Update | 2
    13 | root | 2018-05-08 08:12 | Install | 4
    12 | root | 2018-05-08 08:12 | Install | 2
    11 | root | 2018-05-03 08:44 | Install | 2
    10 | root | 2018-05-03 08:40 | Install | 1
    9 | root | 2018-04-26 12:30 | Install | 30
    8 | root | 2018-04-26 08:11 | Install | 69
    To view detailed information about a transaction, pass the corresponding transaction ID to yum history info.
    # yum history info 27
    Loaded plugins: fastestmirror, security
    Transaction ID : 27
    Begin time : Sun Jul 22 00:19:51 2018
    Begin rpmdb : 574:7545d911e1217a575a723f63b02dd71262f9ccbb
    End time : 00:19:52 2018 (1 seconds)
    End rpmdb : 575:0861abf520414edea27be5a28796827ff65d155a
    User : root
    Return-Code : Success
    Command Line : localinstall oracleasm-support-2.1.8-1.el6.x86_64.rpm
    Transaction performed with:
    Installed rpm-4.8.0-55.el6.x86_64 @anaconda-CentOS-201605220104.x86_64/6.8
    Installed yum-3.2.29-81.el6.centos.noarch @base
    Installed yum-metadata-parser-1.1.2-16.el6.x86_64 @anaconda-CentOS-201605220104.x86_64/6.8
    Installed yum-plugin-fastestmirror-1.1.30-40.el6.noarch @base
    Packages Altered:
    Install oracleasm-support-2.1.8-1.el6.x86_64 @/oracleasm-support-2.1.8-1.el6.x86_64
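    If you need to pull information out of the yum history table in a script, awk can filter on the pipe-separated columns. A minimal sketch, using sample rows that mirror the table above:

```shell
# Count the "Erase" transactions in a yum-history-style table.
# The sample rows mirror the `yum history` output shown above.
table='23 | root | 2018-05-17 17:38 | Erase | 3
21 | root | 2018-05-09 05:25 | Erase | 2
20 | root | 2018-05-09 05:24 | Install | 2'
count=$(printf '%s\n' "$table" | awk -F'|' '$4 ~ /Erase/ {n++} END {print n+0}')
echo "Erase transactions: $count"
```

The same column filter works for Install or Update rows by changing the pattern.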

    How To View Package Installed/Updated/Upgraded/Erased Date In Ubuntu/Debian/LinuxMint Systems Using APT history.log File

    Debian-based systems use the APT and APT-GET package managers, so we can read the history.log and dpkg.log files to get this information.
    To check a package's installed date, run the following command, substituting the package you want to check.
    $ grep -A 2 "Install: nano" /var/log/apt/history.log
    Install: nano:amd64 (2.8.6-3)
    End-Date: 2018-08-09 09:12:05
    To check who performed the package installation, run the following command.
    $ grep -A 3 "apt install nano" /var/log/apt/history.log*
    /var/log/apt/history.log:Commandline: apt install nano
    /var/log/apt/history.log-Requested-By: daygeek (1000)
    /var/log/apt/history.log-Install: nano:amd64 (2.8.6-3)
    /var/log/apt/history.log-End-Date: 2018-08-09 09:12:05
    To view a package's removed date, run the following command.
    $ grep -A 2 "Remove: nano" /var/log/apt/history.log
    Remove: nano:amd64 (2.8.6-3)
    End-Date: 2018-08-09 08:58:34
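    The history.log entries are multi-line records, so a small script can pull the requesting user and the end time out of one record. A minimal sketch, using a sample record that mirrors the format shown above:

```shell
# Extract who requested an install and when it finished from an
# apt history.log-style record (the sample mirrors the format above).
sample=$(mktemp)
cat > "$sample" <<'EOF'
Start-Date: 2018-08-09 09:11:58
Commandline: apt install nano
Requested-By: daygeek (1000)
Install: nano:amd64 (2.8.6-3)
End-Date: 2018-08-09 09:12:05
EOF
user=$(grep '^Requested-By:' "$sample" | awk '{print $2}')
when=$(grep '^End-Date:' "$sample" | awk '{print $2, $3}')
echo "installed by $user at $when"
rm -f "$sample"
```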

    How To View Package Installed/Updated/Upgraded/Erased Date In Ubuntu/Debian/LinuxMint Systems Using dpkg.log File

    Alternatively, we can check a package's latest installed date using the dpkg.log file.
    DPKG (Debian Package) is a tool to install, build, remove, and manage Debian packages; unlike other package-management systems, it cannot automatically download and install packages or their dependencies.
    $ grep -i "install\|installed\|half-installed" /var/log/dpkg.log | grep firefox
    2018-07-18 10:25:46 status half-installed firefox:amd64 60.0.2+build1-0ubuntu0.17.10.1
    2018-07-18 10:25:53 status half-installed firefox:amd64 60.0.2+build1-0ubuntu0.17.10.1
    2018-07-18 10:25:53 status half-installed firefox:amd64 60.0.2+build1-0ubuntu0.17.10.1
    2018-07-18 10:25:54 status installed firefox:amd64 61.0.1+build1-0ubuntu0.17.10.1
    2018-07-18 10:29:25 status half-installed firefox-locale-en:amd64 60.0.2+build1-0ubuntu0.17.10.1
    2018-07-18 10:29:25 status half-installed firefox-locale-en:amd64 60.0.2+build1-0ubuntu0.17.10.1
    2018-07-18 10:29:25 status installed firefox-locale-en:amd64 61.0.1+build1-0ubuntu0.17.10.1
    To view a package's upgraded/updated date, run the following command.
    $ zgrep "upgrade" /var/log/dpkg.log* | grep mutter
    /var/log/dpkg.log.8.gz:2017-12-05 16:06:42 upgrade gir1.2-mutter-1:amd64 3.26.1-2ubuntu1 3.26.2-0ubuntu0.1
    /var/log/dpkg.log.8.gz:2017-12-05 16:06:43 upgrade mutter-common:all 3.26.1-2ubuntu1 3.26.2-0ubuntu0.1
    /var/log/dpkg.l
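    The dpkg.log upgrade lines have a fixed field order (date, time, action, package, old version, new version), so they split cleanly in the shell. A minimal sketch, using a sample line that mirrors the output above:

```shell
# Split a dpkg.log "upgrade" line into its fields.
# The sample line mirrors the dpkg.log output shown above.
line='2017-12-05 16:06:43 upgrade mutter-common:all 3.26.1-2ubuntu1 3.26.2-0ubuntu0.1'
set -- $line                # word-split into date, time, action, package, versions
pkgname=$4; oldver=$5; newver=$6
echo "$pkgname was upgraded from $oldver to $newver on $1 at $2"
```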