EzDev.org

bucket

A bucket for your shell (like a set of registers, or a clipboard manager)


Use XSLT 1.0 to group XML elements into buckets, in order, based on some criteria

Say I have some XML that I want to convert to HTML. The XML is divided into ordered sections:

<?xml version="1.0" encoding="utf-8"?>
<root>
  <section attr="someCriteria">
    <h1>Title 1</h1>
    <p>paragraph 1-1</p>
    <p>paragraph 1-2</p>
  </section>
  <section attr="someOtherCriteria">
    <h3>Subtitle 2</h3>
    <ul>
      <li>list item 2-1</li>
      <li>list item 2-2</li>
      <li>list item 2-3</li>
      <li>list item 2-4</li>
    </ul>
  </section>
  <section attr="anotherSetOfCriteria">
    <warning>
      Warning: This product could kill you
    </warning>
  </section>
  <section attr="evenMoreCriteria">
    <disclaimer>
      You were warned
    </disclaimer>
  </section>
  <section attr="criteriaSupreme">
    <p>Copyright 1999-2011</p>
  </section>
</root>

I have several of these XML documents. I need to group and transform these sections based on criteria. There will be two different kinds of buckets.

  • The first section goes into a bucket (e.g. <div class="FormatOne"></div>)
  • If the second section meets the criteria for the "FormatOne" bucket, it also goes into this bucket
  • If the third section requires a different bucket (e.g. <div class="FormatTwo"></div>), a new bucket is created and the section contents are placed in it
  • If the fourth section requires "FormatOne" (which is different from the previous format), a new bucket is created again and the section contents are placed in it
  • And so on: each section goes into the same bucket as the previous section if they share a format. If not, a new bucket is created.

So for each document, depending on the logic for separating buckets, the document may end up like this:

<body>
  <div class="FormatOne">
    <h1>Title 1</h1>
    <p>paragraph 1-1</p>
    <p>paragraph 1-2</p>
    <h3>Subtitle 2</h3>
    <ul>
      <li>list item 2-1</li>
      <li>list item 2-2</li>
      <li>list item 2-3</li>
      <li>list item 2-4</li>
    </ul>
  </div>
  <div class="FormatTwo">
    <span class="warningText">
      Warning: This product could kill you
    </span>
  </div>
  <div class="FormatOne">
    <span class="disclaimerText"> You were warned</span>
    <p class="copyright">Copyright 1999-2011</p>
  </div>
</body>

or this:

<body>
  <div class="FormatOne">
    <h1>Title 1</h1>
    <p>paragraph 1-1</p>
    <p>paragraph 1-2</p>
    <h3>Subtitle 2</h3>
  </div>
  <div class="FormatTwo">
    <ul>
      <li>list item 2-1</li>
      <li>list item 2-2</li>
      <li>list item 2-3</li>
      <li>list item 2-4</li>
    </ul>
  </div>
  <div class="FormatOne">
    <span class="warningText">
      Warning: This product could kill you
    </span>
    <span class="disclaimerText"> You were warned</span>
    <p class="copyright">Copyright 1999-2011</p>
  </div>
</body>

or even this:

<body>
  <div class="FormatOne">
    <h1>Title 1</h1>
    <p>paragraph 1-1</p>
    <p>paragraph 1-2</p>
    <h3>Subtitle 2</h3>
    <ul>
      <li>list item 2-1</li>
      <li>list item 2-2</li>
      <li>list item 2-3</li>
      <li>list item 2-4</li>
    </ul>
    <span class="warningText">
      Warning: This product could kill you
    </span>
    <span class="disclaimerText"> You were warned</span>
    <p class="copyright">Copyright 1999-2011</p>
  </div>
</body>

depending on how the sections are defined.

Is there a way to use an XSLT to perform this type of grouping magic?

Any help would be great. Thanks!
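For the record, XSLT 1.0 can do this kind of adjacent grouping with sibling recursion. Below is a sketch that assumes, purely for illustration, that each section's @attr value maps directly to its bucket's class name; the real criteria logic would replace the @attr comparisons:

```xml
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="html" indent="yes"/>

  <xsl:template match="/root">
    <body>
      <!-- Open a bucket only at sections whose format differs from the
           previous section's format (the first section always qualifies). -->
      <xsl:apply-templates mode="bucket"
          select="section[not(@attr = preceding-sibling::section[1]/@attr)]"/>
    </body>
  </xsl:template>

  <xsl:template match="section" mode="bucket">
    <div class="{@attr}">
      <xsl:copy-of select="node()"/>
      <!-- Pull in following sections while the format stays the same. -->
      <xsl:apply-templates mode="follow"
          select="following-sibling::section[1][@attr = current()/@attr]"/>
    </div>
  </xsl:template>

  <xsl:template match="section" mode="follow">
    <xsl:copy-of select="node()"/>
    <xsl:apply-templates mode="follow"
        select="following-sibling::section[1][@attr = current()/@attr]"/>
  </xsl:template>
</xsl:stylesheet>
```

The recursion only ever looks one sibling ahead, so each run of same-format sections is emitted into a single div, and a new div starts exactly where the format changes.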


Source: (StackOverflow)

Create S3 bucket for a specific region

If I create an S3 bucket as follows:

    AmazonS3Config amazonS3Config = new AmazonS3Config
    {
        ServiceURL = "s3-eu-west-1.amazonaws.com"
    };
    AmazonS3Client amazonS3Client = new AmazonS3Client(myAccessKeyId, 
        mySecretAccessKey, amazonS3Config);

    PutBucketRequest request = new PutBucketRequest
    {
        BucketName = bucket.Name,
        BucketRegion = S3Region.EU
    };
    amazonS3Client.PutBucket(request); 

As you can see, I have clearly specified that my bucket should be created in the EU region,
but when I go to AWS Explorer, I can see my bucket available in all the regions.

What is the point of specifying bucket region if my bucket is always replicated in all the regions?
Can anyone please clarify?

Thank you!


Source: (StackOverflow)

List files on S3

I'm getting frustrated by not finding any good explanation of how to list all the files in an S3 bucket.

I have a bucket with about 20 images in it. All I want to do is list them. Someone says "just use the S3.list method", but without a special library there is no S3.list method. I have an S3.get method, which I can't get to work. Argh. I would appreciate it if someone told me how to simply get a list of all the files (filenames) in an S3 bucket.

val S3files = S3.get(bucketName: String, path: Option[String], prefix: Option[String], delimiter: Option[String])

which returns a Future[Response].

I don't know how to use this S3.get. What would be the easiest way to list all the files in my S3 bucket?

Answers much appreciated!


Source: (StackOverflow)

I am studying Couchbase, can anyone explain what exactly a bucket and a vbucket are?

I am studying Couchbase now, and I am really confused by the official descriptions of the terms 'bucket' and 'vbucket'. Can anybody explain what exactly a bucket or a vbucket is? What's the difference? It would be best to make some analogies and give some examples.


Source: (StackOverflow)

Change user ownership of s3fs mounted buckets

How can I modify the user:group ownership of an s3fs-mounted bucket?

I have a git installation that I would essentially like to store in a bucket on my Amazon S3 account, and then, using SparkleShare via my web host, sync this data across multiple machines.

  • I have set up SparkleShare to successfully sync three machines. Works like a charm.
  • This is syncing to a folder at /home/git/dropbox. No problems there.
  • I want the sync folder to be a mounted S3 bucket, though.
  • I can mount the buckets right next to that dropbox folder, but no luck changing ownership to git:git.

Problem: when you create the mount as root, only the root user has access to the bucket.

I tried to create the mount with s3fs logged in as the git user, but no luck; it still mounts and assigns permissions as root:root.

Do I uninstall s3fs and re-install it as the git user?

Any help would be greatly appreciated!

Rick
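One direction worth trying (a sketch using s3fs-fuse's mount options; the uid/gid values below are assumptions you would look up on your own system):

```shell
# Mount the bucket with ownership mapped to the git user instead of root.
# uid/gid must be git's numeric ids (find them with: id git), and
# allow_other lets users other than the mounting one traverse the mount.
# Bucket name and mount point are placeholders.
s3fs my-bucket /home/git/dropbox -o uid=1001,gid=1001,allow_other
```

With the ownership mapped at mount time, there is no need to chown the mount afterwards, which s3fs generally does not honor anyway.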


Source: (StackOverflow)

Is it possible to share a Amazon S3 bucket between Amazon S3 users? [closed]

Is it possible to share a bucket between some S3 users?

I have an S3 account for the user "me@myself.com" who can manage the bucket "my_bucket". Can I share this bucket with the S3 users "you@yourself.com" and "youtoo@yourself.com"? i.e. they log in to their S3 accounts and see my bucket?

If not, is there any way to do this? With a bucket policy?

I'm confused... thank you for your answer.

Fro_oo


Source: (StackOverflow)

Multiple Couchbase bucket configuration in .NET

I have 2 buckets in Couchbase: one is the Couchbase type and the other is the Memcached type. When I run my test I get an error: "The element servers may only appear once in this section." Below is my config:

  <couchbase>
    <servers bucket="RepositoryCache" bucketPassword="">
      <add uri="http://127.0.0.1:8091/pools/default"/>
    </servers>

    <servers bucket="default" bucketPassword="">
      <add uri="http://127.0.0.1:8091/pools/default"/>
    </servers>
  </couchbase>

How do I configure multiple buckets and resolve this issue? I have read the manual and could not find much help.
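A sketch of a workaround, based on my understanding of the 1.x .NET client's config schema (the second section name here is arbitrary, and you should verify the type string against your client version): register one named config section per bucket, since a single section only accepts one servers element:

```xml
<configSections>
  <section name="couchbase"
           type="Couchbase.Configuration.CouchbaseClientSection, Couchbase"/>
  <section name="couchbase-cache"
           type="Couchbase.Configuration.CouchbaseClientSection, Couchbase"/>
</configSections>

<couchbase>
  <servers bucket="RepositoryCache" bucketPassword="">
    <add uri="http://127.0.0.1:8091/pools/default"/>
  </servers>
</couchbase>

<couchbase-cache>
  <servers bucket="default" bucketPassword="">
    <add uri="http://127.0.0.1:8091/pools/default"/>
  </servers>
</couchbase-cache>
```

Each client instance is then constructed against one section by name, so the two buckets never share a section.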

Source: (StackOverflow)

Leaky bucket problem help?

I'm trying to review for my final and I'm going over example problems given to me by my professor. Can anyone explain the concept of how a leaky bucket works? Here's a review problem my professor gave me about leaky buckets.

A leaky bucket sits at the host's network interface. The data rate onto the network is 2 Mbyte/s and the data rate from the application into the bucket is 2.5 Mbyte/s.

A.) Suppose the host has 250 Mbytes to send onto the network and it sends the data in one burst. What should the minimum capacity of the bucket (in bytes) be so that no data is lost?

B.) Suppose the capacity of the bucket is 100 Mbytes. What is the longest burst time from the host such that no data is lost?
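A worked sketch of the arithmetic, assuming the application-to-bucket rate is 2.5 Mbyte/s and treating 1 Mbyte as 10^6 bytes:

```python
# Leaky bucket: data arrives from the application at r_in and drains onto
# the network at r_out; the bucket absorbs the difference during a burst.
r_in = 2.5e6   # bytes/s, application into the bucket
r_out = 2.0e6  # bytes/s, bucket onto the network

# A) 250 Mbytes sent in one burst: the burst lasts S / r_in seconds,
#    during which the bucket fills at the net rate (r_in - r_out).
S = 250e6
burst_time = S / r_in                       # 100 seconds
min_capacity = (r_in - r_out) * burst_time  # bytes the bucket must hold
print(min_capacity)                         # 50000000.0 (50 Mbytes)

# B) Capacity C = 100 Mbytes: the bucket overflows once the net inflow
#    has filled it, i.e. after C / (r_in - r_out) seconds.
C = 100e6
max_burst_time = C / (r_in - r_out)
print(max_burst_time)                       # 200.0 seconds
```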


Source: (StackOverflow)

Why Doesn't My AWS S3 Bucket Policy Override My IAM Policy?

I have a user in my IAM account called "testuser" who has administrator privileges, like so:

{
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "*",
      "Resource": "*"
    }
  ]
}

And then I have a policy on my S3 bucket that denies this user access, like so:

{
  "Statement": [
    {
      "Effect": "Deny",
      "Principal": {
        "AWS": "my-account-id:user/testuser"
      },
      "Action": "s3:*",
      "Resource": "arn:aws:s3:::my-bucket-name/*"
    }
  ]
}

So, the explicit deny in the S3 bucket policy should override the allow from the IAM policy right? But when I log in as testuser, I still have access to everything in that bucket - I even have access to change or remove the bucket policy for that bucket (and every other bucket too). Why isn't my explicit deny doing anything?
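One structural thing worth double-checking (an aside, not a confirmed diagnosis): the Principal in a bucket policy is normally written as a full IAM ARN rather than a bare account-id:user string. A sketch with a placeholder account id:

```json
{
  "Statement": [
    {
      "Effect": "Deny",
      "Principal": {
        "AWS": "arn:aws:iam::123456789012:user/testuser"
      },
      "Action": "s3:*",
      "Resource": "arn:aws:s3:::my-bucket-name/*"
    }
  ]
}
```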


Source: (StackOverflow)

Can I display daily data in month buckets using only excel's chart formatting?

I have daily sales figures that I'd like to plot on a simple linegraph.

I would like them to be shown in monthly buckets (i.e. if I sold 5€ on Jan 01 and 10€ on Jan 24, I would like to see only one data point for January with 15€ in it).

Please note that I don't want to use any supporting formula/VBA script, I want to do this using only chart formatting.

I tried setting the chart's X-axis type to "date axis" and chose "months" as the base unit. This almost works, but the line graph ends up looking weird. Changing the chart type to histogram doesn't help much either: the individual sales are not "piled up" like I would want; instead, they're hidden one behind the other. A stacked histogram doesn't work either.

Any clue on how I can force Excel to bucketize my data using only chart formatting? This can't be that hard...


Source: (StackOverflow)

Delete object or bucket in Amazon S3?

I created a new Amazon S3 bucket called "photos". The bucket URL is something like:

www.amazons3.salcaiser.com/photos

Now I upload subfolders containing files into that bucket, for example:

www.amazons3.salcaiser.com/photos/thumbs/file.jpg

My questions are: is thumbs/ considered a new bucket, or is it an object?

And if I want to delete the entire thumbs/ directory, do I need to delete all the files inside it first, or can I delete everything in one go?
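For what it's worth, a sketch with the AWS CLI (the bucket name is a placeholder taken from the question's URL): thumbs/ is only a key prefix, not a bucket, so deleting "the directory" means deleting every object that shares the prefix, which one recursive command can do:

```shell
# Remove every object whose key starts with thumbs/ in one command.
# (bucket name is a placeholder)
aws s3 rm s3://photos/thumbs/ --recursive
```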


Source: (StackOverflow)

Writing bucket sort in c++

A book I have says this:

a) Place each value of the one-dimensional array into a row of the bucket array based on the value's ones digit. For example, 97 is placed in row 7, 3 is placed in row 3, and 100 is placed in row 0. This is called a "distribution pass."

b) Loop through the bucket array row by row, and copy the values back to the original array. This is called a "gathering pass." The new order of the preceding values in the one-dimensional array is 100, 3, and 97.

c) Repeat this process for each subsequent digit position.

I am having a lot of trouble trying to understand and implement this. So far I have:

#include <vector>

void b_sort(int sarray[], int array_size) {
    const int max = array_size;

    // Copy the input into a working array. (The original
    // `int array[i] = sarray[i];` inside the loop does not compile.)
    std::vector<int> array(sarray, sarray + max);

    // Ten bucket rows, one per digit value 0-9; let each row grow as
    // needed instead of the non-standard VLA `int bucket[10][max - 1]`.
    std::vector<std::vector<int>> bucket(10);
}

I'm thinking that in order to sort them by ones, tens, hundreds, etc, I can use this:

for (int i = 0; i < max; ++i) {
    int insert = (array[i] / x) % 10;    // pick the digit selected by x
    bucket[insert].push_back(array[i]);  // distribution pass
}

where x = 1, 10, 100, 1000, etc. I am totally lost on how to write this now.


Source: (StackOverflow)

s3 Policy has invalid action - s3:ListAllMyBuckets

I'm trying this policy through console.aws.amazon.com on my buckets:


    {
      "Statement": [
        {
          "Effect": "Allow",
          "Action": [
            "s3:ListBucket",
            "s3:GetBucketLocation",
            "s3:ListBucketMultipartUploads"
          ],
          "Resource": "arn:aws:s3:::itnighq",
          "Condition": {}
        },
        {
          "Effect": "Allow",
          "Action": [
            "s3:AbortMultipartUpload",
            "s3:DeleteObject",
            "s3:DeleteObjectVersion",
            "s3:GetObject",
            "s3:GetObjectAcl",
            "s3:GetObjectVersion",
            "s3:GetObjectVersionAcl",
            "s3:PutObject",
            "s3:PutObjectAcl",
            "s3:PutObjectAclVersion"
          ],
          "Resource": "arn:aws:s3:::itnighq/*",
          "Condition": {}
        },
        {
          "Effect": "Allow",
          "Action": "s3:ListAllMyBuckets",
          "Resource": "*",
          "Condition": {}
        }
      ]
    }

But I'm getting this error message: "Policy has invalid action - s3:ListAllMyBuckets". It doesn't seem to like "Resource": "*"; I've also tried arn:aws:s3:::*, but that doesn't work either.

Anyone has any clue?


Source: (StackOverflow)

Hashcode bucket distribution in java

Suppose I need to store 1000 objects in a HashSet. Is it better to have 1000 buckets each containing one object (by generating a unique hashcode value for each object), or 10 buckets containing roughly 100 objects each?

One advantage of having a unique bucket per object is that I can save execution cycles on calling the equals() method, right?

Why is it important to have a set number of buckets and distribute the objects among them as evenly as possible?

What should the ideal object-to-bucket ratio be?
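A small sketch of why the distribution matters (Python stands in for Java here; the principle that the average chain length equals the load factor n/buckets is the same):

```python
from collections import Counter

def chain_lengths(keys, nbuckets):
    """Count how many keys land in each bucket under hash(k) % nbuckets."""
    return Counter(hash(k) % nbuckets for k in keys)

keys = range(1000)  # 1000 distinct keys with well-spread hashes

# 10 buckets: every lookup wades through a chain of ~100 entries,
# so a miss can cost up to ~100 equals() comparisons.
few = chain_lengths(keys, 10)
print(max(few.values()))     # 100

# 1000 buckets: one entry per bucket, so at most one equals() call.
many = chain_lengths(keys, 1000)
print(max(many.values()))    # 1
```

This is why hash tables resize to keep the load factor near a constant (Java's HashMap defaults to 0.75): lookups stay O(1) on average regardless of how many objects are stored.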


Source: (StackOverflow)

How do you rename a folder in a bucket on S3?

As simple as it sounds, it seems like an extraordinarily complicated task.
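The reason it is complicated: S3 has no rename operation, and a "folder" is just a shared key prefix, so renaming means copying every object to the new prefix and deleting the old ones. The AWS CLI wraps that dance in one command (a sketch; bucket and prefix names are placeholders):

```shell
# Copy every object from old-folder/ to new-folder/, then delete the
# originals - effectively a recursive rename.
aws s3 mv s3://my-bucket/old-folder/ s3://my-bucket/new-folder/ --recursive
```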


Source: (StackOverflow)